Tag archive: Incerteza (Uncertainty)

Stop bullying the ‘soft’ sciences (L.A.Times)

OP-ED

The social sciences are just that — sciences.

By Timothy D. Wilson

July 12, 2012

A student is seen at the UC Irvine archive doing research for her sociology dissertation. (Los Angeles Times / July 9, 2009)

Once, during a meeting at my university, a biologist mentioned that he was the only faculty member present from a science department. When I corrected him, noting that I was from the Department of Psychology, he waved his hand dismissively, as if I were a Little Leaguer telling a member of the New York Yankees that I too played baseball.

There has long been snobbery in the sciences, with the “hard” ones (physics, chemistry, biology) considering themselves to be more legitimate than the “soft” ones (psychology, sociology). It is thus no surprise that many members of the general public feel the same way. But of late, skepticism about the rigors of social science has reached absurd heights.

The U.S. House of Representatives recently voted to eliminate funding for political science research through the National Science Foundation. In the wake of that action, an opinion writer for the Washington Post suggested that the House didn’t go far enough. The NSF should not fund any research in the social sciences, wrote Charles Lane, because “unlike hypotheses in the hard sciences, hypotheses about society usually can’t be proven or disproven by experimentation.”

Lane’s comments echoed ones by Gary Gutting in the Opinionator blog of the New York Times. “While the physical sciences produce many detailed and precise predictions,” wrote Gutting, “the social sciences do not. The reason is that such predictions almost always require randomized controlled experiments, which are seldom possible when people are involved.”

This is news to me and the many other social scientists who have spent their careers doing carefully controlled experiments on human behavior, inside and outside the laboratory. What makes the criticism so galling is that those who voice it, or members of their families, have undoubtedly benefited from research in the disciplines they dismiss.

Most of us know someone who has suffered from depression and sought psychotherapy. He or she probably benefited from therapies such as cognitive behavioral therapy that have been shown to work in randomized clinical trials.

Problems such as child abuse and teenage pregnancy take a huge toll on society. Interventions developed by research psychologists, tested with the experimental method, have been found to lower the incidence of child abuse and reduce the rate of teenage pregnancies.

Ever hear of stereotype threat? It is the double jeopardy that people face when they are at risk of confirming a negative stereotype of their group. When African American students take a difficult test, for example, they are concerned not only about how well they will do but also about the possibility that performing poorly will reflect badly on their entire group. This added worry has been shown time and again, in carefully controlled experiments, to lower academic performance. But fortunately, experiments have also shown promising ways to reduce this threat. One intervention, for example, conducted in a middle school, reduced the achievement gap by 40%.

If you know someone who was unlucky enough to be arrested for a crime he didn’t commit, he may have benefited from social psychological experiments that have resulted in fairer lineups and interrogations, making it less likely that innocent people are convicted.

An often-overlooked advantage of the experimental method is that it can demonstrate what doesn’t work. Consider three popular programs that research psychologists have debunked: Critical Incident Stress Debriefing, used to prevent post-traumatic stress disorders in first responders and others who have witnessed horrific events; the D.A.R.E. anti-drug program, used in many schools throughout America; and Scared Straight programs designed to prevent at-risk teens from engaging in criminal behavior.

All three of these programs have been shown, with well-designed experimental studies, to be ineffective or, in some cases, to make matters worse. And as a result, the programs have become less popular or have changed their methods. By discovering what doesn’t work, social scientists have saved the public billions of dollars.

To be fair to the critics, social scientists have not always taken advantage of the experimental method as much as they could. Too often, for example, educational programs have been implemented widely without being adequately tested. But increasingly, educational researchers are employing better methodologies. For example, in a recent study, researchers randomly assigned teachers to a program called My Teaching Partner, which is designed to improve teaching skills, or to a control group. Students taught by the teachers who participated in the program did significantly better on achievement tests than did students taught by teachers in the control group.

Are the social sciences perfect? Of course not. Human behavior is complex, and it is not possible to conduct experiments to test all aspects of what people do or why. There are entire disciplines devoted to the experimental study of human behavior, however, in tightly controlled, ethically acceptable ways. Many people benefit from the results, including those who, in their ignorance, believe that science is limited to the study of molecules.

Timothy D. Wilson is a professor of psychology at the University of Virginia and the author of “Redirect: The Surprising New Science of Psychological Change.”

Scientific particles collide with social media to benefit of all (Irish Times)

The Irish Times – Thursday, July 12, 2012

Large Hadron Collider at Cern: the research body now has 590,000 followers on Twitter

MARIE BORAN

IN 2008 CERN switched on the Large Hadron Collider (LHC) in Geneva – around the same time it sent out its first tweet. Although the first outing of the LHC didn’t go according to plan, the Twitter account gained 10,000 followers within the first day, according to James Gillies, head of communications at Cern.

Speaking at the Euroscience Open Forum in Dublin this week, Gillies explained the role social media plays in engaging the public with the particle physics research carried out at its laboratory. The Twitter account now has 590,000 followers and Cern broke important news via it in March 2010 by joyously declaring: “Experiment have seen collisions.”

“Why do we communicate at Cern? If you talk to the scientists who work there they will tell you it’s a good thing to do and they all want to do it,” Gillies said, adding that Cern is publicly funded so engaging with the people who pay the bills is important.

When the existence of the Higgs particle was announced last week, it wasn’t an exclusive press event. Live video was streamed across the web, questions were taken not only from journalists but also from Twitter followers, and Cern used this as a chance to announce jobs via Facebook.

While Cern appears to be the social media darling of the science world, other research institutes and scientists are still weighing up the pros and cons of platforms like Facebook, Twitter or YouTube.

There is a certain stigma attached to social networking sites, not just because much of the content is perceived as banal, but also because too much tweeting could be damaging to your image as a scientist.

Bora Zivkovic is blogs editor at Scientific American, organiser of the fast-growing science conference ScienceOnline and speaker at the social media panel this Saturday at the Euroscience Open Forum. He says the adoption of social media by scientists is slow but growing.

“Academics are quite risk-averse and are shy about trying new things that have a perceived potential to remove the edge they may have in the academic hierarchy, either through lost time or lost reputation.”

Zivkovic talks about fear of the “Sagan effect”, named after the late Carl Sagan. A talented astronomer and astrophysicist, he was loved by the public but snubbed by the science community.

“Many still see social media as self-promotion, which is still in some scientific circles viewed as a negative thing to do. The situation is reminiscent of the very slow adoption of email by researchers back in the early 1990s.

“Once the scientists figure out how to include social media in their daily workflow, realise it does not take away from their time but actually makes them more effective in reaching their academic goals, and realise that the ‘Sagan effect’ on reputation is a thing of the past, they will readily incorporate social media into their normal work.”

Many researchers still rely heavily on specialist mailing lists. The broadcast capability of social media is far greater and can be tailored to its audience, claims Dr Matthew Rowe, research associate at the Knowledge Media Institute with the Open University.

“If I was to email people about some recent work I would presume that it would be marked as spam. However, if I was to announce the release of some work through social media, then a debate and conversation could evolve surrounding the topic; I have seen this happen many times on Facebook.”

Conversations on social media sites are often seen as trivial – for scientists, the end goal is “publish or perish”. Results must be published in a reputable academic journal and preferably cited by those in their area.

Twitter, it seems, can help. A 2011 paper from researcher Gunther Eysenbach found a correlation between Twitter activity and highly cited articles. The microblogging site may help citation rate or serve as a measure of how “citable” your paper may be.

In addition, a 2010 survey on Twitter found one-third of academics said they use it for sharing information with peers, communicating with students or as a real-time news source.

For some the argument for social media is the potential for connecting with volunteers and providing valuable data from the citizen scientist. Yolanda Melero Cavero’s MinkApp has connected locals with an effort to control the mink population in Scotland.

“The most interesting thing about MinkApp, for me, was the fact that the scientist was able to get 600 volunteers for her ecological study. Social media has the grassroots potential to engage with willing volunteers,” says Nancy Salmon, researcher at the department of occupational therapy at the University of Limerick.

Rowe gives some sage social media advice for academics: stay on topic and keep your language jargon-free.

But there’s always room for humour as demonstrated by the Higgs boson jokes on Twitter and Facebook last week. As astronomer Phil Plait tweeted: “I’ve got 99.9999% problems, but a Higgs ain’t one.”

Local Weather Patterns Affect Beliefs About Global Warming (Science Daily)

People living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming. (Credit: © Rafael Ben-Ari / Fotolia)

ScienceDaily (July 25, 2012) — Local weather patterns temporarily influence people’s beliefs about evidence for global warming, according to research by political scientists at New York University and Temple University. Their study, which appears in the Journal of Politics, found that those living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming.

“Global climate change is one of the most important public policy challenges of our time, but it is a complex issue with which Americans have little direct experience,” wrote the study’s co-authors, Patrick Egan of New York University and Megan Mullin of Temple University. “As they try to make sense of this difficult issue, many people use fluctuations in local temperature to reassess their beliefs about the existence of global warming.”

Their study examined five national surveys of American adults sponsored by the Pew Research Center: June, July, and August 2006, January 2007, and April 2008. In each survey, respondents were asked the following question: “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” On average over the five surveys, 73 percent of respondents agreed that Earth is getting warmer.

Egan and Mullin wondered about variation in attitudes among the survey’s respondents, and hypothesized that local temperatures could influence perceptions. To measure the potential impact of temperature on individuals’ opinions, they looked at zip codes from respondents in the Pew surveys and matched weather data to each person surveyed at the time of each poll. They used local weather data to determine if the temperature in the location of each respondent was significantly higher or lower than normal for that area at that time of year.

Their results showed that an abnormal shift in local temperature is associated with a significant shift in beliefs about evidence for global warming. Specifically, for every three degrees Fahrenheit that local temperatures in the past week have risen above normal, Americans become one percentage point more likely to agree that there is “solid evidence” that Earth is getting warmer. The researchers found cooler-than-normal temperatures have similar effects on attitudes — but in the opposite direction.
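To make the reported effect size concrete, here is a minimal sketch in Python of the linear relationship the article describes: one percentage point of added agreement per three degrees Fahrenheit of above-normal weather, symmetric in the cooler direction. The baseline share, the constants and the function are illustrative assumptions for this sketch only, not Egan and Mullin’s actual statistical model, which controls for political attitudes, geography and other variables.

```python
# Illustrative toy model of the reported finding; NOT the authors' estimation
# procedure. Constants are taken from the figures quoted in this article.

BASELINE_AGREE = 0.73   # average share agreeing across the five Pew surveys
EFFECT_PER_3F = 0.01    # +1 percentage point per +3 F weekly anomaly

def predicted_agreement(anomaly_f: float) -> float:
    """Predicted share saying there is 'solid evidence' of warming,
    given the past week's local temperature anomaly in degrees F."""
    share = BASELINE_AGREE + EFFECT_PER_3F * (anomaly_f / 3.0)
    return min(max(share, 0.0), 1.0)  # clamp to a valid proportion

if __name__ == "__main__":
    for anomaly in (-9.0, 0.0, 9.0):
        print(f"{anomaly:+.1f} F anomaly -> {predicted_agreement(anomaly):.1%}")
```

On this toy reading, a week running nine degrees above normal moves agreement from 73% to about 76%, and an equally unusual cold spell moves it to about 70%, matching the symmetry the researchers report.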

The study took into account other variables that may explain the results — such as existing political attitudes and geography — and found the results still held.

The researchers also wondered if heat waves — or prolonged higher-than-normal temperatures — intensified this effect. To find out, they looked at respondents living in areas that experienced at least seven days of temperatures 10 degrees or more above normal in the three weeks prior to the interview, and compared their views with those of respondents who experienced the same number of hot days but did not experience a heat wave.

Their estimates showed that the effect of a heat wave on opinion is even greater, increasing the share of Americans believing in global warming by 5.0 to 5.9 percentage points.

However, Egan and Mullin found the effects of temperature changes to be short-lived — even in the wake of heat waves. Americans who had been interviewed after 12 or more days had elapsed since a heat wave were estimated to have attitudes that were no different than those who had not been exposed to a heat wave.

“Under typical circumstances, the effects of temperature fluctuations on opinion are swiftly wiped out by new weather patterns,” they wrote. “More sustained periods of unusual weather cause attitudes to change both to a greater extent and for a longer period of time. However, even these effects eventually decay, leaving no long-term impact of weather on public opinion.”

The findings make an important contribution to the political science research on the relationship between personal experience and opinion on a larger issue, which has long been studied with varying results.

“On issues such as crime, the economy, education, health care, public infrastructure, and taxation, large shares of the public are exposed to experiences that could logically be linked to attitude formation,” the researchers wrote. “But findings from research examining how these experiences affect opinion have been mixed. Although direct experience — whether it be as a victim of crime, a worker who has lost a job or health insurance, or a parent with children in public schools — can influence attitudes, the impact of these experiences tends to be weak or nonexistent after accounting for typical predictors such as party identification and liberal-conservative ideology.”

“Our research suggests that personal experience has substantial effects on political attitudes,” Egan and Mullin concluded. “Rich discoveries await those who can explore these questions in ways that permit clean identification of these effects.”

Egan is an assistant professor in the Wilf Family Department of Politics at NYU, and Mullin is an associate professor in the Department of Political Science at Temple University.

Concerns Over Accuracy of Tools to Predict Risk of Repeat Offending (Science Daily)

ScienceDaily (July 24, 2012) — The study under review: “Use of risk assessment instruments to predict violence and antisocial behavior in 73 samples involving 24,827 people: systematic review and meta-analysis.”

Tools designed to predict an individual’s risk of repeat offending are not sufficient on their own to inform sentencing and release or discharge decisions, concludes a study published on the British Medical Journal website.

Although they appear to identify low risk individuals with high levels of accuracy, the authors say “their use as sole determinants of detention, sentencing, and release is not supported by the current evidence.”

Risk assessment tools are widely used in psychiatric hospitals and criminal justice systems around the world to help predict violent behavior and inform sentencing and release decisions. Yet their predictive accuracy remains uncertain and expert opinion is divided.

So an international research team, led by Seena Fazel at the University of Oxford, set out to investigate the predictive validity of tools commonly used to assess the risk of violence, sexual, and criminal behavior.

They analyzed risk assessments conducted on 24,827 people from 13 countries including the UK and the US. Of these, 5,879 (24%) offended over an average of 50 months.

Differences in study quality were taken into account to identify and minimize bias.

Their results show that risk assessment tools produce high rates of false positives (individuals wrongly identified as being at high risk of repeat offending) and predictive accuracy at around chance levels when identifying risky persons. For example, 41% of individuals judged to be at moderate or high risk by violence risk assessment tools went on to violently offend, while 23% of those judged to be at moderate or high risk by sexual risk assessment tools went on to sexually offend.

Of those judged to be at moderate or high risk of committing any offense, just over half (52%) did. However, of those predicted not to violently offend, 91% did not, suggesting that these tools are more effective at screening out individuals at low risk of future offending.
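The 41% and 91% figures for the violence tools are two sides of the same confusion matrix: positive and negative predictive value. Here is a short worked example in Python; the cohort of 1,000 people and its 400/600 split are hypothetical numbers invented to mirror the reported rates, not counts from the BMJ paper.

```python
# Hypothetical cohort constructed to mirror the reported rates; these are
# NOT the BMJ paper's actual counts.

def predictive_values(tp: int, fp: int, tn: int, fn: int):
    """Positive and negative predictive value from confusion-matrix counts."""
    ppv = tp / (tp + fp)  # of those flagged moderate/high risk, share who reoffended
    npv = tn / (tn + fn)  # of those judged low risk, share who did not reoffend
    return ppv, npv

# Suppose 400 of 1,000 assessed people are flagged moderate/high risk and 164
# of them go on to offend violently; of the 600 judged low risk, 546 do not.
ppv, npv = predictive_values(tp=164, fp=236, tn=546, fn=54)
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 41%, NPV = 91%
```

The asymmetry is the authors’ point: in this toy cohort 59% of the people flagged as risky never violently offend, which is why the tools look better suited to screening out low-risk individuals than to identifying high-risk ones.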

Factors such as gender, ethnicity, age or type of tool used did not appear to be associated with differences in predictive accuracy.

Although risk assessment tools are widely used in clinical and criminal justice settings, their predictive accuracy varies depending on how they are used, say the authors.

“Our review would suggest that risk assessment tools, in their current form, can only be used to roughly classify individuals at the group level, not to safely determine criminal prognosis in an individual case,” they conclude. The extent to which these instruments improve clinical outcomes and reduce repeat offending needs further research, they add.

Chris Hedges | Totalitarian Systems Always Begin by Rewriting the Law (Truth Out)

Monday, 26 March 2012 09:06 By Chris Hedges, Truthdig | Op-Ed

Chris Hedges speaks at Occupy DC, January 9, 2012. (Photo: Shrieking Tree)

I spent four hours in a third-floor conference room at 86 Chambers St. in Manhattan on Friday as I underwent a government deposition. Benjamin H. Torrance, an assistant U.S. attorney, carried out the questioning as part of the government’s effort to decide whether it will challenge my standing as a plaintiff in the lawsuit I have brought with others against President Barack Obama and Secretary of Defense Leon Panetta over the National Defense Authorization Act (NDAA), also known as the Homeland Battlefield Bill.

The NDAA implodes our most cherished constitutional protections. It permits the military to function on U.S. soil as a civilian law enforcement agency. It authorizes the executive branch to order the military to selectively suspend due process and habeas corpus for citizens. The law can be used to detain people deemed threats to national security, including dissidents whose rights were once protected under the First Amendment, and hold them until what is termed “the end of the hostilities.” Even the name itself—the Homeland Battlefield Bill—suggests the totalitarian concept that endless war has to be waged within “the homeland” against internal enemies as well as foreign enemies.

Judge Katherine B. Forrest, in a session starting at 9 a.m. Thursday in the U.S. District Court for the Southern District of New York, will determine if I have standing and if the case can go forward. The attorneys handling my case, Bruce Afran and Carl Mayer, will ask, if I am granted standing, for a temporary injunction against the Homeland Battlefield Bill. An injunction would, in effect, nullify the law and set into motion a fierce duel between two very unequal adversaries—on the one hand, the U.S. government and, on the other, myself, Noam Chomsky, Daniel Ellsberg, the Icelandic parliamentarian Birgitta Jónsdóttir and three other activists and journalists. All have joined me as plaintiffs and begun to mobilize resistance to the law through groups such as Stop NDAA.

The deposition was, as these things go, conducted civilly. Afran and Mayer, the attorneys bringing the suit on my behalf, were present. I was asked detailed questions by Torrance about my interpretation of Section 1021 and Section 1022 of the NDAA. I was asked about my relationships and contacts with groups on the U.S. State Department terrorism list. I was asked about my specific conflicts with the U.S. government when I was a foreign correspondent, a period in which I reported from El Salvador, Nicaragua, the Middle East, the Balkans and other places. And I was asked how the NDAA law had impeded my work.

It is in conference rooms like this one, where attorneys speak in the arcane and formal language of legal statutes, that we lose or save our civil liberties. The 2001 Authorization to Use Military Force Act, the employment of the Espionage Act by the Obama White House against six suspected whistle-blowers and leakers, and the Homeland Battlefield Bill have crippled the work of investigative reporters in every major newsroom in the country. Government sources that once provided information to counter official narratives and lies have largely severed contact with the press. They are acutely aware that there is no longer any legal protection for those who dissent or who expose the crimes of state. The NDAA threw in a new and dangerous component that permits the government not only to silence journalists but imprison them and deny them due process because they “substantially supported” terrorist groups or “associated forces.”

Those of us who reach out to groups opposed to the U.S. in order to explain them to the American public will not be differentiated from terrorists under this law. I know how vicious the government can be when it feels challenged by the press. I covered the wars in El Salvador and Nicaragua from 1983 to 1988. Press members who reported on the massacres and atrocities committed by the Salvadoran military, as well as atrocities committed by the U.S.-backed Contra forces in Nicaragua, were repeatedly denounced by senior officials in the Reagan administration as fellow travelers and supporters of El Salvador’s Farabundo Martí National Liberation Front (FMLN) rebels or the leftist Sandinista government in Managua, Nicaragua.

The Reagan White House, in one example, set up an internal program to distort information and intimidate and attack those of us in the region who wrote articles that countered the official narrative. The program was called “public diplomacy.” Walter Raymond Jr., a veteran CIA propagandist, ran it. The goal of the program was to manage “perceptions” about the wars in Central America among the public. That management included aggressive efforts to destroy the careers of reporters who were not compliant by branding them as communists or communist sympathizers. If the power to lock us up indefinitely without legal representation had been in the hands of Elliott Abrams or Oliver North or Raymond, any of them surely would have used it.

Little has changed. On returning not long after 9/11 from a speaking engagement in Italy I was refused entry into the United States by customs officials at the Newark, N.J., airport. I was escorted to a room filled with foreign nationals. I was told to wait. A supervisor came into the room an hour later. He leaned over the shoulder of the official seated at a computer in front of me. He said to this official: “He is on a watch. Tell him he can go.” When I asked for further information I was told no one was authorized to speak to me. I was handed my passport and told to leave the airport.

Glenn Greenwald, the columnist and constitutional lawyer, has done the most detailed analysis of the NDAA bill. He has pointed out that the crucial phrases are “substantially supported” and “associated forces.” These two phrases, he writes, allow the government to expand the definition of terrorism to include groups that were not involved in the 9/11 attacks and may not have existed when those attacks took place.

It is worth reading Sections 1021 and 1022 of the bill. Section 1021 of the NDAA “includes the authority for the Armed Forces of the United States to detain covered persons (as defined in subsection (b)) pending disposition under the law of war.” Subsection B defines covered persons like this: “(b) Covered Persons—A covered person under this section is any person as follows: (1) A person who planned, authorized, committed, or aided the terrorist attacks that occurred on September 11, 2001, or harbored those responsible for those attacks. (2) A person who was a part of or substantially supported Al-Qaeda, the Taliban, or associated forces that are engaged in hostilities against the U.S. or its coalition partners.” Section 1022, Subsection C, goes on to declare that covered persons are subject to: “(1) Detention under the law of war without trial until the end of the hostilities authorized by the Authorization for Use of Military Force.” And Section 1022, Subsection A, Item 4, allows the president to waive the requirement of legal evidence in order to condemn a person as an enemy of the state if that is believed to be in the “national security interests of the United States.”

The law can be used to detain individuals who are not members of terrorist organizations but have provided, in the words of the bill, substantial support even to “associated forces.” But what constitutes substantial? What constitutes support? What are these “associated forces”? What is defined under this law as an act of terror? What are the specific activities of those purportedly “engaged in hostilities against the United States”? None of this is answered. And this is why, especially as acts of civil disobedience proliferate, the NDAA law is so terrifying. It can be used by the military to seize and detain citizens and deny legal recourse to anyone who defies the corporate state.

Torrance’s questions to me about incidents that occurred during my reporting were typified by this back and forth, which I recorded:

Torrance: In paragraph eight of your declaration you refer to the type of journalism we have just been discussing, which conveyed opinions, programs and ideas as being brought within the scope of Section 1021’s provision defining a covered people as one who has substantially supported or directly supported the acts and activities of such individuals or organizations and allies of associated forces. Why do you believe journalistic activity could be brought within that statute?

Hedges: Because anytime a journalist writes and reports in a way that challenges the official government narrative they come under fierce attack.

Torrance: What kind of attack do they come under?

Hedges: It is a range. First of all, the propaganda attempts to discredit the reporting. It would be an attempt to discredit the individual reporter. It would be a refusal to intercede when allied governments physically detain and expel the reporter because of reporting that both that allied government and the United States did not want. And any foreign correspondent that is any good through their whole career has endured all of this.

Torrance: Remind me, the phrase you used that you believed would trigger that was “coverage disfavorable to the United States”?

Hedges: I didn’t say that.

Torrance: Remind me of the phrase.

Hedges: I said it was coverage that challenged the official narrative.

Torrance: Have you ever been detained by the United States government?

Hedges: Yes.

Torrance: When and where?

Hedges: The First Gulf War.

Torrance: What were the circumstances of that?

Hedges: I was reporting outside of the pool system.

Torrance: How did that come about that you were detained?

Hedges: I was discovered by military police without an escort.

Torrance: And they took you into custody?

Hedges: Yes.

Torrance: For how long?

Hedges: Not a long time. They seized my press credentials and they called Dhahran, which is where the sort of central operations were, and I was told that within a specified time—and I don’t remember what that time was—I had to report to the authorities in Dhahran.

Torrance: Where is Dhahran?

Hedges: Saudi Arabia.

Torrance: And that was a U.S. military headquarters of some sort?

Hedges: Well, it was the press operations run by the U.S. Army.

Torrance: And what was the asserted basis for detaining you?

Hedges: That I had been reporting without an escort.

Torrance: And was that a violation of some law or regulation that you know of?

Afran: Note, object to form. Laws and regulations are two different things.

Hedges: Not in my view. …

Torrance: Did the people who detained you specify any law or regulation that in their view you violated?

Hedges: Let me preface that by saying that as a foreign correspondent with a valid journalistic visa, which I had, in a country like Saudi Arabia, the United States does not have the authority to detain me or tell me what I can report on. They attempted to do that, but neither I [nor] The New York Times [my employer at the time] recognized their authority.

Torrance: When you obtained that journalistic visa did you agree to any conditions on what you would do or where you would be permitted to go?

Hedges: From the Saudis?

Torrance: The visa was issued by the Saudi government?

Hedges: Of course, I need a visa from the Saudi government to get into Saudi.

Torrance: Did you agree to any such conditions?

Hedges: No. Not with the Saudis.

Torrance: Were there any other journalists of which you were aware who [were] reporting outside of the pool system?

Hedges: Yes.

Torrance: Were they also detained, to your knowledge?

Hedges: Yes.

The politeness of the exchanges, the small courtesies extended when we needed a break, the idle asides that took place during the brief recesses, masked the deadly seriousness of the proceeding. If there is no rolling back of the NDAA law we cease to be a constitutional democracy.

Totalitarian systems always begin by rewriting the law. They make legal what was once illegal. Crimes become patriotic acts. The defense of freedom and truth becomes a crime. Foreign and domestic subjugation merges into the same brutal mechanism. Citizens are colonized. And it is always done in the name of national security. We obey the new laws as we obeyed the old laws, as if there was no difference. And we spend our energy and our lives appealing to a dead system.

Franz Kafka understood the totalitarian misuse of law, the ability of the state to make law serve injustice and yet be held up as the impartial arbiter of good and evil. In his novels “The Trial” and “The Castle,” Kafka presents pathetic supplicants before the law who are passed from one doorkeeper, administrator or clerk to the next in an endless and futile quest for justice. In the parable “Before the Law” the supplicant dies before even being permitted to enter the halls of justice. In Kafka’s dystopian vision, the law is the mechanism by which injustice and tyranny are perpetuated. A bureaucratic legal system uses the language of justice to defend injustice. The cowed populations in tyrannies become for Kafka so broken, desperate and passive that they are finally complicit in their own enslavement. The central character in “The Trial,” known as Josef K., offers little resistance at the end of the story when two men arrive to oversee his execution. Josef K. leads them to a quarry where he is expected to kill himself. He cannot. The men do it for him. His last words are: “Like a dog!”

A Century Of Weather Control (POP SCI)

Posted 7.19.12 at 6:20 pm – http://www.popsci.com

 

Keeping Pilots Updated, November 1930

It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spends annually for all of its work.”

About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype” – a printing telegraph, something like an early fax machine for typed text, that reproduced a typed message at the receiving end. Pilots were then radioed with the information.

From the article “Weather Man Makes the Air Safe.”

 

Battling Hail, July 1947

We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)

The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.

From the article “The War Against Hail.”

 

Hunting for a Tornado “Cure,” March 1958

1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”

To try to stop them, researchers wanted to learn more. Meteorologists asked Congress for $5 million more a year to study tornadoes whirling through the Midwest’s Tornado Alley and then, hopefully, learn what they needed to do to stop them.

From the article “What We’re Learning About Tornadoes.”

 

Spotting Clouds With Nimbus, November 1963

Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was discovered by satellite two days before any other method spotted it, leaving space engineers “justifiably proud.” The next satellite in line was the Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.

Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.

From the article “The Weather Eye That Never Blinks.”

 

Saving Money Globally With Forecasts, November 1970

Optimism for weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters predicted–how they “saved countless lives through early hurricane warnings”–and now even saying they’d save your vacation.

What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.

From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”

 

Extreme Weather Alerts on the Radio, July 1979

Those weather alerts that come on your television during a storm – or at least one radio version of those – were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But at this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.

From the article “Weather-Alert Radios–They Could Save Your Life.”

 

Stopping “Bolts From the Blue,” May 1990

Here Popular Science let loose a whopper for anyone with a fear of extreme weather: lightning kills a lot more people every year than you think, and sometimes a lightning bolt will come and hit you even when there’s not a storm. So-called “bolts from the blue” were a part of the story on better predicting lightning, a phenomenon more erratic than most types of weather. Improved sensors played a major part in better preparing people before a storm.

From the article “Predicting Deadly Lightning.”

 

Infrared Views of Weather, August 1983

Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”

From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”

 

Modernizing the National Weather Service, August 1997

A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.

From the article “Weather’s New Outlook.”

 

Modeling Weather With Computers, September 2001

Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.

Anarchists attack science (Nature)

Armed extremists are targeting nuclear and nanotechnology workers.

Leigh Phillips
28 May 2012

Investigations of the shooting of nuclear-engineering head Roberto Adinolfi have confirmed the involvement of an eco-anarchist group. P. RATTINI/AFP/GETTY

A loose coalition of eco-anarchist groups is increasingly launching violent attacks on scientists.

A group calling itself the Olga Cell of the Informal Anarchist Federation International Revolutionary Front has claimed responsibility for the non-fatal shooting of a nuclear-engineering executive on 7 May in Genoa, Italy. The same group sent a letter bomb to a Swiss pro-nuclear lobby group in 2011; attempted to bomb IBM’s nanotechnology laboratory in Switzerland in 2010; and has ties with a group responsible for at least four bomb attacks on nanotechnology facilities in Mexico. Security authorities say that such eco-anarchist groups are forging stronger links.

On 11 May, the cell sent a four-page letter to the Italian newspaper Corriere della Sera claiming responsibility for the shooting of Roberto Adinolfi, the chief executive of Ansaldo Nucleare, the nuclear-engineering subsidiary of aerospace and defence giant Finmeccanica. Believed by authorities to be genuine, the letter is riddled with anti-science rhetoric. The group targeted Adinolfi because he is a “sorcerer of the atom”, it wrote. “Adinolfi knows well that it is only a matter of time before a European Fukushima kills on our continent.”

“Science in centuries past promised us a golden age, but it is pushing us towards self-destruction and total slavery,” the letter continues. “With this action of ours, we return to you a tiny part of the suffering that you, man of science, are pouring into this world.” The group also threatened to carry out further attacks.

The Italian Ministry of the Interior has subsequently beefed up security at thousands of potential political, industrial and scientific targets. The measures include assigning bodyguards to 550 individuals.

The Olga Cell, named after an imprisoned Greek anarchist, is part of the Informal Anarchist Federation, which, in April 2011, claimed responsibility for sending a parcel bomb that exploded at the offices of the Swiss nuclear lobby group, Swissnuclear, in Olten. A letter found in the remains of the bomb demanded the release of three individuals who had been detained for plotting an attack on IBM’s flagship nanotechnology facility in Zurich earlier that year. In a situation report published this month, the Swiss Federal Intelligence Service explicitly linked the federation to the IBM attack.

The Informal Anarchist Federation argues that technology, and indeed civilization, is responsible for the world’s ills, and that scientists are the handmaidens of capitalism. “Finmeccanica means bio- and nanotechnology. Finmeccanica means death and suffering, new frontiers of Italian capitalism,” the letter reads.

Gathering momentum
The cell says that it is uniting with eco-anarchist groups in other countries, including Mexico, Chile, Greece and the United Kingdom. Mexico has already seen similar attacks: in August 2011, a group called Individuals Tending Towards Savagery sent a parcel bomb that wounded two nanotechnology researchers at the Monterrey Institute of Technology. One received burns to his legs and a perforated eardrum and the other had his lung pierced by shrapnel (G. Herrera Corral, Nature 476, 373; 2011). The package contained enough explosive to collapse part of the building, according to police, but failed to detonate properly.

Earlier that year, the same group sent two bombs to the nanotechnology facility at the Polytechnic University of the Valley of Mexico. One was intercepted before anyone could be harmed, but the second detonated, injuring a security guard. It is not clear how closely the group is tied to the Informal Anarchist Federation, but in online forums the two bodies offer “direct support” for each other’s activities and talk of a “blossoming” of a more organized eco-anarchist movement.

In the wake of the Mexican bombings, the Monterrey Institute installed metal detectors, began to use police sniffer dogs and started random inspections of vehicles and packages. After a letter bomb addressed to a nanotechnology researcher at the Polytechnic University of Pachuca in Hidalgo exploded in December last year, the institute installed a perimeter fence and scanners, and campuses across the state heightened security measures.

Italian police investigating the shooting say that they are concerned about the rise in violent action by anarchist groups amid Europe’s economic crisis. On 23 May, for example, members of the Informal Anarchist Federation attacked railway signals in Bristol, UK, causing severe transport delays. An online message from the group said that the targets had been chosen to disrupt employees of the Ministry of Defence and defence-technology businesses in the area, including Raytheon and QinetiQ.

The Swiss report also noted signs of “an increasing degree of international networking between perpetrators”. The level of risk to scientists depends on their field of work, says Simon Johner, a spokesman for the Swiss Federal Intelligence Service. “We are not able to tell them what to do. We can only make them aware of the dangers. It’s up to institutions to take preventative actions.” The agency is working with police forces, businesses and research communities to assess and tackle the threat.

“These people do not represent mainstream opinion. But I am still pretty frightened by this violence,” says Michael Hagmann, a biochemist and head of corporate communications for the Swiss Federal Laboratories for Materials Science and Technology near Zurich, a public-sector partner of the IBM facility that also does nanotechnology research.

“Just a few weeks after the attempted bombing, we were due to have a large conference on nanotechnology and we were really quite nervous” about going ahead with it, Hagmann says. “But we concluded that the public discussion was more important and didn’t want to scare people by having 20 police guarding us. It would have sent the wrong message.”

Nature 485, 561 (31 May 2012) doi:10.1038/485561a

*   *   *

Published online 22 August 2011 | Nature 476, 373 (2011) | doi:10.1038/476373a

Column: World View

Stand up against the anti-technology terrorists

Home-made bombs are being sent to physicists in Mexico. Colleagues around the world should ensure their own security, urges Gerardo Herrera Corral.

Gerardo Herrera Corral

My elder brother, Armando Herrera Corral, was this month sent a tube of dynamite by terrorists who oppose his scientific research. The home-made bomb, which was in a shoe-box-sized package labelled as an award for his personal attention, exploded when he pulled at the adhesive tape wrapped around it. My brother, director of the technology park at the Monterrey Institute of Technology in Mexico, was standing at the time, and suffered burns to his legs and a perforated eardrum. More severely injured by the blast was his friend and colleague Alejandro Aceves López, whom my brother had gone to see in his office to share a cup of coffee and open the award. Aceves López was sitting down when my brother opened the package; he took the brunt of the explosion in his chest, and shrapnel pierced one of his lungs.

Both scientists are now recovering from their injuries, but they were extremely fortunate to survive. The bomb failed to go off properly, and only a fraction of the 20-centimetre-long cylinder of dynamite ignited. The police estimate that the package contained enough explosive to take down part of the building, had it worked as intended.

The next day, I, too, was sent a suspicious package. I have been advised by the police not to offer details of why the package was judged of concern, but it arrived by an unusual procedure, and on a Sunday. It tested positive for explosives, and was taken away by the bomb squad, which declared a false alarm after finding that the parcel contained only books. My first reaction was to leave the country. Now, I am confused as to how I should respond.

As an academic scientist, why was my brother singled out in this way? He does not work in a field that is usually considered high-risk for terrorist activity, such as medical research on animals. He works on computer science, and Aceves López is an expert in robotics. I am a high-energy physicist and coordinate the Mexican contribution to research using the Large Hadron Collider at CERN, Europe’s particle-physics laboratory; I have worked in the field for 15 years.

An extremist anarchist group known as Individuals Tending to Savagery (ITS) has claimed responsibility for the attack on my brother. This is confirmed by a partially burned note found by the authorities at the bomb site, signed by the ITS and with a message along the lines of: “If this does not get to the newspapers we will produce more explosions. Wounding or killing teachers and students does not matter to us.”

In statements posted on the Internet, the ITS expresses particular hostility towards nanotechnology and computer scientists. It claims that nanotechnology will lead to the downfall of mankind, and predicts that the world will become dominated by self-aware artificial-intelligence technology. Scientists who work to advance such technology, it says, are seeking to advance control over people by ‘the system’. The group praises Theodore Kaczynski, the Unabomber, whose anti-technology crusade in the United States in 1978–95 killed three people and injured many others.

The group’s rhetoric is absurd, but I urge colleagues around the world to take the threat that it poses to researchers seriously. Information gathered by Mexican federal authorities and Interpol link it to actions in countries including Spain, France and Chile. In April this year, the ITS sent a bomb — similar to the one posted to my brother — to the head of the Nanotechnology Engineering Division at the Polytechnic University of Mexico Valley in Tultitlan, although that device did not explode. In May, the university received a second parcel bomb, with a message reading: “This is not a joke: last month we targeted Oscar Camacho, today the institution, tomorrow who knows? Open fire on nanotechnology and those who support it!”

“I believe that terror should not succeed in establishing fear and imposing conduct.”

The scientific community must be made aware of such organizations, and of their capacity for destruction. Nanotechnology-research institutes and departments, companies and professional associations must beef up their security procedures, particularly on how they receive and accept parcels and letters.

I would like to stand up and speak in this way because I believe that terror should not succeed in establishing fear and imposing conduct that takes us far from the freedom we enjoy. I would like the police to take these events seriously; they are becoming a real threat to society. I would also like to express my solidarity with the Monterrey Institute of Technology — the institution that gave me both financial support to pursue my undergraduate studies and high-level academic training.

To oppose technology is not an unacceptable way to think. We may well debate the desirability of further technical development in our society. Yet radical groups such as the ITS overlook a crucial detail: it is not technology that is the problem, but how we use it. After Alfred Nobel invented dynamite he became a rich man, because it found use in mining, quarrying, construction and demolition. But people can also decide to put dynamite into a parcel and address it to somebody with the intention of killing them.

Gerardo Herrera Corral is a physicist at the Research and Advanced Studies Centre of the National Polytechnic Institute of Mexico in Mexico City.

Climate Change Strikes Especially Hard Blow to Native Americans (PBS)

CLIMATE CHANGE — July 19, 2012 at 3:42 PM EDT

BY: SASKIA DE MELKER AND REBECCA JACOBSON


On Thursday’s NewsHour, NewsHour correspondent Hari Sreenivasan moderated a panel discussion on how Native American tribes are coping with climate change.

The panel included four native leaders representing their communities at the First Stewards symposium.

When we began our NewsHour coverage on communities across the United States coping with climate change, we didn’t plan to focus on Native American tribes. But we soon realized that indigenous communities are on the frontlines of America’s climate-related dangers.

Native Americans make up about one percent of the United States population, but they manage more than 95 million acres of land. Their reservations lie in some of the most diverse ecosystems in the country, ranging from Alaska to the coasts of Florida. That diversity – both geographic and cultural – makes them a sort of demographic microcosm of the United States. That means the climate shifts that they are feeling now could give clues to what other Americans might see in the near future.

Recent studies, including those from the National Wildlife Federation, the EPA, and the USDA, highlight the disproportionate vulnerability of tribes to climate-related hazards such as coastal erosion, rising temperatures and extreme weather. Tribes depend on the land and natural resources for their culture and livelihood. What’s more, reservations often have high rates of poverty and unemployment and a lack of resources that would allow them to adapt to long-term climate changes.

We’ve reported on how rising seas threaten tribal land along the Louisiana coast. We’ve looked at the impact of a depleted salmon population on Northwest tribes. And we recently visited Washington state’s Quileute tribe, which has fought to reclaim land threatened by floods and sea level rise.

View photo essay

Relocating to adapt to environmental threats or disasters is not always a viable option for tribes, both because of their connection to their origins and because they may lack the resources needed to move, said Larry Wasserman, environmental policy manager for the Swinomish tribe in the Pacific Northwest.

“Rather than being a mobile society that can move away from climatic changes, they need to think about how do they stay on this piece of ground and continue to live the lifestyle that they’ve been able to live, and how can their great-great-great-grandchildren do that,” Wasserman said.

Tony Foster, chairman of the Quileute Nation, said that native people are in tune with the climate of their homelands and know early on when the balance of the ecosystem has been disrupted. “The Quileute has been here for over 10,000 years,” he said. “We know the layout of the land, and we know the conditions of our environment.”

“Traditional values teach us to be good ancestors,” added Micah McCarty, chairman of the Makah Tribe in Neah Bay, Washington. “Future generations are going to look back at us and say, ‘What did you do about this?'”

That forward thinking is necessary for planning for climate change, which is defined over at least a 30-year range and is often modeled on time scales extending hundreds of years into the future.

And Jeff Mears, member and environmental area manager for the Oneida tribe in Wisconsin, said it’s important that the tribes are defined by more than their past.

Because many tribes have a unique status as sovereign nations, they can also implement their own initiatives and models for managing their environment. The Swinomish tribe, for example, has developed its own climate adaptation plan.

Tribal governments also want more say at the federal level when it comes to addressing climate change.

There needs to be more “recognition from western science of the value of traditional ecological knowledge,” McCarty said. “So we need to look at how we can better inform the government of what tribal leaders bring to the table in regard to responding to climate change.”

And that’s the aim of a gathering to be held at the Smithsonian’s National Museum of the American Indian in Washington D.C. this week. The First Stewards symposium will bring together hundreds of indigenous tribal elders, leaders, and scientists from across America to discuss how best to confront past, present, and future adaptation to climate change.

See all of our coverage of how Native American communities are coping with climate change:

Native Lands Wash Away as Sea Levels Rise

Native Americans’ tribal lands along the Louisiana coast are washing away as sea levels rise and marshes sink. We report from Isle de Jean Charles, a community that is slowly disappearing into the sea.

The Northwest’s Salmon People Face a Salmon-less Future

For Northwest tribes, fishing for salmon is more than a food source, it’s a way of life. Now the climate may push the fish towards extinction. Together with KCTS 9 and EarthFix, NewsHour recently visited the Swinomish Indian reservation to see how they are coping.

Climate Change Threatens the ‘Twilight’ Tribe

Washington’s Quileute tribe, thrust into the spotlight by the “Twilight” series, has been caught in a struggle to reclaim land threatened by floods and sea level rise. Together with KCTS9 and EarthFix, NewsHour visited the tribe to hear their story.

IMF’s Peter Doyle scorns its ‘tainted’ leadership (BBC)

20 July 2012 Last updated at 11:50 GMT

Peter Doyle claims there was a “fundamental illegitimacy” in Christine Lagarde’s appointment

A top economist at the International Monetary Fund has poured scorn on its “tainted” leadership and said he is “ashamed” to have worked there.

Peter Doyle said in a letter to the IMF executive board that he wanted to explain his resignation after 20 years.

He writes of “incompetence”, “failings” and “disastrous” appointments for the IMF’s managing director, stretching back 10 years.

No one from the Washington-based IMF was immediately available for comment.

Mr Doyle, former adviser to the IMF’s European Department, which is running the bailout programs for Greece, Portugal and Ireland, said the Fund’s delay in warning about the urgency of the global financial crisis was a failure of the “first order”.

In the letter, dated 18 June and obtained by the US broadcaster CNN, Mr Doyle said the failings of IMF surveillance of the financial crisis “are, if anything, becoming more deeply entrenched”.

He writes: “This fact is most clear in regard to appointments for managing director which, over the past decade, have all-too-evidently been disastrous.

“Even the current incumbent [Christine Lagarde] is tainted, as neither her gender, integrity, or elan can make up for the fundamental illegitimacy of the selection process.”

Mr Doyle is thought to be echoing widespread criticism that the head of the IMF is always a European, while the World Bank chief is always a US appointee.

Mr Doyle concludes his letter: “There are good salty people here. But this one is moving on. You might want to take care not to lose the others.”

The IMF could not be reached immediately by the BBC. However, CNN reported that a Fund spokesman told it that there was nothing to substantiate Mr Doyle’s claims and that the IMF had held its own investigations into surveillance of the financial crisis.

Analysis

Andrew Walker, BBC World Service Economics correspondent

Peter Doyle’s letter is short but its criticism is excoriating. Perhaps the bigger of the two main charges is that the IMF failed to warn sufficiently about the problems that led to the global financial crisis.

The IMF has had investigations which have, up to a point, made similar criticisms, but not in such inflammatory terms. The IMF did issue some warnings, but the allegation that they were not sustained or timely enough and were actively suppressed raises some very big questions about the IMF’s role.

Then there is the description of the managing director as tainted. It’s not personal. It’s a familiar attack on a process which always selects a European. It’s still striking, though, to hear it from someone so recently on the inside.


Disorderly Conduct: Probing the Role of Disorder in Quantum Coherence (Science Daily)

ScienceDaily (July 19, 2012) — A new experiment conducted at the Joint Quantum Institute (JQI) examines the relationship between quantum coherence, an important aspect of certain materials kept at low temperature, and the imperfections in those materials. These findings should be useful in forging a better understanding of disorder, and in turn in developing better quantum-based devices, such as superconducting magnets.

Figure 1 (top): Two thin planes of cold atoms are held in an optical lattice by an array of laser beams. Still another laser beam, passed through a diffusing material, adds an element of disorder to the atoms in the form of a speckle pattern. Figure 2 (bottom): Interference patterns resulting when the two planes of atoms are allowed to collide. In (b) the amount of disorder is just right and the pattern is crisp. In (c) too much disorder has begun to wash out the pattern. In (a) the pattern is complicated by the presence of vortices among the atoms, which are hard to see in this image taken from the side. (Credit: Matthew Beeler)

Most things in nature are imperfect at some level. Fortunately, imperfections — a departure, say, from an orderly array of atoms in a crystalline solid — are often advantageous. For example, copper wire, which carries so much of the world’s electricity, conducts much better if at least some impurity atoms are present.

In other words, a pinch of disorder is good. But there can be too much of this good thing. The issue of disorder is so important in condensed matter physics, and so difficult to understand directly, that for some years scientists have been using thin vapors of cold atoms to simulate the behavior of electrons flowing through solids trillions of times more dense. With their ability to control the local forces on these atoms, physicists hope to shed light on the more complicated case of solids.

That’s where the JQI experiment comes in. Specifically, Steve Rolston and his colleagues have set up an optical lattice of rubidium atoms held at a temperature close to absolute zero. In such a lattice, atoms are held in orderly proximity not by natural inter-atomic forces but by the forces exerted by an array of laser beams. These atoms, moreover, constitute a Bose-Einstein condensate (BEC), a special condition in which they all belong to a single quantum state.

This is appropriate since the atoms are meant to be a proxy for the electrons flowing through a solid superconductor. In some so-called high-temperature superconductors (HTSC), the electrons move in planes of copper and oxygen atoms. These HTSC materials work, however, only if a fillip of impurity atoms, such as barium or yttrium, is present. Theorists have not adequately explained why this bit of disorder in the underlying material should be necessary for attaining superconductivity.

The JQI experiment has tried to supply palpable data that can illuminate the issue of disorder. In solids, atoms are a fraction of a nanometer (billionth of a meter) apart. At JQI the atoms are about a micron (a millionth of a meter) apart. Actually, the JQI atom swarm consists of a 2-dimensional disk. “Disorder” in this disk consists not of impurity atoms but of “speckle.” When a laser beam strikes a rough surface, such as a cinderblock wall, it is scattered in a haphazard pattern. This visible speckle effect is what is used to slightly disorganize the otherwise perfect arrangement of Rb atoms in the JQI sample.

In superconductors, the slight disorder in the form of impurities ensures a very orderly “coherence” of the supercurrent. That is, the electrons moving through the solid flow as a single coordinated train of waves and retain their cohesiveness even in the midst of impurity atoms.

In the rubidium vapor, analogously, the slight disorder supplied by the speckle laser ensures that the Rb atoms retain their coordinated participation in the unified (BEC) quantum wave structure. But only up to a point. If too much disorder is added — if the speckle is too large — then the quantum coherence can go away. Probing this transition numerically was the object of the JQI experiment. The setup is illustrated in figure 1.

And how do you know when you’ve gone too far with the disorder? How do you know that quantum coherence has been lost? By making coherence visible.

The JQI scientists cleverly pry their disk-shaped gas of atoms into two parallel sheets, looking like two thin crepes, one on top of the other. Thereafter, if all the laser beams are turned off, the two planes will collide like miniature galaxies. If the atoms are in a coherent condition, their collision will result in a crisp interference pattern showing up on a video screen as a series of high-contrast dark and light stripes.

If, however, the imposed disorder is too high, resulting in a loss of coherence among the atoms, then the interference pattern will be washed out. Figure 2 shows this effect at work. Frames b and c respectively show what happens when the degree of disorder is just right and when it is too much.
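The washing-out of the fringes is easy to demonstrate numerically. Here is a minimal toy model in Python (our illustration, not the JQI analysis): each small patch of the colliding clouds contributes a fringe with a local phase, and the averaged pattern loses contrast once the random spread of those phases grows.

    import numpy as np

    # Toy model: each patch contributes a fringe 1 + cos(k*x + phi) with a
    # local phase phi; disorder is a random spread in phi. Averaging over
    # many patches washes the pattern out once the spread is large.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 1000)
    k = 2.0 * np.pi  # fringe wavenumber, arbitrary units

    def fringe_contrast(phase_spread, n_patches=500):
        phases = rng.normal(0.0, phase_spread, n_patches)
        pattern = np.mean([1.0 + np.cos(k * x + p) for p in phases], axis=0)
        return (pattern.max() - pattern.min()) / (pattern.max() + pattern.min())

    for spread in (0.1, 1.0, 3.0):  # radians
        print(f"phase spread {spread:.1f} rad -> contrast {fringe_contrast(spread):.2f}")

For Gaussian phase noise the contrast falls off as exp(-sigma^2/2): the crisp stripes of frame (b) correspond to a small spread, the washed-out frame (c) to a large one.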

“Disorder figures in about half of all condensed matter physics,” says Steve Rolston. “What we’re doing is mimicking the movement of electrons in 3-dimensional solids using cold atoms in a 2-dimensional gas. Since there don’t seem to be any theoretical predictions to help us understand what we’re seeing we’ve moved into new experimental territory.”

Where does the JQI work go next? Well, in figure 2a you can see that the interference pattern is still visible but somewhat garbled. That arises from the fact that for this amount of disorder several vortices — miniature whirlpools of atoms — have sprouted within the gas. Exactly such vortices among electrons emerge in superconductivity, limiting their ability to maintain a coherent state.

The new results are published in the New Journal of Physics: “Disorder-driven loss of phase coherence in a quasi-2D cold atom system,” by M C Beeler, M E W Reed, T Hong, and S L Rolston.

Another of the JQI scientists, Matthew Beeler, underscores the importance of understanding the transition from the coherent state to incoherent state owing to the fluctuations introduced by disorder: “This paper is the first direct observation of disorder causing these phase fluctuations. To the extent that our system of cold atoms is like a HTSC superconductor, this is a direct connection between disorder and a mechanism which drives the system from superconductor to insulator.”

Global CO2 Emissions Continued to Increase in 2011, With Per Capita Emissions in China Reaching European Levels (Science Daily)

ScienceDaily (July 19, 2012) — Global emissions of carbon dioxide (CO2) — the main cause of global warming — increased by 3% last year, reaching an all-time high of 34 billion tonnes in 2011. In China, the world’s most populous country, average emissions of CO2 increased by 9% to 7.2 tonnes per capita. China is now within the range of 6 to 19 tonnes per capita emissions of the major industrialised countries. In the European Union, CO2 emissions dropped by 3% to 7.5 tonnes per capita. The United States remains one of the largest emitters of CO2, with 17.3 tonnes per capita, despite a decline due to the recession in 2008-2009, high oil prices and an increased share of natural gas.

These are the main findings of the annual report ‘Trends in global CO2 emissions’, released July 19 by the European Commission’s Joint Research Centre (JRC) and the Netherlands Environmental Assessment Agency (PBL).

Based on recent results from the Emissions Database for Global Atmospheric Research (EDGAR) and latest statistics on energy use and relevant activities such as gas flaring and cement production, the report shows that global CO2 emissions continued to grow in 2011, despite reductions in OECD countries. Weak economic conditions, a mild winter, and energy savings stimulated by high oil prices led to a decrease of 3% in CO2 emissions in the European Union and of 2% in both the United States and Japan. Emissions from OECD countries now account for only one third of global CO2 emissions — the same share as that of China and India combined, where emissions increased by 9% and 6% respectively in 2011. Economic growth in China led to significant increases in fossil fuel consumption driven by construction and infrastructure expansion. The growth in cement and steel production caused China’s domestic coal consumption to increase by 9.7%.

The 3% increase in global CO2 emissions in 2011 is above the past decade’s average annual increase of 2.7%, with a decrease in 2008 and a surge of 5% in 2010. The top emitters contributing to the 34 billion tonnes of CO2 emitted globally in 2011 are: China (29%), the United States (16%), the European Union (11%), India (6%), the Russian Federation (5%) and Japan (4%).

Cumulative CO2 emissions call for action

An estimated cumulative global total of 420 billion tonnes of CO2 were emitted between 2000 and 2011 due to human activities, including deforestation. Scientific literature suggests that limiting the rise in average global temperature to 2°C above pre-industrial levels — the target internationally adopted in UN climate negotiations — is possible only if cumulative CO2 emissions in the period 2000-2050 do not exceed 1,000 to 1,500 billion tonnes. If the current global trend of increasing CO2 emissions continues, cumulative emissions will surpass this limit within the next two decades.
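The “next two decades” claim can be checked with back-of-envelope arithmetic using only the figures quoted above. A rough sketch in Python, assuming emissions keep growing from the 2011 level of 34 billion tonnes at the decade-average rate of 2.7% per year (that growth assumption is ours, not a projection taken from the report):

    # 420 Gt CO2 already emitted over 2000-2011; the budget's lower bound
    # is 1,000 Gt. Grow 2011 emissions at 2.7% per year and accumulate.
    emitted, annual, year = 420.0, 34.0, 2011
    while emitted < 1000.0:
        year += 1
        annual *= 1.027
        emitted += annual
    print(year)  # prints 2025 under these assumptions

On these assumptions the lower bound of the budget is crossed around 2025, comfortably within the report’s two decades.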

Fortunately, this trend is being mitigated by the expansion of renewable energy supplies, especially solar and wind energy and biofuels. The global share of these so-called modern renewables, which exclude hydropower, is growing at an accelerating pace and quadrupled from 1992 to 2011. This potentially represents about 0.8 billion tonnes of CO2 emissions avoided as a result of using renewable energy supplies in 2011, which is close to Germany’s total CO2 emissions in 2011.

“Trends in global CO2 emissions” report: http://edgar.jrc.ec.europa.eu/CO2REPORT2012.pdf

Society’s Response to Climate Change Is Critical (Science Daily)

ScienceDaily (July 18, 2012) — Lancaster University (UK) scientists have proposed a new way of considering society’s reactions to global warming by linking societal actions to temperature change.

Using this framework to analyse climate change policies aimed at avoiding dangerous climate change, they suggest that society will have to become fifty times more responsive to global temperature change than it has been since 1990.

The researchers, Dr Andy Jarvis, Dr David Leedal and Professor Nick Hewitt from the Lancaster Environment Centre, also show that if global energy use continues to grow as it has done historically, society would have to raise its decarbonization rate from its historic (160-year) value of 0.6% per year to 13% per year.
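To see what that jump means, here is a quick compounding comparison in Python (our arithmetic over an illustrative 40-year horizon, not a calculation from the paper):

    # Carbon intensity remaining after 40 years at each decarbonization rate.
    for annual_cut in (0.006, 0.13):  # 0.6%/yr historic vs 13%/yr required
        remaining = (1.0 - annual_cut) ** 40
        print(f"{annual_cut:.1%} per year for 40 years leaves "
              f"{remaining:.1%} of today's carbon intensity")

At the historic rate, four decades of effort still leaves nearly 80% of today’s carbon intensity in place; at the required rate, less than half a percent remains.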

Dr Andy Jarvis said: “In order to avoid dangerous climate change, society will have to become much more responsive to the risks and damages that growth in global greenhouse gas emissions impose.”

The research, published in Nature Climate Change on 15 July, found that the global growth of new renewable sources of energy since 1990 constitutes a climate-society feedback of a quarter of a percent per year in the growth rate of CO2 emissions per degree of temperature rise.

Professor Nick Hewitt said: “If left unmanaged, the climate damages that we experience will motivate society to act to a greater or lesser degree. This could either amplify the growth in greenhouse gas emissions as we repair these damages or dampen them through loss of economic performance. Both are unpredictable and potentially dangerous.”

In Rousseau’s footsteps: David Graeber and the anthropology of unequal society (The Memory Bank)

http://thememorybank.co.uk

By Keith Hart

July 4, 2012, 11:14 pm

A review of David Graeber Debt: The first 5,000 years (Melville House, New York, 2011, 534 pages)

Debt is everywhere today. What is “sovereign debt” and why must Greece pay up, but not the United States? Who decides that the national debt will be repaid through austerity programmes rather than job-creation schemes? Why do the banks get bailed out, while students and home-owners are forced to repay loans? The very word debt speaks of unequal power; and the world economic crisis since 2008 has exposed this inequality more than any other since the 1930s. David Graeber has written a searching book that aims to place our current concerns within the widest possible framework of anthropology and world history. He starts from a question: why do we feel that we must repay our debts? This is a moral issue, not an economic one. In market logic, the cost of bad loans should be met by creditors as a discipline on their lending practices. But paying back debts is good for the powerful few, whereas the mass of debtors have at times sought and won relief from them.

What is debt? According to Graeber, it is an obligation with a figure attached and hence debt is inseparable from money. This book devotes a lot of attention to where money comes from and what it does. States and markets each play a role in its creation, but money’s form has fluctuated historically between virtual credit and metal currency. Above all Graeber’s enquiry is framed by our unequal world as a whole. He resists the temptation to offer quick remedies for collective suffering, since this would be inconsistent with the timescale of his argument. Nevertheless, readers are offered a worldview that clearly takes the institutional pillars of our societies to be rotten and deserving of replacement. It is a timely and popular view. Debt: The first 5,000 years is an international best-seller. The German translation recently sold 30,000 copies in the first two weeks.

I place the book here in a classical tradition that I call “the anthropology of unequal society” (Hart 2006), before considering what makes David Graeber a unique figure in contemporary intellectual politics. A summary of the book’s main arguments is followed by a critical assessment, focusing on the notion of a “human economy”.

The anthropology of unequal society

Modern anthropology was born to serve the coming democratic revolution against the Old Regime. A government by the people for the people should be based on what they have in common, their “human nature” or “natural rights”. Writers from John Locke (1690) to Karl Marx (1867) identified the contemporary roots of inequality with money’s social dominance, a feature that we now routinely call “capitalism”. For Locke money was a store of wealth that allowed some individuals to accumulate property far beyond their own immediate needs. For Marx “capital” had become the driving force subordinating the work of the many to machines controlled by a few. In both cases, accumulation dissolved the old forms of society, but it also generated the conditions for its own replacement by a more just society, a “commonwealth” or “communism”. It was, however, the philosophers of the eighteenth-century liberal enlightenment who developed a systematic approach to anthropology as an intellectual source for remaking the modern world.

Following Locke’s example, they wanted to found democratic societies in place of the class system typical of agrarian civilizations. How could arbitrary social inequality be abolished and a more equal society founded on their common human nature? Anthropology was the means of answering that question. The great Victorian synthesizers, such as Morgan, Tylor and Frazer, stood on the shoulders of predecessors motivated by an urgent desire to make world society less unequal. Kant’s Anthropology from a Pragmatic Point of View, a best-seller when published in 1798, was the culmination of that Enlightenment project; but it played almost no part in the subsequent history of the discipline. The main source for nineteenth-century anthropology was rather Jean-Jacques Rousseau. He revolutionized our understanding of politics, education, sexuality and the self in four books published in the 1760s: The Social Contract, Emile, Julie and The Confessions. He was forced to flee for his life from hit squads encouraged by the church. But he made his reputation earlier through two discourses of which the second, Discourse on the Origins and Foundations of Inequality among Men (1754), deserves to be seen as the source for an anthropology that combines the critique of unequal society with a revolutionary politics of democratic emancipation.

Rousseau was concerned here not with individual variations in natural endowments which we can do little about, but with the conventional inequalities of wealth, honour and the capacity to command obedience which can be changed. In order to construct a model of human equality, he imagined a pre-social state of nature, a sort of hominid phase of human evolution in which men were solitary, but healthy, happy and above all free. This freedom was metaphysical, anarchic and personal: original human beings had free will, they were not subject to rules of any kind and they had no superiors. At some point humanity made the transition to what Rousseau calls “nascent society”, a prolonged period whose economic base can best be summarized as hunter-gathering with huts. This second phase represents his ideal of life in society close to nature.

The rot set in with the invention of agriculture or, as Rousseau puts it, wheat and iron. Here he contradicted both Hobbes and Locke. The formation of a civil order (the state) was preceded by a war of all against all marked by the absence of law, which Rousseau insisted was the result of social development, not an original state of nature. Cultivation of the land led to incipient property institutions which, far from being natural, contained the seeds of entrenched inequality. Their culmination awaited the development of political society. He believed that this new social contract was probably arrived at by consensus, but it was a fraudulent one in that the rich thereby gained legal sanction for transmitting unequal property rights in perpetuity. From this inauspicious beginning, political society then usually moved, via a series of revolutions, through three stages:

The establishment of law and the right of property was the first stage, the institution of magistrates the second and the transformation of legitimate into arbitrary power the third and last stage. Thus the status of rich and poor was authorized by the first epoch, that of strong and weak by the second and by the third that of master and slave, which is the last degree of inequality and the stage to which all the others finally lead, until new revolutions dissolve the government altogether and bring it back to legitimacy (Rousseau 1984:131).

One-man-rule closes the circle. “It is here that all individuals become equal again because they are nothing, here where subjects have no longer any law but the will of the master” (Ibid: 134). For Rousseau, the growth of inequality was just one aspect of human alienation in civil society. We need to return from division of labour and dependence on the opinion of others to subjective self-sufficiency. His subversive parable ends with a ringing indictment of economic inequality which could well serve as a warning to our world. “It is manifestly contrary to the law of nature, however defined… that a handful of people should gorge themselves with superfluities while the hungry multitude goes in want of necessities” (Ibid: 137).

Lewis H. Morgan (1877) drew on Rousseau’s model for his own fiercely democratic synthesis of human history, Ancient Society, which likewise used an evolutionary classification that we now call bands, tribes and states, each stage more unequal than the one before. Morgan’s work is normally seen as the launch of modern anthropology proper because of his ability to enrol contemporary ethnographic observations of the Iroquois in an analysis of the historical structures underlying western civilization’s origins in Greece and Rome. Marx and Engels enthusiastically took up Morgan’s work as confirmation of their own critique of the state and capitalism; and the latter, drawing on Marx’s extensive annotations of Ancient Society, made the argument more accessible as The Origin of the Family, Private Property and the State (1884). Engels’s greater emphasis on gender inequality made this a fertile source for the feminist movement in the 1960s and after.

The traditional home of inequality is supposed to be India and Andre Beteille, in Inequality among Men (1977) and other books, has made the subject his special domain, merging social anthropology with comparative sociology. In the United States, Leslie White at Michigan and Julian Steward at Columbia led teams, including Wolf, Sahlins, Service, Harris and Mintz, who took the evolution of the state and class society as their chief focus. Probably the single most impressive work coming out of this American school was Eric Wolf’s Europe and the People without History (1982). But one man tried to redo Morgan in a single book and that was Claude Lévi-Strauss in The Elementary Structures of Kinship (1949). In Tristes Tropiques (1955), Lévi-Strauss acknowledged Rousseau as his master. The aim of Elementary Structures was to revisit Morgan’s three-stage theory of social evolution, drawing on a new and impressive canvas, “the Siberia-Assam axis” and all points southeast as far as the Australian desert. Lévi-Strauss took as his motor of development the forms of marriage exchange and the logic of exogamy. The “restricted reciprocity” of egalitarian bands gave way to the unstable hierarchies of “generalized reciprocity” typical of the Highland Burma tribes. The stratified states of the region turned inwards to endogamy, to the reproduction of class differences and the negation of social reciprocity.

Jack Goody has tried to lift our profession out of a myopic ethnography into an engagement with world history that went out of fashion with the passing of the Victorian founders. Starting with Production and Reproduction (1976), he has produced a score of books over the last three decades investigating why Sub-Saharan Africa differs so strikingly from the pre-industrial societies of Europe and Asia, with a later focus on refuting the West’s claim to being exceptional, especially when compared with Asia (Hart 2006, 2011).  The common thread of Goody’s compendious work links him through the Marxist pre-historian Gordon Childe (1954) to Morgan-Engels and ultimately Rousseau. The key to understanding social forms lies in production, which for us means machine production. Civilization or human culture is largely shaped by the means of communication — once writing, now an array of mechanized forms. The site of social struggles is property, now principally conflicts over intellectual property. And his central issue of reproduction has never been more salient than at a time when the aging citizens of rich countries depend on the proliferating mass of young people out there. Kinship needs to be reinvented too.

David Graeber: the first 50 years

Graeber brings his own unique combination of interests and engagements to renewing this “anthropology of unequal society”. Who is he? He spent the 1960s as the child of working-class intellectuals and activists in New York and was a teenager in the 1970s, which turned out to be the hinge decade of our times, leading to a “neoliberal” counter-revolution against post-war social democracy. This decade was framed at one end by the US dollar being taken off the gold standard in 1971 and at the other by a massive interest rate increase in 1979 induced by a second oil price hike. The world economy has been depressed ever since, especially at its western core. Graeber says that he embraced anarchism at sixteen.

The debt crisis of the 1980s was triggered by irresponsible lending of the oil surplus by western banks to Third World kleptocrats (Hart 2000: 142-143) and by the new international regime of high interest rates. In market theory, bad loans are supposed to discipline lenders, but the IMF and World Bank insisted on every penny of added interest being repaid by the governments of poor countries. This was also the time when structural adjustment policies forced those governments to open up their national economies to the free flow of money and commodities, with terrible consequences for public welfare programmes and jobs. If the anti-colonial revolution inspired my generation in the 1960s, Graeber’s internationalism was shaped by this wholesale looting of the successor states. He took an active part in demonstrations against this new phase of “financial globalization”, a phenomenon now often referred to as the “alter-globalization movement” (Pleyers 2010), but he and his fellow activists call it the “global justice movement”. Its public impact peaked in the years following the financial crisis of 1997-98 (involving Southeast Asia, Russia, Brazil and the failure of a US hedge fund, Long-Term Capital Management), notably through mass mobilizations in Seattle, Genoa and elsewhere. In the Debt book, Graeber claims that they took on the IMF and won.

David Graeber received a doctorate in anthropology from the University of Chicago based on ethnographic and historical research on a former slave village in Madagascar. This was eventually published as a long and exemplary monograph, Lost People: Magic and the legacy of slavery in Madagascar (Graeber 2007a). The history of the slave trade, colonialism and the post-colony figure prominently in how he illustrates global inequality through a focus on debt. Before that, he published a strong collection of essays on value, Toward an Anthropological Theory of Value: The false coin of our own dreams (Graeber 2001), in which he sought to relate economic value (especially value as measured impersonally by money) and the values that shape our subjectivity in society. This hinged on revisiting both Karl Marx and Marcel Mauss, providing the main account in English of how the latter’s cooperative socialism shaped his famous work on the gift (Mauss 1925). A theme of both books is the role of magic and money fetishism in sustaining unequal society.

Politics forms a central strand of Graeber’s work, with four books published so far and more in the works: Fragments of an Anarchist Anthropology (2004), Possibilities: Essays on hierarchy, rebellion, and desire (2007b), Direct Action: An ethnography (2009a) and Revolutions in Reverse: Essays on politics, violence, art, and imagination (2011c). These titles reveal a range of political interests that take in violence, aesthetics and libido. He insists on the “elective affinity” between anthropological theory and method and an anarchist programme of resistance, rebellion and revolution; and this emphasis on “society against the state” makes him a worthy successor to Pierre Clastres (1974). Graeber’s academic career has been fitful, most notoriously when he was “let go” by Yale despite his obvious talent and productivity. This fed rumours about the academic consequences of his political activities. These have led to numerous brushes with the police, but so far not to prolonged incarceration, although his inability to find a job in American universities could be seen as a form of exile.

Debt: The first 5,000 years was published in summer 2011 and Graeber began a year’s sabbatical leave from his teaching job in London by moving to New York, where he became a ubiquitous presence in the print media, television and blogs. In August-September he helped form the first New York City General Assembly which spawned the Occupy Wall Street movement. He has been credited with being the author of that movement’s slogan, “We are the 99%”, and helped to give it an anarchist political style. OWS generated a wave of imitations in the United States and around the world, known collectively as “the Occupy movement”, inviting comparison with the “Arab Spring” and Madrid’s Los Indignados in what seemed then to be a global uprising. Some shared features of this series of political events, such as an emphasis on non-violence, consensual decision-making and the avoidance of sectarian division, evoke Jean-Jacques Rousseau’s idea of the “general will”; and it is not wholly fanciful to compare David Graeber’s career so far with his great predecessor’s.

Graeber and Rousseau both detested the mainstream institutions of the world they lived in and devoted their intellectual efforts to building revolutionary alternatives. This means not being satisfied with reporting how the world is, but rather exploring the dialectic linking the actual to the possible. This in turn implies being willing to mix established genres of research and writing and to develop new ones. Both are prolific writers with an accessible prose style aimed at reaching a mass audience. Both achieved unusual fame for an intellectual and their political practice got them into trouble. Both suffered intimidation, neglect and exile for their beliefs. Both attract admiration and loathing in equal measure. Their originality is incontestable, yet each can at times be silly. There is no point in considering their relative significance. The personal parallels that I point to here reinforce my claim that Graeber’s Debt book should be seen as a specific continuation of that “anthropology of unequal society” begun by Rousseau two and a half centuries ago.

Debt: the argument

Much of the contemporary world revolves round the claims we make on each other and on things: ownership, obligations, contracts and payment of taxes, wages, rents, fees etc. David Graeber’s book, Debt: The first 5,000 years, aims to illuminate these questions through a focus on debt seen in very wide historical perspective. It is of course a central issue in global politics today, at every level of society. Every day sees another example of a class struggle between debtors and creditors to shape the distribution of costs after a long credit boom went dramatically bust.

We might be indebted to God, the sovereign or our parents for the gift of life, but Graeber rightly insists that the social logic of debt is revealed most clearly when money is involved. He cites approvingly an early twentieth-century writer who insisted that “money is debt”. This book of over 500 pages is rich in argument and knowledge. The notes and references are compendious, ranging over five millennia of the main Eurasian civilizations (ancient Mesopotamia, Egypt and the Mediterranean, medieval Europe, China, India and Islam) and the ethnography of stateless societies in Africa, the Americas and the Pacific. Its twelve chapters are framed by an introduction to our moral confusion concerning debt and a concluding sketch of the present rupture in world history that began in the early 1970s. Graeber’s case is founded on anthropological and historical comparison more than his grasp of contemporary political economy, although he has plenty to say in passing about that. There is also a current of populist culture running through the book and this is reinforced by a prose style aimed at closing the gap between author and reader that his formidable scholarship might otherwise open up.

Perhaps this aspect of the book may be illustrated by introducing a recent short film. Paul Grignon’s Money as Debt (2006, 47 minutes) — an underground hit in activist circles — seeks to explain where money comes from. Most of the money in circulation is issued by banks whenever they make a loan. The real basis of money, the film claims, is thus our signature whenever we promise to repay a debt. The banks create that money by a stroke of the pen and the promise is then bought and sold in increasingly complex ways. The total debt incurred by government, corporations, small businesses and consumers spirals continuously upwards since interest must be paid on it all. Although the general idea is an old one, it has taken on added salience at a time when the supply of money, which could once plausibly be represented as public currency in circulation, has been overtaken by the creation of private debt.

The film’s attempt to demystify money is admirable, but its message is misleading.  Debt and credit are two sides of the same coin, the one evoking passivity in the face of power, the other individual empowerment. The origin of money in France and Germany is considered to be debt, whereas in the United States and Britain it is traditionally conceived of as credit. Either term alone is loaded, missing the dialectical character of the relations involved. Money as Debt demonizes the banks and interest in particular, letting the audience off the hook by not showing the active role most of us play in sustaining the system. Money today is issued by a dispersed global network of economic institutions of many kinds; and the norm of economic growth is fed by a widespread desire for self-improvement, not just by bank interest.

David Graeber offers a lot more than this, of course; but his book also feeds off popular currents, which is not surprising given how much time he spends outside the classroom and his study. His analytical framework is spelled out in great detail over six chapters. The first two tackle the origins of money in barter and “primordial debt” respectively. He shows, forcefully and elegantly, how implausible the standard liberal origin myth of money as a medium of exchange is; but he also rejects as a nationalist myth the main opposing theory that traces money’s origins as a means of payment and unit of account to state power. In the first case he follows Polanyi (1944), but by distancing himself from the second, he highlights the interdependence of states and markets in money’s origins. A short chapter shows that money was always both a commodity and a debt-token (“the two sides of the coin”, Hart 1986), giving rise to a lot of political and moral contestation, especially in the ancient world. Following Nietzsche, Graeber argues that money introduced for the first time a measure of the unequal relations between buyer and seller, creditor and debtor. Whereas Rousseau traced inequality to the invention of property, he locates the roots of human bondage, slavery, tribute and organized violence in debt relations. The contradictions of indebtedness, fed by money and markets, led the first world religions to articulate notions of freedom and redemption in response to escalating class conflict between creditors and debtors, often involving calls for debt cancellation.

The author now lays out his positive story to counter the one advanced by mainstream liberal economics. “A brief treatise on the moral grounds of economic relations” makes explicit his critique of the attempt to construct “the economy” as a sphere separate from society in general. This owes something to Polanyi’s (1957) universal triad of distributive mechanisms – reciprocity, redistribution and market – here identified as “everyday communism”, hierarchy and reciprocity. By the first Graeber means a human capacity for sharing or “baseline sociality”; the second is sometimes confused with the third, since unequal relations are often represented as an exchange – you give me your crops in return for not being beaten up. The difference between hierarchy and reciprocity is that debt is permanent in the first case, but temporary in the second. The western middle classes train their children to say please and thank you as a way of limiting the debt incurred by being given something. All three principles are present everywhere, but their relative emphasis is coloured by dominant economic forms. Thus “communism” is indispensable to modern work practices, but capitalism is a lousy way of harnessing our human capacity for cooperation.

The next two chapters introduce what is for me the main idea of the book, the contrast between “human economies” and those dominated by money and markets (Graeber prefers to call them “commercial economies” and sometimes “capitalism”). First he identifies the independent characteristics of human economies and then shows what happens when they are forcefully incorporated into the economic orbit of larger “civilisations”, including our own. This is to some extent a great divide theory of history, although, as Mauss would insist, elements of human economy persist in capitalist societies. There is a sense in which “human economies” are a world we have lost, but might recover after the revolution. Graeber is at pains to point out that these societies are not necessarily more humane, just that “they are economic systems primarily concerned not with the accumulation of wealth, but with the creation, destruction, and rearranging of human beings” (2011a: 130). They use money, but mainly as “social currencies” whose aim is to maintain relations between people rather than to purchase things.

“In a human economy, each person is unique and of incomparable value, because each is a unique nexus of relations with others” (Ibid: 158). Yet their money forms make it possible to treat people as quantitatively identical in exchange and that requires a measure of violence. Brutality — not just conceptual, but physical too — is omnipresent, more in some cases than others. Violence is inseparable from money and debt, even in the most “human” of economies, where ripping people out of their familiar context is commonplace. This, however, gets taken to another level when they are drawn into systems like the Atlantic slave trade or the western colonial empires of yesteryear. The following extended reflection on slavery and freedom — a pair that Graeber sees as being driven by a culture of honour and indebtedness — culminates in the ultimate contradiction underpinning modern liberal economics, a worldview that conceives of individuals as being socially isolated in a way that could only be prepared for by a long history of enslaving conquered peoples. Since we cannot easily embrace this account of our own history, it is not surprising that we confuse morality and power when thinking about debt.

So far, Graeber has relied heavily on anthropological material, especially from African societies, to illustrate the world that the West transformed, although his account of money’s origins draws quite heavily on the example of ancient Mesopotamia. Now he formalizes his theory of money to organize a compendious review of world history in four stages. These are: the era from c.3000 BC that saw the first urban civilizations; the “Axial Age” which he, rather unusually, dates from 800 BC to 600 AD; the Middle Ages (600-1450 AD); and the age of “the great capitalist empires”, from 1450 AD to the US dollar’s symbolic rupture with the gold standard in 1971. As this last date suggests, the periodization relies heavily on historical oscillations between broad types of money. Graeber calls these “credit” and “bullion”, that is, money as a virtual measure of personal relations, like IOUs, and as currency or impersonal things made from precious metals for circulation.

Money started out as a unit of account, administered by institutions such as temples and banks, as well as states, largely as a way of measuring debt relations between people. Coinage was introduced in the first millennium BC as part of a complex linking warfare, mercenary soldiers, slavery, looting, mines, trade and the provisioning of armies on the move. Graeber calls this “the military-coinage-slavery complex” of which Alexander the Great, for example, was a master. Hence our word “soldier”, which refers to his pay. The so-called “dark ages” offered some relief from this regime and for most of the medieval period, metal currencies were in very short supply and money once again took the dominant form of virtual credit. India, China and the Islamic world are enlisted here to supplement what we know of Europe. But then the discovery of the new world opened up the phase we are familiar with from the last half-millennium, when western imperialism revived the earlier tradition of warfare and slavery lubricated by bullion.

The last four decades are obviously transitional, but the recent rise of virtual credit money suggests the possibility of another long swing of history away from the principles that underpinned the world the West made. It could be a multi-polar world, more like the middle ages than the last two centuries. It could offer more scope for “human economies” or at least “social currencies”. The debt crisis might provoke revolutions and then, who knows, debt cancellation along the lines of the ancient jubilee. Perhaps the whole institutional complex based on states, money and markets or capitalism will be replaced by forms of society more directly responsive to ordinary people and their capacity for “everyday communism”.

All of this is touched on in the final chapter. But Graeber leaves these “policy conclusions” deliberately vague. His aim in this book has been to draw his readers into a vision of human history that runs counter to what makes their social predicament supposedly inevitable. It is a vision inspired in part by his profession as an anthropologist, in part by his political engagement as an activist. Both commitments eschew drawing up programmes for others to follow. Occupy Wall Street has been criticized for its failure to enumerate a list of “demands”. No doubt much the same could be said of this book; but then readers, including this reviewer, will be inspired by it in concrete ways to imagine possibilities that its author could not have envisaged.

Towards a human economy

David Graeber and I came up with the term “human economy” independently during the last decade (Graeber 2009b, 2011a; Hart 2008, Hart, Laville and Cattani 2010). As editors of The Human Economy: A citizen’s guide, we distanced ourselves, in the introduction and our editorial approach, from any “revolutionary” eschatology that suggested society had reached the end of something and would soon be launched on a quite new trajectory. The idea of a “human economy” drew attention to the fact that people do a lot more for themselves than an exclusive focus on the dominant economic institutions would suggest. Against a singular notion of the economy as “capitalism”, we argued that all societies combine a plurality of economic forms and several of these are distributed across history, even if their combination is strongly coloured by the dominant economic form in particular times and places.

For example, in his famous essay on The Gift (1925), Marcel Mauss showed that other economic principles were present in capitalist societies and that understanding this would provide a sounder basis for building non-capitalist alternatives than the Bolshevik revolution’s attempt to break with markets and money entirely. Karl Polanyi too, in his various writings, insisted that the human economy throughout history combined a number of mechanisms of which the market was only one. We argued therefore that the idea of radical transformation of an economy conceived of monolithically as capitalism into its opposite was an inappropriate way to approach economic change. We should rather pay attention to the full range of what people are doing already and build economic initiatives around giving these a new direction and emphasis, instead of supposing that economic change has to be reinvented from scratch. Although this looks like a gradualist approach to economic improvement, its widespread adoption would have revolutionary consequences.

David Graeber’s anarchist politics inform his economic analysis; and he has always taken an anti-statist and anti-capitalist position, with markets and money usually being subsumed under the concept of capitalism. That is, he sees the future as being based on the opposite of our capitalist states. The core of his politics is “direct action” which he has practised and written about as an ethnographer (Graeber 2009a). In The Human Economy, we argued that people everywhere rely on a wide range of organizations in their economic lives: markets, nation-states, corporations, cities, voluntary associations, families, virtual networks, informal economies, crime. We should be looking for a more progressive mix of these things. We can’t afford to turn our backs on institutions that have helped humanity make the transition to modern world society. Large-scale bureaucracies co-exist with varieties of popular self-organization and we have to make them work together rather than at cross-purposes, as they often do now.

Graeber also believes, as we have seen, that economic life everywhere is based on a plural combination of moral principles which take on a different complexion when organized by dominant forms. Thus, helping each other as equals is essential to capitalist societies, but capitalism distorts and marginalizes this human propensity. Yet he appears to expect a radical rupture with capitalist states fairly soon and this is reflected in a stages theory of history, with categories to match. At first sight, these positions (let’s call them “reform” and “revolution”) are incompatible, but recent political developments (the “Arab Spring” and Occupy movements of 2011, however indeterminate their immediate outcomes) point to the need to transcend such an opposition.

The gap between our approaches to making the economy human is therefore narrowing. Even so, there are differences of theory and method that point to some residual reservations I have about the Debt book. The first of these concerns Graeber’s preference for lumping together states, money, markets, debt and capitalism, along with violence, war and slavery as their habitual bedfellows. Money and markets have redemptive qualities that in my view (Hart 2000) could be put to progressive economic ends in non-capitalist forms; nor do I imagine that modern institutions such as states, corporations and bureaucracy will soon die away. Anti-capitalism as a revolutionary strategy begs the question of the plurality of modern economic institutions. As Mauss showed (Hart 2007), human economies exist in the cracks of capitalist societies. David Graeber seems to agree, at least when it comes to finding “everyday communism” there and, by refusing to sanitize “human economies” in their pristine form, he modifies the categorical and historical division separating them and commercial economies. Revolutionary binaries seem to surface at various points in his book, but an underlying tendency to discern continuity in human economic practices is just as much a feature of David Graeber’s anthropological vision.

An argument of Debt’s scope hasn’t been made by a professional anthropologist for the best part of a century, certainly not one with as much contemporary relevance. The discipline largely abandoned “conjectural history” in the twentieth century in order to embrace the narrower local perspectives afforded by ethnographic fieldwork. Works of broad comparison such as Wolf’s and Goody’s were the exception to this trend. Inevitably Graeber’s methods will come under scrutiny, not just from fellow professionals, but from the general public too. (He tells me that academics don’t read footnotes any more, but laymen do). To this reader, the first half of the book, which relies heavily on ethnographic sources to spell out the argument, is more systematic, in terms of both analytical coherence and documentation, than the second, concerned as it is with fleshing out his cycles of history. In either case, little attempt is made to analyse contemporary political economy, although Graeber makes more explicit reference to this than, for example, does Mauss in The Gift, where readers’ understanding of capitalist markets is taken for granted. Nowhere in the book is any reference made to the digital revolution in communications of our times and its scope to transform economies, whether human or commercial (Hart 2000, 2005).

Well, that is not quite true, for the author does occasionally introduce anecdotes based on common or his personal knowledge. The problem is that many readers who take on trust what he has to say about ancient Mesopotamia or the Tiv, may find these stories contradicted by their own knowledge. It is something akin to “Time magazine syndrome”: we accept what Time has to say about the world in general until it impinges on what we know ourselves and then its credibility dissolves. Thus:

Apple Computers is a famous example: it was founded by (mostly Republican) computer engineers who broke from IBM in Silicon Valley in the 1980s, forming little democratic circles of twenty to forty people with their laptops in each other’s garages (Graeber 2011a: 96).

The veracity of this anecdote has been challenged by numerous Californian bloggers and the author’s scholarship with it. Graeber is aware of the pitfalls of making contemporary allusions. In the final chapter (Ibid: 362-3), he cleverly introduces an urban myth he often heard about the gold stored under the World Trade Centre and then (almost) rehabilitates that myth using documented sources. Fortunately, David Graeber has not been deterred by the pedants from crossing the line between academic and general knowledge in this book and his readers benefit immensely as a result. I contributed to the publisher’s blurb for this book and said that he is “the finest anthropological scholar I know”. I stand by that. The very long essay he recently published on the divine kingship of the Shilluk (Graeber 2011b) covers the same ground as a number of famous anthropologists from Frazer onwards, but with an unsurpassed range of scholarship, as well as a democratic political perspective. Inevitably in a book like this one, the fact police will catch him out sometimes. But it is a work of immense erudition and deserves to be celebrated as such.

Our world is still massively unequal and we may be entering a period of war and revolution comparable to the “Second Thirty Years War” of 1914-1945 which came after the last time that several decades of financial imperialism went bust. Capitalism itself sometimes seems today to have reverted to a norm of rent-seeking that resembles the arbitrary inequality of the Old Regime more than Victorian industry. The pursuit of economic democracy is more elusive than ever; yet humanity has also devised universal means of communication at last adequate to the expression of universal ideas. Jean-Jacques Rousseau would have leapt at the chance to make use of this opportunity and several illustrious successors did so in their own way during the last two centuries. We need an anthropology that rises to the challenge posed by our common human predicament today. No-one has done more to meet that challenge than David Graeber, in his work as a whole, but especially in this book.

References

Beteille, Andre   1977   Inequality among Men. Blackwell: Oxford.

Childe, V. Gordon   1954   What Happened in History. Penguin: Harmondsworth.

Clastres, Pierre    1989 (1974)    Society against the state: Essays in political anthropology. Zone Books: New York.

Engels, Friedrich   1972 (1884)   The Origin of the Family, Private Property, and the State. Pathfinder: New York.

Goody, Jack   1976   Production and Reproduction: A Comparative Study of the Domestic Domain. Cambridge University Press: Cambridge.

Graeber, David   2001   Toward an Anthropological Theory of Value: The false coin of our own dreams. Palgrave: New York.

——    2004    Fragments of an Anarchist Anthropology. Prickly Paradigm: Chicago.

——    2007a   Lost People: Magic and the legacy of slavery in Madagascar. Indiana University Press: Bloomington IN.

——   2007b   Possibilities: Essays on hierarchy, rebellion, and desire . AK Press: Oakland CA.

——    2009a   Direct Action: An ethnography. AK Press: Baltimore MD.

——    2009b   Debt, Violence, and Impersonal Markets: Polanyian Meditations. In Chris Hann and K. Hart editors Market and Society: The Great Transformation today. Cambridge University Press: Cambridge, 106-132.

——   2011a    Debt: The first 5,000 years. Melville House: New York.

——   2011b   The divine kingship of the Shilluk: On violence, utopia, and the human condition or elements for an archaeology of sovereignty, Hau: Journal of Ethnographic Theory 1.1: 1-62.

——   2011c   Revolutions in Reverse: Essays on politics, violence, art, and imagination. Autonomedia: New York.

Hann, Chris and K. Hart   2011   Economic Anthropology: History, ethnography, critique. Polity: Cambridge.

Hart, Keith   1986   Heads or tails? Two sides of the coin. Man 21 (3): 637–56.

——   2000   The Memory Bank: Money in an unequal world. Profile: London; republished in 2001 as Money in an Unequal World. Texere: New York.

—— 2005 The Hit Man’s Dilemma: Or business personal and impersonal. Prickly Paradigm: Chicago.

——   2006   Agrarian civilization and world society. In D. Olson and M. Cole (eds.), Technology, Literacy and the Evolution of Society: Implications of the work of Jack Goody. Lawrence Erlbaum: Mahwah, NJ, 29–48.

——   2007   Marcel Mauss: in pursuit of the whole – a review essay. Comparative Studies in Society and History 49 (2): 473–85.

——   2008   The human economy. ASAonline 1. http://www.theasa.org/publications/asaonline/articles/asaonline_0101.htm

——   2011   Jack Goody’s vision of world history and African development today (Jack Goody Lecture 2011). Halle/Saale: Max Planck Institute for Social Anthropology, Department II.

Hart, Keith, J-L. Laville and A. Cattani editors   2010   The Human Economy: A citizen’s guide. Polity: Cambridge.

Kant, Immanuel   2006 (1798)   Anthropology from a Pragmatic Point of View. Cambridge University Press: Cambridge.

Lévi-Strauss, Claude   1969 (1949)   The Elementary Structures of Kinship. Beacon: Boston.

——    1973 (1955) Tristes Tropiques. Cape: London.

Locke, John   1960 (1690)   Two Treatises of Government. Cambridge University Press: Cambridge.

Marx, Karl   1970 (1867)   Capital Volume 1. Lawrence and Wishart: London.

Mauss, Marcel   1990 (1925)  The Gift: The form and reason for exchange in archaic societies. Routledge: London.

Morgan, Lewis H.   1964 (1877)   Ancient Society. Belknap: Cambridge MA.

Pleyers, Geoffrey   2010   Alter-globalization: Becoming actors in a global age. Polity: Cambridge.

Polanyi, Karl   2001 (1944)   The Great Transformation: The political and economic origins of our times. Beacon: Boston.

——   1957   The economy as instituted process. In K. Polanyi, C. Arensberg and H. Pearson editors Trade and Market in the early Empires. Free Press: Glencoe IL, 243-269.

Rousseau, Jean-Jacques   1984 (1754)   Discourse on Inequality. Penguin: Harmondsworth.

Dummies guide to the latest “Hockey Stick” controversy (Real Climate)

http://www.realclimate.org

 — gavin @ 18 February 2005

by Gavin Schmidt and Caspar Ammann

Due to popular demand, we have put together a ‘dummies guide’ which tries to describe what the actual issues are in the latest controversy, in language even our parents might understand. A pdf version is also available. More technical descriptions of the issues can be seen here and here.

This guide is in two parts, the first deals with the background to the technical issues raised by McIntyre and McKitrick (2005) (MM05), while the second part discusses the application of this to the original Mann, Bradley and Hughes (1998) (MBH98) reconstruction. The wider climate science context is discussed here, and the relationship to other recent reconstructions (the ‘Hockey Team’) can be seen here.

NB. All the data that were used in MBH98 are freely available for download at ftp://holocene.evsc.virginia.edu/pub/sdr/temp/nature/MANNETAL98/ (and also as supplementary data at Nature) along with a thorough description of the algorithm.
Part I: Technical issues:

1) What is principal component analysis (PCA)?

This is a mathematical technique that is used (among other things) to summarize the data found in a large number of noisy records so that the essential aspects can more easily be seen. The most common patterns in the data are captured in a number of ‘principal components’ which describe some percentage of the variation in the original records. Usually only a limited number of components (‘PCs’) have any statistical significance, and these can be used instead of the larger data set to give basically the same description.
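As a minimal sketch of the idea (entirely synthetic data standing in for real proxy records, not the actual MBH98 inputs), PCA can be computed with a singular value decomposition:

```python
import numpy as np

# Synthetic stand-in for a proxy network: 70 noisy records, 500 time steps,
# all sharing one common signal plus independent noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 3 * t)                       # a shared pattern
records = signal * rng.uniform(0.5, 1.5, (70, 1)) + rng.normal(0, 1, (70, 500))

# Center each record, then compute PCs via singular value decomposition;
# the rows of Vt are the PC time series, ordered by explained variance.
X = records - records.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("Fraction of variance explained by the first 5 PCs:", explained[:5].round(3))
```

Here the shared signal should dominate PC1, with the remaining PCs mopping up noise — exactly the ‘summary’ role described above.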

2) What do these individual components represent?

Often the first few components represent something recognisable and physically meaningful (at least in climate data applications). If a large part of the data set has a trend, then the mean trend may show up as one of the most important PCs. Similarly, if there is a seasonal cycle in the data, that will generally be represented by a PC. However, remember that PCs are just mathematical constructs. By themselves they say nothing about the physics of the situation. Thus, in many circumstances, physically meaningful timeseries are ‘distributed’ over a number of PCs, each of which individually does not appear to mean much. Different methodologies or conventions can make a big difference in which pattern comes out on top. If the aim of the PCA analysis is to determine the most important pattern, then it is important to know how robust that pattern is to the methodology. However, if the idea is simply to summarize the larger data set, the individual ordering of the PCs is less important, and it is more crucial to make sure that as many significant PCs as possible are included.

3) How do you know whether a PC has significant information?

[Figure: PC significance]

This determination is usually based on a ‘Monte Carlo’ simulation (so-called because of the random nature of the calculations). For instance, if you take 1000 sets of random data (that have the same statistical properties as the data set in question), and you perform the PCA analysis 1000 times, there will be 1000 examples of the first PC. Each of these will explain a different amount of the variation (or variance) in the original data. When ranked in order of explained variance, the tenth one down then defines the 99% confidence level: i.e. if your real PC explains more of the variance than 99% of the random PCs, then you can say that this is significant at the 99% level. This can be done for each PC in turn. (This technique was introduced by Preisendorfer et al. (1981), and is called the Preisendorfer N-rule).

The figure to the right gives two examples of this. Here each PC is plotted against the amount of fractional variance it explains. The blue line is the result from the random data, while the blue dots are the PC results for the real data. It is clear that at least the first two are significantly separated from the random noise line. In the other case, there are 5 (maybe 6) red crosses that appear to be distinguishable from the red line random noise. Note also that the first (‘most important’) PC does not always explain the same amount of the original data.
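A schematic version of this Monte Carlo test follows (again with synthetic data; a proper application matches the statistical properties of the real records, whereas plain white noise is used here for brevity):

```python
import numpy as np

rng = np.random.default_rng(1)
n_rec, n_time, n_trials = 70, 500, 1000

# 'Real' data: a shared signal plus noise, standing in for actual proxies
t = np.linspace(0, 1, n_time)
data = np.sin(2 * np.pi * 3 * t) * rng.uniform(0.5, 1.5, (n_rec, 1)) \
       + rng.normal(0, 1, (n_rec, n_time))

def explained_variance(x):
    x = x - x.mean(axis=1, keepdims=True)
    s = np.linalg.svd(x, compute_uv=False)
    return s**2 / np.sum(s**2)

real_ev1 = explained_variance(data)[0]

# Monte Carlo: leading-PC explained variance for pure-noise data of the same shape
null_ev1 = np.array([explained_variance(rng.normal(0, 1, (n_rec, n_time)))[0]
                     for _ in range(n_trials)])

# The 10th-largest of 1000 null values marks the 99% confidence level
threshold = np.quantile(null_ev1, 0.99)
print(f"PC1 explains {real_ev1:.3f}; 99% noise level is {threshold:.3f}")
print("PC1 significant at the 99% level:", real_ev1 > threshold)
```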

4) What do different conventions for PC analysis represent?

Some different conventions exist regarding how the original data should be normalized. For instance, the data can be normalized to have an average of zero over the whole record, or over a selected sub-interval. The variance of the data is associated with departures from whatever mean was selected. So the pattern of data that shows the biggest departure from the mean will dominate the calculated PCs. If there is an a priori reason to be interested in departures from a particular mean, then this is a way to make sure that those patterns move up in the PC ordering. Changing conventions means that the explained variance of each PC can be different, the ordering can be different, and the number of significant PCs can be different.
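A small experiment shows the mechanics (the data, intervals and magnitudes below are arbitrary, chosen only to mimic a flat record with a late upturn):

```python
import numpy as np

rng = np.random.default_rng(2)
n_rec, n_time = 50, 600

# Flat, noisy records with a shared rise in the last 100 steps
data = rng.normal(0, 1, (n_rec, n_time))
data[:, -100:] += np.linspace(0, 3, 100)

def pc_variances(x, mean_slice):
    # Normalize by removing the mean over a chosen interval (the 'convention')
    xc = x - x[:, mean_slice].mean(axis=1, keepdims=True)
    s = np.linalg.svd(xc, compute_uv=False)
    return s**2 / np.sum(s**2)

# Convention A: zero mean over the whole record
# Convention B: zero mean over the late sub-interval only
print("whole-record centering: ", pc_variances(data, slice(None))[:4].round(3))
print("late-interval centering:", pc_variances(data, slice(-100, None))[:4].round(3))
```

The explained-variance spectra differ between the two conventions, which is exactly why the ordering and the number of significant PCs can change.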

5) How can you tell whether you have included enough PCs?

This is rather easy to tell. If your answer depends on the number of PCs included, then you haven’t included enough. Put another way, if the answer you get is the same as if you had used all the data without doing any PC analysis at all, then you are probably ok. However, the reason PC summaries are used in the first place in paleo-reconstructions is that using the full proxy set often runs the danger of ‘overfitting’ during the calibration period (the time period when the proxy data are trained to match the instrumental record). This can lead to a decrease in predictive skill outside of that window, which is the actual target of the reconstruction. So in summary, PC selection is a trade-off: on one hand, the goal is to capture as much of the variability in the data as possible through the different PCs (particularly if the explained variance of each individual PC is small), while on the other hand, you don’t want to include PCs that no longer contribute any significant information.
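A crude way to apply this rule of thumb, continuing the synthetic examples above: if adding further PCs no longer changes the summary, enough have probably been retained.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 500)
data = np.sin(2 * np.pi * 3 * t) * rng.uniform(0.5, 1.5, (70, 1)) \
       + rng.normal(0, 1, (70, 500))
X = data - data.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Compare the network mean rebuilt from k PCs with the all-data mean
full_mean = X.mean(axis=0)
for k in (1, 2, 5, 10):
    Xk = (U[:, :k] * s[:k]) @ Vt[:k]     # rank-k approximation of the network
    r = np.corrcoef(Xk.mean(axis=0), full_mean)[0, 1]
    print(f"{k:2d} PCs: correlation with the all-data mean = {r:.3f}")
```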

Part II: Application to the MBH98 ‘Hockey Stick’

1) Where is PCA used in the MBH methodology?

When incorporating many tree ring networks into the multi-proxy framework, it is easier to use a few leading PCs rather than 70 or so individual tree ring chronologies from a particular region. The trees are often very closely located and so it makes sense to summarize the general information they all contain in relation to the large-scale patterns of variability. The relevant signal for the climate reconstruction is the signal that the trees have in common, not each individual series. In MBH98, the North American tree ring series were treated like this. There are a number of other places in the overall methodology where some form of PCA was used, but they are not relevant to this particular controversy.

2) What is the point of contention in MM05?

MM05 contend that the particular PC convention used in MBH98 in dealing with the N. American tree rings selects for the ‘hockey stick’ shape and that the final reconstruction result is simply an artifact of this convention.

3) What convention was used in MBH98?

MBH98 were particularly interested in whether the tree ring data showed significant differences from the 20th century calibration period, and therefore normalized the data so that the mean over this period was zero. As discussed above, this will emphasize records that have the biggest differences from that period (either positive or negative). Since the underlying data have a ‘hockey stick’-like shape, it is therefore not surprising that the most important PC found using this convention resembles the ‘hockey stick’. There are actually two significant PCs found using this convention, and both were incorporated into the full reconstruction.

[Figure: PC1 (MBH98 convention) vs PC4 (MM05 convention)]

4) Does using a different convention change the answer?

As discussed above, a different convention (MM05 suggest one that has zero mean over the whole record) will change the ordering, significance and number of important PCs. In this case, the number of significant PCs increases to 5 (maybe 6) from the original 2. This is the difference between the blue points (MBH98 convention) and the red crosses (MM05 convention) in the first figure. Also, PC1 in the MBH98 convention moves down to PC4 in the MM05 convention. This is illustrated in the figure on the right: the red curve is the original PC1 and the blue curve is the MM05 PC4 (adjusted to have the same variance and mean). But as we stated above, the underlying data have a hockey stick structure, and so in either case the ‘hockey stick’-like PC explains a significant part of the variance. Therefore, using the MM05 convention, more PCs need to be included to capture the significant information contained in the tree ring network.

This figure shows the difference in the final result when you use the original convention with 2 PCs (blue) versus the MM05 convention with 5 PCs (red). The MM05-based reconstruction is slightly less skillful when judged over the 19th century validation period, but is otherwise very similar. In fact, any normalization convention will lead to approximately the same answer, as long as the PC decomposition is done properly and one determines how many PCs are needed to retain the primary information in the original data.

[Figure: final reconstructions under the two different conventions]
5) What happens if you just use all the data and skip the whole PCA step?

This is a key point. If the PCs being used were inadequate in characterizing the underlying data, then the answer you get using all of the data will be significantly different. If, on the other hand, enough PCs were used, the answer should be essentially unchanged. This is shown in the figure below. The reconstruction using all the data is in yellow (the green line is the same thing but with the ‘St-Anne River’ tree ring chronology taken out). The blue line is the original reconstruction, and as you can see the correspondence between them is high. The validation is slightly worse, illustrating the trade-off mentioned above: when using all of the data, over-fitting during the calibration period (due to the increased number of degrees of freedom) leads to a slight loss of predictability in the validation step.

[Figure: comparison of reconstructions with and without the PCA step]

6) So how do MM05 conclude that this small detail changes the answer?

MM05 claim that the reconstruction using only the first 2 PCs with their convention is significantly different to MBH98. Since PC 3, 4 and 5 (at least) are also significant, they are leaving out good data. It is mathematically wrong to retain the same number of PCs if the convention of standardization is changed. In this case, it causes a loss of information that is very easily demonstrated: first, by showing that any such results do not resemble the results from using all the data, and second, by checking the validation of the reconstruction over the 19th century. The MM version of the reconstruction can be matched by simply removing the N. American tree ring data along with the ‘St Anne River’ Northern treeline series from the reconstruction (shown in yellow below). Compare this curve with the ones shown above.

[Figure: reconstruction without the N. American tree ring data]

As you might expect, throwing out data also worsens the validation statistics, as can be seen by eye when comparing the reconstructions over the 19th century validation interval. Compare the green line in the figure below to the instrumental data in red. To their credit, MM05 acknowledge that their alternate 15th century reconstruction has no skill.

[Figure: reconstructions over the 19th century validation period]

7) So, basically, the MM05 criticism is simply about whether selected N. American tree rings should have been included, not about a mathematical flaw?

Yes. Their argument has, from the beginning, essentially not been about methodological issues at all, but about ‘source data’ issues. Particular concerns with the “bristlecone pine” data were addressed in the follow-up paper MBH99, but the fact remains that including these data improves the statistical validation over the 19th century period, and they therefore should be included.

[Figure: ‘Hockey Team’ reconstructions, used under GFDL license]

8) So does this all matter?

No. If you use the MM05 convention and include all the significant PCs, you get the same answer. If you don’t use any PCA at all, you get the same answer. If you use a completely different methodology (e.g. Rutherford et al, 2005), you get basically the same answer. Only if you remove significant portions of the data do you get a different (and worse) answer.

9) Was MBH98 the final word on the climate of the last millennium?

Not at all. There has been significant progress on many aspects of climate reconstructions since MBH98. First, there are more and better-quality proxy data available. There are also new methodologies, such as those described in Rutherford et al (2005) or Moberg et al (2005), that address recognised problems with incomplete data series and the challenge of incorporating lower-resolution data into the mix. Progress is likely to continue on all these fronts. As of now, all of the ‘Hockey Team’ reconstructions (shown left) agree that the late 20th century is anomalous in the context of the last millennium, and possibly the last two millennia.

The climate of the climate change debate is changing (The Guardian)

Quantifying how greenhouse gases contribute to extreme weather is a crucial step in calculating the cost of human influence

Myles Allen

guardian.co.uk, Wednesday 11 July 2012 12.08 BST

Climate change could trap hundreds of millions in disaster areas, report claims

This week, climate change researchers were able to attribute recent examples of extreme weather to the effects of human activity on the planet’s climate systems for the first time. Photograph: Rizwan Tabassum/AFP/Getty Images

The climate may have changed this week. Not the physical climate, but the climate of the climate change debate. Tuesday marked the publication of a series of papers examining the factors behind extreme weather events in 2011. Nothing remarkable about that, you might think, except that, if all goes well, this will be the first in a regular, annual series of assessments quantifying how external drivers of climate contribute to damaging weather.

Some of these drivers, like volcanoes, are things we can do nothing about. But others, like rising levels of greenhouse gases, we can. And quantifying how greenhouse gases contribute to extreme weather is a crucial step in pinning down the real cost of human influence on climate. While most people think of climate change in terms of shrinking ice-sheets and slowly rising sea levels, it is weather events that actually do harm.

This week also saw a workshop in Oxford for climate change negotiators from developing countries. Again, nothing remarkable about that, except that, for the first time, the issue of “loss and damage” was at the top of the agenda. For years, negotiations have been over emission reductions and sharing the costs of adaptation. Now the debate is turning to: who is going to pay for the damage done?

It is a good time to ask, since the costs that can unambiguously be attributed to human-induced climate change are still relatively small. Although Munich Re estimates that weather events in 2011 cost more than $100bn and claimed many thousands of lives, only a few of these events were clearly made more likely by human influence. Others may have been made less likely, but occurred anyway – chance remains the single dominant factor in when and where a weather event occurs. For the vast majority of events, we simply don’t yet know either way.

Connecting climate change and specific weather events is only one link in the causal chain between greenhouse gas emissions and actual harm. But it is a crucial link. If, as planned, the assessment of 2011 becomes routine, we should be able to compare actual weather-related damage, in both good years and bad, with the damage that might have been in a world without human influence on climate. This puts us well on our way to a global inventory of climate change impacts. And as soon as that is available, the question of compensation will not be far behind.

The presumption in climate change negotiations is that “countries with historically high emissions” would be first in line to foot the bill for loss and damage. There may be some logic to this, but if you are an African (or Texan) farmer hit by greenhouse-exacerbated drought, is the European or American taxpayer necessarily the right place to look for compensation? As any good lawyer knows, there is no point in suing a man with empty pockets.

The only institution in the world that could deal with the cost of climate change without missing a beat is the fossil fuel industry: BP took a $30bn charge for Deepwater Horizon, very possibly more than the total cost of climate change damages last year, and was back in profit within months. Of the $5 trillion per year we currently spend on fossil energy, a small fraction would take care of all the loss and damage attributable to climate change for the foreseeable future several times over.

Such a pay-as-you-go liability regime would not address the impacts of today’s emissions on the 22nd century. Governments cannot wash their hands of this issue entirely. But we have been so preoccupied with the climate of the 22nd century that we have curiously neglected to look after the interests of those being affected by climate change today.

So rather than haggling over emission caps and carbon taxes, why not start with a simple statement of principle: standard product liability applies to anyone who sells or uses fossil fuels, including liability for any third-party side-effects. There is no need at present to say what these side-effects might be – indeed, the scientific community does not yet know. But we are getting there.

A new boson in sight (FAPESP)

Physicists at CERN have discovered a new particle that appears to be the Higgs boson

MARCOS PIVETTA | Online edition, 7:46 p.m., July 4, 2012

Proton collisions in which four high-energy electrons are observed (green lines and red towers). The event shows characteristics expected from the decay of a Higgs boson, but is also consistent with standard-model background processes

from Lindau (Germany)*

The biggest laboratory in the world may have found the particle that gives mass to all other particles, the long-sought Higgs boson. It was the missing piece of a scientific puzzle called the standard model, the theoretical framework formulated over recent decades to explain the particles and forces present in the visible matter of the Universe. After analyzing trillions of proton collisions produced in 2011 and in part of this year at the Large Hadron Collider (LHC), physicists from the two largest independently run experiments at CERN, the European Organization for Nuclear Research, announced on Wednesday (the 4th), near Geneva (Switzerland), the discovery of a new particle that has almost all the characteristics of the Higgs boson, although they cannot yet say for certain whether it is specifically that boson or some other kind.

“We observe in our data clear signs of a new particle in the mass region around 126 GeV (giga-electron volts),” said physicist Fabiola Gianotti, spokesperson for the ATLAS experiment. “But we need a little more time to prepare the results for publication.” The information coming from the other CERN experiment, CMS, is practically identical. “The results are preliminary, but the signals we see around the 125 GeV mass region are dramatic. It really is a new particle. We know it must be a boson, and it is the heaviest boson we have found,” said the CMS spokesperson, physicist Joe Incandela. If its mass really is 125 or 126 GeV, the new particle is about as heavy as an atom of the chemical element iodine.

In both experiments, the statistical confidence of the analyses reached the level scientists call 5 sigma, at which the chance of error is roughly one in three million. In other words, at this level of certainty it is possible to speak of a discovery; what is not yet known in detail is the nature of the particle found. “It is incredible that this discovery has happened in my lifetime,” commented Peter Higgs, the British theoretical physicist who, 50 years ago, alongside other scientists, predicted the existence of this type of boson. Later this month, a paper with the LHC data should be submitted to a scientific journal. By the end of the year, when the accelerator will be shut down for at least a year and a half of maintenance, the two experiments should produce more data.
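As a quick check on the 5-sigma figure, the corresponding tail probability of a standard normal distribution can be computed directly (a minimal sketch, using the one-sided convention commonly quoted in particle physics):

```python
from scipy.stats import norm

# One-sided tail probability of a 5-sigma excess under a standard normal
p = norm.sf(5)                        # survival function: P(Z > 5)
print(f"p-value: {p:.2e}")            # ~2.9e-07
print(f"roughly 1 in {1 / p:,.0f}")   # ~1 in 3.5 million
```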

“I’ve been laughing all day”
In Lindau, a small town in southern Germany on the shore of Lake Constance at the border with Austria and Switzerland, where the 62nd Nobel Laureate Meeting is taking place this week, researchers celebrated the news from the CERN experiments. Since this year’s meeting is devoted to physics, there was no shortage of laureates of science’s highest honor to comment on the feat. “We don’t know if it is the (Higgs) boson, but it is a boson,” said theoretical physicist David J. Gross, of the University of California, winner of the 2004 Nobel for the discovery of asymptotic freedom. “I’ve been laughing all day.” Experimental physicist Carlo Rubbia, former director general of CERN and winner of the 1984 Nobel for work that led to the identification of two types of bosons (the W and the Z), took the same line. “We are looking at a milestone,” he said.

Perhaps with slightly less enthusiasm, but still acknowledging the enormous importance of the CERN finding, two other Nobel laureates weighed in on the news of the day. “It is something we have been expecting for years,” said Dutch theoretical physicist Martinus Veltman, who received the prize in 1999. “The standard model has gained another degree of validity.” For American cosmologist George Smoot, winner of the 2006 Nobel for the discovery of the cosmic background radiation (a relic of the Big Bang, the primordial explosion that created the Universe), it should still take two or three years for scientists to really know what kind of new particle has actually been discovered. If the new particle turns out not to be the Higgs boson, Smoot said, it would be “wonderful if it were something related to dark matter,” a mysterious component that, alongside visible matter and the even less understood dark energy, is thought to be one of the pillars of the Universe.

Particles with the properties of the Higgs boson cannot be measured directly, but their existence, however fleeting, would leave traces, and those traces can be detected in a particle accelerator as powerful as the LHC. Unstable and short-lived, Higgs bosons survive for a tiny fraction of a second before decaying into lighter particles, which in turn decay as well, giving rise to still lighter particles. The standard model predicts that, depending on its mass, the Higgs boson should decay through different channels, that is, into distinct combinations of lighter particles, such as two photons or four leptons. The CERN experiments, which involved about 6,000 physicists, found nearly unequivocal evidence of the decay modes that would be the typical signature of the Higgs boson.

*Journalist Marcos Pivetta traveled to Lindau at the invitation of the DAAD (German Academic Exchange Service)

Preventing environmental catastrophes (FAPERJ)

Vilma Homero

05/07/2012

[Photo: Nelson Fernandes / UFRJ. New methods may predict where and when landslides will occur in the mountain region]

When several areas of Nova Friburgo, Petrópolis and Teresópolis suffered landslides in January 2011, burying more than a thousand people under tons of mud and debris, the question left hanging was whether the disaster could have been mitigated. If the Institute of Geosciences of the Federal University of Rio de Janeiro (UFRJ) has its way, the consequences of environmental cataclysms like these will become ever smaller. To that end, researchers are developing a series of multidisciplinary projects to build risk-analysis systems. One of them is Prever, which, with the support of computer programs, combines advances in remote sensing, geoprocessing, geomorphology and geotechnical methodologies with mathematical weather-forecasting models for the areas most susceptible to landslides, such as the mountain region. “Although the reality of the various municipalities in that region differs considerably, they share a lack of methodologies aimed at forecasting this type of risk. The essential thing now is to develop methods capable of predicting the spatial and temporal location of these processes; in other words, knowing ‘where’ and ‘when’ these landslides may occur,” explains geologist Nelson Ferreira Fernandes, professor in UFRJ’s Department of Geography and a FAPERJ ‘Scientist of Our State’.

To build real-time risk-forecasting methods that include mass movements triggered by rainfall inputs, the researchers are producing maps from successive satellite images, which are cross-referenced with geological and geotechnical maps. “Prever combines climate simulation models and forecasts of extreme rainfall events, developed in meteorology, with mathematical forecasting models, plus the information produced by geomorphology and geotechnics, which indicates the areas most susceptible to landslides. In this way, we can draw up real-time risk forecasts, classifying the results according to the severity of the risk, which varies continuously in space and time,” Nelson explains.

To this end, the Departments of Geography, Geology and Meteorology of UFRJ’s Institute of Geosciences have joined forces with the Faculty of Geology of the Rio de Janeiro State University (Uerj) and the Department of Civil Engineering of the Pontifical Catholic University (PUC-Rio). By overlaying information, the resulting images can point to the areas most sensitive to landslides. “By adding this academic knowledge to data from state agencies, such as the Disaster Analysis Center (Nade) of the Department of Mineral Resources (DRM-RJ), which provides technical support to the Civil Defense, we will not only be constantly updating the maps used today by state government agencies and the Civil Defense, but also enabling more precise planning for decision-making.”
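Purely as an illustration of the kind of map overlay described above (the grids, weights and thresholds below are hypothetical, not the actual Prever inputs), a real-time risk classification could be sketched like this:

```python
import numpy as np

# Hypothetical inputs: a static susceptibility map (0-1, derived from
# geological/geotechnical data) and forecast accumulated rainfall (mm)
# on the same grid of cells.
rng = np.random.default_rng(4)
susceptibility = rng.random((100, 100))
forecast_rain_mm = rng.gamma(shape=2.0, scale=30.0, size=(100, 100))

# Hypothetical combination rule: hazard grows with both susceptibility
# and forecast rainfall; thresholds split it into four risk classes.
hazard = susceptibility * (forecast_rain_mm / 100.0)
risk_class = np.digitize(hazard, bins=[0.3, 0.6, 0.9])  # 0=low ... 3=critical

for label, n in zip(["low", "moderate", "high", "critical"],
                    np.bincount(risk_class.ravel(), minlength=4)):
    print(f"{label:9s}: {n} cells")
```

In the real system, the combination rule would come from calibrated geotechnical and meteorological models, not a single product of two grids.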

[Image: Divulgação / UFRJ. A simulation showing the possibility of a mass landslide in the Jacarepaguá region]

This new mapping also means better quality, greater precision and more detailed imagery. “Obviously, with better instruments in hand, meaning more detailed and accurate maps, public managers will also be able to plan and act more accurately and in real time,” says Nelson. According to the researcher, these maps need constant updating to keep pace with the dynamics of human occupation interfering with the topography of the various regions. “This has been happening through the cutting of slopes, the occupation of landfill areas, and changes resulting from the drainage of rivers. All of this alters the topography and, in the case of heavier and prolonged rains, can make certain soils more prone to landslides or to flooding,” Nelson notes.

But the systems for analyzing disasters and environmental risks also comprise other lines of research. Prever works along two distinct lines of action. “One of them is climate, in which we detect the areas where there will be a long-term increase in rainfall and provide information to decision-making and planning bodies. The other is very short-term forecasting, so-called nowcasting.” On the long-term side, professor Ana Maria Bueno Nunes, of the same university’s Department of Meteorology, has been working on the project “Implementação de um Sistema de Modelagem Regional: Estudos de Tempo e Clima” (Implementation of a Regional Modeling System: Weather and Climate Studies), under her coordination, which proposes a reconstruction of South America’s hydroclimate as an extension of that project.

“By combining satellite precipitation data with information from atmospheric stations, it is possible, through computational modeling, to produce precipitation estimates. That way, we can not only know when heavier or more prolonged rains will occur, but also look at past maps to see which convergence of factors produced a disaster situation. Reconstruction is a way of studying the past to understand present scenarios that look similar. And with that, we help improve the forecast models,” says Ana. This information, which will initially serve academic and scientific use, will provide increasingly detailed data on how major rains form, the kind capable of causing floods in certain areas. “This will allow us not only to better understand the conditions under which certain calamities happen, but also to predict when those conditions may recur. With the project, we are also training even more specialized human resources in this area,” says the researcher, whose work is funded by a Research Grant (APQ 1).

Also a member of the project, professor Gutemberg Borges França, of UFRJ, explains that there are three types of weather forecasting: synoptic, which makes forecasts on a scale of roughly 6 hours to seven days, covering a few thousand kilometers, such as the South American continent; mesoscale, which covers roughly 6 hours to two days over a few hundred kilometers, such as the state of Rio de Janeiro; and short-term forecasting, or nowcasting, which ranges from a few minutes up to 3 to 6 hours over a specific area of a few kilometers, such as the Rio de Janeiro metropolitan region.

If long-term forecasts are important, so are short-term ones, or nowcasting. According to Gutemberg, current numerical forecast models are still deficient for short-term prediction, which ends up being done largely on the basis of the meteorologist’s experience, by interpreting information from the various available data sources, such as satellite images; surface and upper-air meteorological stations; radar and sodar (Sonic Detection and Ranging); and numerical models. “Even today, however, the meteorologist lacks objective tools to help integrate these diverse sources of information into a more accurate short-term forecast,” Gutemberg argues.

Rio de Janeiro already has satellite receiving stations, an upper-air (radiosonde) station that generates atmospheric profiles, surface meteorological stations and radar. Since 2005, the Applied Meteorology Laboratory of UFRJ’s Department of Meteorology has been developing short-term forecasting tools using computational intelligence, aimed at improving forecasts of extreme weather events for Rio de Janeiro. “With computational intelligence, we get that information faster and more accurately,” he sums up.


This summer is ‘what global warming looks like’ (AP) + related & reactions

Jul 3, 1:10 PM EDT

By SETH BORENSTEIN
AP Science Writer

AP Photo/Matthew Barakat

WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.

Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.

These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.

Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.

And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.

But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.

So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.

“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”

Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s “I told you so” time, he said.

As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of “unprecedented extreme weather and climate events.” Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, “It’s really dramatic how many of the patterns that we’ve talked about as the expression of the extremes are hitting the U.S. right now.”

“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disasters.”

Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.

Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect “non-tornadic wind events” like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.

Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.

Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.

“In the future you would expect larger, longer more intense heat waves and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.

The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.

While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”

But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”

Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/

U.S. weather records:

http://www.ncdc.noaa.gov/extremes/records/

Seth Borenstein can be followed at http://twitter.com/borenbears


*   *   *

July 3, 2012

To Predict Environmental Doom, Ignore the Past

http://www.realclearscience.com

By Todd Myers

The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986

One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”

The paper predicts we are rapidly approaching a moment of “planetary-scale critical transition,” due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will “require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world’s energy budget that is supplied by sources other than fossil fuels,” and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like The Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich’s assumptions and cites his work more than once.

The Nature article, however, suffers from numerous simple statistical errors and rests on assumptions rather than evidence. Its authors do nothing to address the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that, with improved data, this time their predictions of doom are correct.

Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.

“Assuming this does not change”

During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.

Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:

“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)

In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
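To make the flat-line assumption concrete: holding transformed land per person fixed and multiplying by population is all the projection amounts to (a sketch using rough round-number world population figures, which are illustrative rather than the paper’s exact inputs):

```python
# The flat-line logic the op-ed criticizes: 0.92 ha of 'transformed land'
# per person is assumed to hold at *all* times, past and future.
HA_PER_PERSON = 0.92

# Rough round-number world populations, in billions (illustrative only)
population_bn = {1800: 1.0, 1900: 1.65, 1950: 2.5, 2011: 7.0, 2045: 9.0}

for year in sorted(population_bn):
    land_bn_ha = HA_PER_PERSON * population_bn[year]
    print(f"{year}: {land_bn_ha:.2f} billion ha of transformed land implied")
# Any change in efficiency (ha per person) over two centuries is assumed away.
```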

Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.

A Paradigm of Catastrophe

What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm argues a simple model that eventually we will run out of space and resources, and determining the date of ultimate doom is a simple matter of doing the math.

Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, they describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.

The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions “between overlapping complex systems, however, are proving difficult to characterize mathematically,” they base their conclusions on the simplest linear mathematical estimate that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.

Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.

Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.

Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.

Todd Myers is the Environmental Director of the Washington Policy Center and author of the book Eco-Fads.

*   *   *

Washington Policy Center exposed: Todd Myers

The Washington Policy Center labels itself as a non-partisan think tank. That is a mischaracterization, to say the least, but it is their bread and butter. Based in Seattle, with a director in Spokane, the WPC’s mission is to “promote free-market solutions through research and education.” It makes sense that they have an environmental director in the form of Todd Myers, who has a new book called “Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment.” You know, since polar bears love to swim.


From the WPC’s newsletter:

Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”

As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.

Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.

By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).

This is what the newsletter doesn’t tell you about Todd Myers.

Myers has spoken at the Heartland Institute’s International Conference on Climate Change. In case you didn’t know, the Heartland Institute has received significant funding from ExxonMobil, Phillip Morris and numerous other corporations and conservative foundations with vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.

Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it “The Heritage Foundation of the Northwest.”

*   *   *

 

Did climate change ’cause’ the Colorado wildfires?

By David Roberts

29 Jun 2012 1:50 PM

http://grist.org

Photo by USAF.

The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.

Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.

What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn’t properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call “a fire.” It was what philosophers call the proximate cause, the most immediate, the closest.

All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn’t make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they’re also called “ultimate” causes. It’s meaning we usually want.

When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.

When we talk about, not fires themselves, but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it’s not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.

So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.

What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.

The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.

One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.

With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”

When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.

For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.

That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.

There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.

That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.

And I’m afraid that, in coming years, it will become all-too familiar.

*   *   *

 

Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004

http://pielkeclimatesci.wordpress.com

Photo from June 26, 2012, showing the start of the Flagstaff fire near Boulder, Colorado

I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled

Climate Dimensions of the Water Cycle by Judy Curry

First, there is an insightful statement by Judy where she writes in slide 5

CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.

We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.

With respect to the current hot and dry weather, the paper referenced by Judy in her Powerpoint talk

Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101

has the abstract [highlight added]

More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.

They also present the figure below, titled “Impact of AMO, PDO on 20-yr drought frequency (1900-1999)”. The panels correspond to A: Warm PDO, cool AMO; B: Cool PDO, cool AMO; C: Warm PDO, warm AMO; and D: Cool PDO, warm AMO.

The current Drought Monitor analysis shows remarkable agreement with panel D, as shown below.

As Judy shows in her talk (slide 8), since 1995 we have been in a warm phase of the AMO, and we have entered a cool phase of the PDO. This corresponds to panel D in the figure above. Thus the current drought and heat are not an unprecedented event but part of the variations in atmosphere-ocean circulation features that we have seen in the past. This reinforces what Judy wrote, that

[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc.)

in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, to give just a few excellent examples, by Joe D’Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts’s weblog Watts Up With That.


*   *   *

Hotter summers could be a part of Washington’s future

http://www.washingtonpost.com

Published: July 5

As relentless heat continues to pulverize Washington, the conversation has evolved from when will it end to what if it never does?

Are unbroken weeks of sweltering weather becoming the norm rather than the exception?

The answer to the first question is simple: Yes, it will end. Probably by Monday.

The answer to the second, however, is a little more complicated.

Call it a qualified yes.

“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”

Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.

Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.

To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.

“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”

Another visitor, who sat nearby just across from the White House, shared a similar view.

“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.

Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.

If that worst prediction comes true, 98 degrees (the current seasonable high of about 88 plus the projected 10) will become the new normal at this time of year in Washington 88 years from now.

Will every passing year till then break records?

“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”

If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.

The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.


Political Scientists Are Lousy Forecasters (N.Y.Times)

OPINION


By JACQUELINE STEVENS
Published: June 23, 2012

DESPERATE “Action Alerts” land in my in-box. They’re from the American Political Science Association and colleagues, many of whom fear grave “threats” to our discipline. As a defense, they’ve supplied “talking points” we can use to tell Congressional representatives that political science is a “critical part of our national science agenda.”

Political scientists are defensive these days because in May the House passed an amendment to a bill eliminating National Science Foundation grants for political scientists. Soon the Senate may vote on similar legislation. Colleagues, especially those who have received N.S.F. grants, will loathe me for saying this, but just this once I’m sympathetic with the anti-intellectual Republicans behind this amendment. Why? The bill incited a national conversation about a subject that has troubled me for decades: the government — disproportionately — supports research that is amenable to statistical analyses and models even though everyone knows the clean equations mask messy realities that contrived data sets and assumptions don’t, and can’t, capture.

It’s an open secret in my discipline: in terms of accurate political predictions (the field’s benchmark for what counts as science), my colleagues have failed spectacularly and wasted colossal amounts of time and money. The most obvious example may be political scientists’ insistence, during the cold war, that the Soviet Union would persist as a nuclear threat to the United States. In 1993, in the journal International Security, for example, the cold war historian John Lewis Gaddis wrote that the demise of the Soviet Union was “of such importance that no approach to the study of international relations claiming both foresight and competence should have failed to see it coming.” And yet, he noted, “None actually did so.” Careers were made, prizes awarded and millions of research dollars distributed to international relations experts, even though Nancy Reagan’s astrologer may have had superior forecasting skills.

Political prognosticators fare just as poorly on domestic politics. In a peer-reviewed journal, the political scientist Morris P. Fiorina wrote that “we seem to have settled into a persistent pattern of divided government” — of Republican presidents and Democratic Congresses. Professor Fiorina’s ideas, which synced nicely with the conventional wisdom at the time, appeared in an article in 1992 — just before the Democrat Bill Clinton’s presidential victory and the Republican 1994 takeover of the House.

Alas, little has changed. Did any prominent N.S.F.-financed researchers predict that an organization like Al Qaeda would change global and domestic politics for at least a generation? Nope. Or that the Arab Spring would overthrow leaders in Egypt, Libya and Tunisia? No, again. What about proposals for research into questions that might favor Democratic politics and that political scientists seeking N.S.F. financing do not ask — perhaps, one colleague suggests, because N.S.F. program officers discourage them? Why are my colleagues kowtowing to Congress for research money that comes with ideological strings attached?

The political scientist Ted Hopf wrote in a 1993 article that experts failed to anticipate the Soviet Union’s collapse largely because the military establishment played such a big role in setting the government’s financing priorities. “Directed by this logic of the cold war, research dollars flowed from private foundations, government agencies and individual military bureaucracies.” Now, nearly 20 years later, the A.P.S.A. Web site trumpets my colleagues’ collaboration with the government, “most notably in the area of defense,” as a reason to retain political science N.S.F. financing.

Many of today’s peer-reviewed studies offer trivial confirmations of the obvious and policy documents filled with egregious, dangerous errors. My colleagues now point to research by the political scientists and N.S.F. grant recipients James D. Fearon and David D. Laitin that claims that civil wars result from weak states, and are not caused by ethnic grievances. Numerous scholars have, however, convincingly criticized Professors Fearon and Laitin’s work. In 2011 Lars-Erik Cederman, Nils B. Weidmann and Kristian Skrede Gleditsch wrote in the American Political Science Review that “rejecting ‘messy’ factors, like grievances and inequalities,” which are hard to quantify, “may lead to more elegant models that can be more easily tested, but the fact remains that some of the most intractable and damaging conflict processes in the contemporary world, including Sudan and the former Yugoslavia, are largely about political and economic injustice,” an observation that policy makers could glean from a subscription to this newspaper and that nonetheless is more astute than the insights offered by Professors Fearon and Laitin.

How do we know that these examples aren’t atypical cherries picked by a political theorist munching sour grapes? Because in the 1980s, the political psychologist Philip E. Tetlock began systematically quizzing 284 political experts — most of whom were political science Ph.D.’s — on dozens of basic questions, like whether a country would go to war, leave NATO or change its boundaries, or whether a political leader would remain in office. His book “Expert Political Judgment: How Good Is It? How Can We Know?” won the A.P.S.A.’s prize for the best book published on government, politics or international affairs.

Professor Tetlock’s main finding? Chimps randomly throwing darts at the possible outcomes would have done almost as well as the experts.

These results wouldn’t surprise the guru of the scientific method, Karl Popper, whose 1934 book “The Logic of Scientific Discovery” remains the cornerstone of the scientific method. Yet Mr. Popper himself scoffed at the pretensions of the social sciences: “Long-term prophecies can be derived from scientific conditional predictions only if they apply to systems which can be described as well-isolated, stationary, and recurrent. These systems are very rare in nature; and modern society is not one of them.”

Government can — and should — assist political scientists, especially those who use history and theory to explain shifting political contexts, challenge our intuitions and help us see beyond daily newspaper headlines. Research aimed at political prediction is doomed to fail. At least if the idea is to predict more accurately than a dart-throwing chimp.

To shield research from disciplinary biases of the moment, the government should finance scholars through a lottery: anyone with a political science Ph.D. and a defensible budget could apply for grants at different financing levels. And of course government needs to finance graduate student studies and thorough demographic, political and economic data collection. I look forward to seeing what happens to my discipline and politics more generally once we stop mistaking probability studies and statistical significance for knowledge.
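Stevens’s lottery idea is concrete enough to sketch in code. What follows is a minimal illustration only, assuming the sole eligibility checks are the two she names (a political science Ph.D. and a defensible budget); the function, field names and tier structure are all hypothetical, since the op-ed specifies no mechanism beyond that.

    import random

    # Illustrative sketch only: funding by lottery among anyone with a
    # political science Ph.D. and a defensible budget, at different
    # financing levels. All names, fields and tiers below are
    # hypothetical; the op-ed specifies no further mechanism.

    def run_grant_lottery(applicants, tiers, seed=None):
        """applicants: list of dicts with 'name', 'has_phd', 'budget_ok'.
        tiers: dict mapping a financing level to its number of awards.
        Returns a dict mapping each financing level to its winners."""
        rng = random.Random(seed)
        pool = [a["name"] for a in applicants if a["has_phd"] and a["budget_ok"]]
        rng.shuffle(pool)  # every eligible applicant has equal odds
        awards = {}
        for tier, n_awards in tiers.items():
            awards[tier] = [pool.pop() for _ in range(min(n_awards, len(pool)))]
        return awards

    if __name__ == "__main__":
        applicants = [
            {"name": "Alvarez", "has_phd": True, "budget_ok": True},
            {"name": "Baker", "has_phd": True, "budget_ok": True},
            {"name": "Chen", "has_phd": False, "budget_ok": True},  # screened out
            {"name": "Diallo", "has_phd": True, "budget_ok": True},
        ]
        print(run_grant_lottery(applicants, {"small": 2, "large": 1}, seed=42))

The point of the sketch is simply that the selection step itself is trivial to make free of disciplinary bias: once eligibility is screened, chance does the rest.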

Jacqueline Stevens is a professor of political science at Northwestern University and the author, most recently, of “States Without Nations: Citizenship for Mortals.”

A version of this op-ed appeared in print on June 24, 2012, on page SR6 of the New York edition with the headline: Political Scientists Are Lousy Forecasters.

How “sustainability” became “sustained growth” (The Guardian)

The Rio Declaration rips up the basic principles of environmental action.

BY GLOBAL JUSTICE ECOLOGY PROJECT | JUNE 23, 2012 · 9:25 AM

By George Monbiot, published on the Guardian’s website

June 22, 2012. In 1992 world leaders signed up to something called “sustainability”. Few of them were clear about what it meant; I suspect that many of them had no idea. Perhaps as a result, it did not take long for this concept to mutate into something subtly different: “sustainable development”. Then it made a short jump to another term: “sustainable growth”. And now, in the 2012 Earth Summit text that world leaders are about to adopt, it has subtly mutated once more: into “sustained growth”.

This term crops up 16 times in the document, where it is used interchangeably with sustainability and sustainable development. But if sustainability means anything, it is surely the opposite of sustained growth. Sustained growth on a finite planet is the essence of unsustainability.

As Robert Skidelsky, who comes at this issue from a different angle, observes in the Guardian today:

“Aristotle knew of insatiability only as a personal vice; he had no inkling of the collective, politically orchestrated insatiability that we call economic growth. The civilization of “always more” would have struck him as moral and political madness. And, beyond a certain point, it is also economic madness. This is not just or mainly because we will soon enough run up against the natural limits to growth. It is because we cannot go on for much longer economising on labour faster than we can find new uses for it.”

Several of the more outrageous deletions proposed by the United States – such as any mention of rights or equity or of common but differentiated responsibilities – have been rebuffed. In other respects the Obama government’s purge has succeeded, striking out such concepts as “unsustainable consumption and production patterns” and the proposed decoupling of economic growth from the use of natural resources.

At least the states due to sign this document haven’t ripped up the declarations from the last Earth Summit, 20 years ago. But in terms of progress since then, that’s as far as it goes. Reaffirming the Rio 1992 commitments is perhaps the most radical principle in the entire declaration.

As a result, the draft document, which seems set to become the final document, takes us precisely nowhere. 190 governments have spent 20 years bracing themselves to “acknowledge”, “recognise” and express “deep concern” about the world’s environmental crises, but not to do anything about them.

This paragraph from the declaration sums up the problem for me:

“We recognize that the planet Earth and its ecosystems are our home and that Mother Earth is a common expression in a number of countries and regions and we note that some countries recognize the rights of nature in the context of the promotion of sustainable development. We are convinced that in order to achieve a just balance among the economic, social and environment needs of present and future generations, it is necessary to promote harmony with nature.”

It sounds lovely, doesn’t it? It could be illustrated with rainbows and psychedelic unicorns and stuck on the door of your toilet. But without any proposed means of implementation, it might just as well be deployed for a different function in the same room.

The declaration is remarkable for its absence of figures, dates and targets. It is as stuffed with meaningless platitudes as an advertisement for payday loans, but without the necessary menace. There is nothing to work with here, no programme, no sense of urgency or call for concrete action beyond the inadequate measures already agreed in previous flaccid declarations. Its tone and contents would be better suited to a retirement homily than a response to a complex of escalating global crises.

The draft and probably final declaration is 283 paragraphs of fluff. It suggests that the 190 governments due to approve it have, in effect, given up on multilateralism, given up on the world and given up on us. So what do we do now? That is the topic I intend to address in my column next week.

Rio+20 ends with timid results and postponed promises (BBC)

Updated June 22, 2012 – 16:20 (Brasília) / 19:20 GMT

Indigenous man | Photo: Agência Brasil. Two decades after Eco-92, Rio+20 has not produced answers to the main questions of our time.

On the last day of Rio+20, the United Nations Conference on Sustainable Development, UN Secretary-General Ban Ki-moon called on all governments to eliminate hunger from the world. He said that, in a populous world, no one should go hungry.

The conference’s final phase also saw pledges from various countries and companies on issues such as clean energy.

Even so, a group of veteran politicians joined environmental organizations in their assessment that the meeting’s final declaration was the result of a “failure of leadership.”

In the view of Britain’s deputy prime minister, Nick Clegg, the outcome of the discussions can be described as “insipid.”

The meeting, which marked 20 years since the landmark Earth Summit, also held in Rio de Janeiro, in 1992, and 40 years since the first world conference on the subject, in Stockholm, was intended to spur new measures toward a “green economy.”

But the declaration, which was finalized by negotiators from the various governments on Tuesday and which ministers and heads of state and government chose not to reopen, treats the green economy as just one of many paths toward sustainable development.

Mary Robinson, the former Irish president who also served as UN High Commissioner for Human Rights, said the document’s terms do not go far enough.

“This is one of those once-in-a-generation moments when the world needs vision, commitment and, above all, leadership,” she said. “Sadly, the current document is a failure of leadership,” she added, echoing the British deputy prime minister’s remarks.

Former president Fernando Henrique Cardoso said the declaration delivers benefits neither for environmental protection nor for human development.

“This old division between the environment and development is not the way to solve the problems we are creating for our grandchildren and great-grandchildren,” he said. “We have to accept that the solutions to poverty and inequality lie in sustainable development, not in growth at any cost.”

The UN secretary-general had hoped the meeting would adopt firmer measures to guarantee the poorest people access to water, energy and food. Instead, his flagship Sustainable Energy for All initiative was merely mentioned in the text rather than receiving the leaders’ emphatic backing.

Hope

In the meeting’s final phase, Ban Ki-moon challenged the world’s governments to do more.

“In a world of plenty, no one, not a single person, should go hungry,” he said. “I invite all of you to join me in working for a future without hunger,” he added, before an audience estimated at 130 heads of state and government.

“Although the world produces enough food to feed every inhabitant of the planet, there are more people going hungry today than at the last meeting held in Rio, in 1992.”

Barbara Stocking, executive director of Oxfam International

It is currently estimated that nearly 1 billion people, one seventh of the world’s population, live in chronic hunger, while another billion do not get adequate nutrition.

Measures that could help end this situation include cutting food waste (nearly a third of all food produced is thrown away in rich countries, and an even larger share in poorer countries, for different reasons) and doubling the productivity of small farms.

The challenge draws partly on the Fome Zero (Zero Hunger) program created in Brazil by former president Luiz Inácio Lula da Silva.

“Ban Ki-moon’s announcement is a welcome ray of hope in a conference that has been shamefully marked by a lack of progress,” said Barbara Stocking, head of the international NGO Oxfam.

“Despite the fact that the world produces enough food for everyone, there are more hungry people today than in 1992, when the world last gathered in Rio,” she added.

For now, however, all that exists in concrete terms is a challenge. There is no money, and no change in how the UN itself approaches the issue of hunger.

Alongside the main negotiations in Rio, companies and governments signed more than 200 voluntary commitments to action in different areas.

Energy, water and food are part of this package, although most of the pledges involve incorporating sustainable development into educational programs.

Indigenous peoples criticize REDD (IPS)

Envolverde Rio+20
June 21, 2012 – 10:27 am

By Clarinha Glock, IPS


Protests against the commodification of nature. Photo: Mario Osava.

Rio de Janeiro, Brazil, June 21, 2012 (TerraViva) – The indigenous peoples gathered at the Kari-Oca village plan to deliver a document today (the 21st) to President Dilma Rousseff at Riocentro. According to Berenice Sanches Nahua, 30, a member of the Global Alliance of Indigenous Peoples and Local Communities on Climate Change and against REDD (Reducing Emissions from Deforestation and Degradation), the Kari-Oca 2 Declaration (the first was at Rio-92) reaffirms concern over the “sham” of the green economy, which puts a price on what indigenous peoples hold most sacred, takes their territory and violates the rights of Mother Earth. “We hope the representatives at Rio+20 open their minds and hearts and realize that there is nothing left to do but defend Mother Earth and her children,” Berenice said.

The village was inaugurated by the indigenous leader Marcos Terena and was set up in Jacarepaguá. The Declaration states: “Since Rio 1992, we as Indigenous Peoples have seen colonialism transformed into the basis of the globalization of trade and of the world’s capitalist economic hegemony. The exploitation and plunder of the world’s ecosystems and biodiversity have intensified, along with the violation of the inherent rights of indigenous peoples. Our right to self-determination, to our own governance and to our freely determined development, and our inherent rights to our lands, territories and resources, are under growing attack from a collaboration of governments and transnational corporations.”

The indigenous peoples add: “We call on the UN to begin its implementation and to ensure the full, formal and effective participation of indigenous peoples in all processes and activities of the Rio+20 Conference and beyond, in accordance with the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP) and the principle of free, prior and informed consent (FPIC).” The declaration closes with the sentence that opened the 1992 Kari-Oca Declaration: “We walk to the future in the footprints of our ancestors.” (TerraViva)

* Originally published on the TerraViva website.

The Rio roadmap is disappointing, civil society groups say (IPS)

Envolverde Rio+20
June 21, 2012 – 10:48 am

By Stephen Leahy, IPS


A poster on a wall at Riocentro. Civil society groups say they are “very disappointed” with the formal negotiations at the United Nations Conference on Sustainable Development, Rio+20. Photo: Stephen Leahy/IPS

Rio de Janeiro, Brazil (TerraViva) – “Very disappointing” is how businesses and non-governmental organizations today described the formal intergovernmental negotiations at the United Nations Conference on Sustainable Development, Rio+20.

After two years, negotiators from more than 190 nations have agreed on a 49-page document intended to be the roadmap for the transformation to a green economy. It will be presented to heads of state in Rio de Janeiro at the opening of the summit’s high-level segment on the 20th. UN officials said it was highly unlikely that any changes would be made. The document leaves out the US$ 30 billion fund to finance the transition to a green economy proposed by the Group of 77 (G-77), the bloc of developing nations plus China, and fails to define tangible Sustainable Development Goals (SDGs) to replace the Millennium Development Goals, which expire in 2015.

“This is extremely disappointing…. There is no vision, no money and really no commitments here,” said Lasse Gustavsson, head of the World Wildlife Fund (WWF) international delegation to Rio+20. “Rio+20 should have been about life, about the future of our children and our grandchildren. It should have been about the forests, rivers, lakes and oceans that we all depend on for our food, water and energy security,” he told TerraViva.

The conference stood in stark contrast to the electric “let’s change the world” atmosphere of the first Earth Summit in 1992, said Robert Engelman of the Worldwatch Institute, an international environmental think tank. While the document by and large reconfirms past commitments in a very passive way, it does newly affirm the importance of preserving traditional seeds and considers strengthening the United Nations Environment Programme (UNEP), he told TerraViva.

“This document is a huge disappointment; there is no ambition and little reference to the planetary challenges we face,” said Kiara Worth, representing the Children and Youth group at Rio+20. “The voices of civil society and of future generations will not be heard. We should call this event ‘Rio minus 20,’ because we are going backwards,” she told TerraViva.

Steven Wilson of the International Council for Science, a non-governmental organization representing national scientific bodies and international scientific unions, observed that “the scientific evidence is clear. We will need a global effort in science and technology to meet the greatest challenge humanity has ever faced, and I do not understand why there is no section in the document on science. That sends a very unfortunate message.”

Jeffery Huffines of Civicus: World Alliance for Citizen Participation, an organization headquartered in Johannesburg, South Africa, said that “we have a fundamentally flawed economic system, and we in civil society hoped the world’s governments would recognize that reality, but they did not.” Instead, there are 49 pages of concepts without any commitments or means of moving those concepts forward. The role of civil society participation has been limited. “We need more democratic decision-making, not less,” he stressed. Envolverde/IPS

* Originally published on the TerraViva website.