Tag archive: Mediação tecnológica (Technological Mediation)

Support for Climate Policy Linked to People’s Perceptions About Scientific Agreement Regarding Global Warming (Science Daily)

ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.

A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.

This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.

In the national survey conducted in June 2010, two-thirds of respondents said either that there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), that most scientists think it is not happening (5 percent), or that they did not know enough to say (16 percent). These respondents were less likely to support climate change policies and more likely to view climate change as a lower priority.

By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.

“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.

Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”

Large Gaps Found in Public Understanding of Climate Change (Science Daily)

ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.

The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.

However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.

Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.

“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”

The executive summary and full report are available online: http://environment.yale.edu/climate/publications/knowledge-of-climate-change

The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percent, at 95 percent confidence.
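
For readers who want to see where a figure like that comes from, here is a minimal sketch of the conventional margin-of-error calculation for a sample proportion, assuming simple random sampling and the worst-case proportion p = 0.5 (the survey's actual weighting scheme may differ; only the sample size is taken from the paragraph above).

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a sample proportion at ~95% confidence (z = 1.96),
    assuming simple random sampling and the worst-case proportion p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# 2,030 respondents, as reported above
print(f"{margin_of_error(2030) * 100:.1f} percentage points")  # ~2.2, close to the reported +/-2
```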

Increased Knowledge About Global Warming Leads To Apathy, Study Shows (Science Daily)

ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.

“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”

The study showed that high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.

The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as in the movies An Inconvenient Truth and Ice Age: The Meltdown and in the mainstream media’s escalating emphasis on the trend.

The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.

Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.

“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.

But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.

“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”

Measuring knowledge about global warming is a tricky business, Kellstedt adds.

“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.

“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”

Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.

Despite Awareness Of Global Warming Americans Concerned More About Local Environment (Science Daily)

ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.

Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)

“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”

Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.

A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog.  In the survey, global warming ranks eighth in importance.

“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.

Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.

“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”

Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.

Jews Are a ‘Race,’ Genes Reveal (The Jewish Daily Forward)


By Jon Entine

Published May 04, 2012, issue of May 11, 2012.

In his new book, “Legacy: A Genetic History of the Jewish People,” Harry Ostrer, a medical geneticist and professor at Albert Einstein College of Medicine in New York, claims that Jews are different, and the differences are not just skin deep. Jews exhibit, he writes, a distinctive genetic signature. Considering that the Nazis tried to exterminate Jews based on their supposed racial distinctiveness, such a conclusion might be a cause for concern. But Ostrer sees it as central to Jewish identity.

“Who is a Jew?” has been a poignant question for Jews throughout our history. It evokes a complex tapestry of Jewish identity made up of different strains of religious beliefs, cultural practices and blood ties to ancient Palestine and modern Israel. But the question, with its echoes of genetic determinism, also has a dark side.

Geneticists have long been aware that certain diseases, from breast cancer to Tay-Sachs, disproportionately affect Jews. Ostrer, who is also director of genetic and genomic testing at Montefiore Medical Center, goes further, maintaining that Jews are a homogeneous group with all the scientific trappings of what we used to call a “race.”

For most of the 3,000-year history of the Jewish people, the notion of what came to be known as “Jewish exceptionalism” was hardly controversial. Because of our history of inmarriage and cultural isolation, imposed or self-selected, Jews were considered by gentiles (and usually referred to themselves) as a “race.” Scholars from Josephus to Disraeli proudly proclaimed their membership in “the tribe.”


Legacy: A Genetic History of the Jewish People
By Harry Ostrer
Oxford University Press, 288 Pages, $24.95

Ostrer explains how this concept took on special meaning in the 20th century, as genetics emerged as a viable scientific enterprise. Jewish distinctiveness might actually be measurable empirically. In “Legacy,” he first introduces us to Maurice Fishberg, an upwardly mobile Russian-Jewish immigrant to New York at the fin de siècle. Fishberg fervently embraced the anthropological fashion of the era, measuring skull sizes to explain why Jews seemed to be afflicted with more diseases than other groups — what he called the “peculiarities of the comparative pathology of the Jews.” It turns out that Fishberg and his contemporary phrenologists were wrong: Skull shape provides limited information about human differences. But his studies ushered in a century of research linking Jews to genetics.

Ostrer divides his book into six chapters representing the various aspects of Jewishness: Looking Jewish, Founders, Genealogies, Tribes, Traits and Identity. Each chapter features a prominent scientist or historical figure who dramatically advanced our understanding of Jewishness. The snippets of biography lighten a dense forest of sometimes-obscure science. The narrative, which consists of a lot of potboiler history, is a slog at times. But for the specialist and anyone touched by the enduring debate over Jewish identity, this book is indispensable.

“Legacy” may cause its readers discomfort. To some Jews, the notion of a genetically related people is an embarrassing remnant of early Zionism that came into vogue at the height of the Western obsession with race, in the late 19th century. Celebrating blood ancestry is divisive, they claim: The authors of “The Bell Curve” were vilified 15 years ago for suggesting that genes play a major role in IQ differences among racial groups.

Furthermore, sociologists and cultural anthropologists, a disproportionate number of whom are Jewish, ridicule the term “race,” claiming there are no meaningful differences between ethnic groups. For Jews, the word still carries the especially odious historical association with Nazism and the Nuremberg Laws. They argue that Judaism has morphed from a tribal cult into a worldwide religion enhanced by thousands of years of cultural traditions.

Is Judaism a people or a religion? Or both? The belief that Jews may be psychologically or physically distinct remains a controversial fixture in the gentile and Jewish consciousness, and Ostrer places himself directly in the line of fire. Yes, he writes, the term “race” carries nefarious associations of inferiority and ranking of people. Anything that marks Jews as essentially different runs the risk of stirring either anti- or philo-Semitism. But that doesn’t mean we can ignore the factual reality of what he calls the “biological basis of Jewishness” and “Jewish genetics.” Acknowledging the distinctiveness of Jews is “fraught with peril,” but we must grapple with the hard evidence of “human differences” if we seek to understand the new age of genetics.

Although he readily acknowledges the formative role of culture and environment, Ostrer believes that Jewish identity has multiple threads, including DNA. He offers a cogent, scientifically based review of the evidence, which serves as a model of scientific restraint.

“On the one hand, the study of Jewish genetics might be viewed as an elitist effort, promoting a certain genetic view of Jewish superiority,” he writes. “On the other, it might provide fodder for anti-Semitism by providing evidence of a genetic basis for undesirable traits that are present among some Jews. These issues will newly challenge the liberal view that humans are created equal but with genetic liabilities.”

Jews, he notes, are one of the most distinctive population groups in the world because of our history of endogamy. Jews — Ashkenazim in particular — are relatively homogeneous despite the fact that they are spread throughout Europe and have since immigrated to the Americas and back to Israel. The Inquisition shattered Sephardi Jewry, leading to far more instances of intermarriage and to a less distinctive DNA.

In traversing this minefield of the genetics of human differences, Ostrer bolsters his analysis with volumes of genetic data, which are both the book’s greatest strength and its weakness. Two complementary books on this subject — my own “Abraham’s Children: Race, Identity, and the DNA of the Chosen People” and “Jacob’s Legacy: A Genetic View of Jewish History” by Duke University geneticist David Goldstein, who is well quoted in both “Abraham’s Children” and “Legacy” — are more narrative driven, weaving history and genetics, and are consequently much more congenial reads.

The concept of the “Jewish people” remains controversial. The Law of Return, which establishes the right of Jews to come to Israel, is a central tenet of Zionism and a founding legal principle of the State of Israel. The DNA that tightly links Ashkenazi, Sephardi and Mizrahi, three prominent culturally and geographically distinct Jewish groups, could be used to support Zionist territorial claims — except, as Ostrer points out, some of the same markers can be found in Palestinians, our distant genetic cousins, as well. Palestinians, understandably, want their own right of return.

That disagreement over the meaning of DNA also pits Jewish traditionalists against a particular strain of secular Jewish liberals that has joined with Arabs and many non-Jews to argue for an end to Israel as a Jewish nation. Their hero is Shlomo Sand, an Austrian-born Israeli historian who reignited this complex controversy with the 2008 publication of “The Invention of the Jewish People.”

Sand contends that Zionists who claim an ancestral link to ancient Palestine are manipulating history. But he has taken his thesis from novelist Arthur Koestler’s 1976 book, “The Thirteenth Tribe,” which was part of an attempt by post-World War II Jewish liberals to reconfigure Jews not as a biological group, but as a religious ideology and ethnic identity.

The majority of the Ashkenazi Jewish population, as Koestler, and now Sand, writes, are not the children of Abraham but descendants of pagan Eastern Europeans and Eurasians, concentrated mostly in the ancient Kingdom of Khazaria in what is now Ukraine and Western Russia. The Khazarian nobility converted during the early Middle Ages, when European Jewry was forming.

Although scholars challenged Koestler’s and now Sand’s selective manipulation of the facts — the conversion was almost certainly limited to the tiny ruling class and not to the vast pagan population — the historical record has been just fragmentary enough to titillate determined critics of Israel, who turned both Koestler’s and Sand’s books into roaring best-sellers.

Fortunately, re-creating history now depends not only on pottery shards, flaking manuscripts and faded coins, but on something far less ambiguous: DNA. Ostrer’s book is an impressive counterpoint to the dubious historical methodology of Sand and his admirers. And, as a co-founder of the Jewish HapMap — the study of haplotypes, or blocks of genetic markers, that are common to Jews around the world — he is well positioned to write the definitive response.

In accord with most geneticists, Ostrer firmly rejects the fashionable postmodernist dismissal of the concept of race as genetically naive, opting for a more nuanced perspective.

When the human genome was first mapped a decade ago, Francis Collins, then head of the National Human Genome Research Institute, said: “Americans, regardless of ethnic group, are 99.9% genetically identical.” Added J. Craig Venter, who at the time was chief scientist at the private firm that helped sequence the genome, Celera Genomics, “Race has no genetic or scientific basis.” Those declarations appeared to suggest that “race,” or the notion of distinct but overlapping genetic groups, is “meaningless.”

But Collins and Venter have issued clarifications of their much-misrepresented comments. Almost every minority group has faced, at one time or another, being branded as racially inferior based on a superficial understanding of how genes peculiar to its population work. The inclination by politicians, educators and even some scientists to underplay our separateness is certainly understandable. But it’s also misleading. DNA ensures that we differ not only as individuals, but also as groups.

However slight the differences (and geneticists now believe that they are significantly greater than 0.1%), they are defining. That 0.1% contains some 3 million nucleotide pairs in the human genome, and these determine such things as skin or hair color and susceptibility to certain diseases. They contain the map of our family trees back to the first modern humans.

Both the human genome project and disease research rest on the premise of finding distinguishable differences between individuals and often among populations. Scientists have ditched the term “race,” with all its normative baggage, and adopted more neutral terms, such as “population” and “cline,” which have much of the same meaning. Boiled down to its essence, race equates to “region of ancestral origin.”

Ostrer has devoted his career to investigating these extended family trees, which help explain the genetic basis of common and rare disorders. Today, Jews remain identifiable in large measure by the 40 or so diseases we disproportionately carry, the inescapable consequence of inbreeding. He traces the fascinating history of numerous “Jewish diseases,” such as Tay-Sachs, Gaucher, Niemann-Pick, Mucolipidosis IV, as well as breast and ovarian cancer. Indeed, 10 years ago I was diagnosed as carrying one of the three genetic mutations for breast and ovarian cancer that mark my family and me as indelibly Jewish, prompting me to write “Abraham’s Children.”

Like East Asians, the Amish, Icelanders, Aboriginals, the Basque people, African tribes and other groups, Jews have remained isolated for centuries because of geography, religion or cultural practices. It’s stamped on our DNA. As Ostrer explains in fascinating detail, threads of Jewish ancestry link the sizable Jewish communities of North America and Europe to Yemenite and other Middle Eastern Jews who have relocated to Israel, as well as to the black Lemba of southern Africa and to India’s Cochin Jews. But, in a twist, the links include neither the Bene Israel of India nor Ethiopian Jews. Genetic tests show that both groups are converts, contradicting their founding myths.

Why, then, are Jews so different looking, usually sharing the characteristics of the surrounding populations? Think of red-haired Jews, Jews with blue eyes or the black Jews of Africa. Like any cluster — a genetic term Ostrer uses in place of the more inflammatory “race” — Jews throughout history moved around and fooled around, although mixing occurred comparatively infrequently until recent decades. Although there are identifiable gene variations that are common among Jews, we are not a “pure” race. The time machine of our genes may show that most Jews have a shared ancestry that traces back to ancient Palestine but, like all of humanity, Jews are mutts.

About 80% of Jewish males and 50% of Jewish females trace their ancestry back to the Middle East. The rest entered the “Jewish gene pool” through conversion or intermarriage. Those who did intermarry often left the faith in a generation or two, in effect pruning the Jewish genetic tree. But many converts became interwoven into the Jewish genealogical line. Reflect on the iconic convert, the biblical Ruth, who married Boaz and became the great-grandmother of King David. She began as an outsider, but you don’t get much more Jewish than the bloodline of King David!

To his credit, Ostrer also addresses the third rail of discussions about Jewishness and race: the issue of intelligence. Jews were latecomers to the age of freethinking. While the Enlightenment swept through Christian Europe in the 17th century, the Haskalah did not gather strength until the early 19th century. By the beginning of the new millennium, however, Jews were thought of as among the smartest people on earth. The trend is most prominent in America, which has the largest concentration of Jews outside Israel and a history of tolerance.

Although Jews make up less than 3% of the population, they have won more than 25% of the Nobel Prizes awarded to American scientists since 1950. Jews also account for 20% of this country’s chief executives and make up 22% of Ivy League students. Psychologists and educational researchers have pegged their average IQ at 107.5 to 115, with their verbal IQ at more than 120, a stunning standard deviation above the average of 100 found in those of European ancestry. Like it or not, the IQ debate will become an increasingly important issue going forward, as medical geneticists focus on unlocking the mysteries of the brain.

Many liberal Jews maintain, at least in public, that the plethora of Jewish lawyers, doctors and comedians is the product of our cultural heritage, but the science tells a more complex story. Jewish success is a product of Jewish genes as much as of Jewish moms.

Is it “good for the Jews” to be exploring such controversial subjects? We can’t avoid engaging the most challenging questions in the age of genetics. Because of our history of endogamy, Jews are a goldmine for geneticists studying human differences in the quest to cure disease. Because of our cultural commitment to education, Jews are among the top genetic researchers in the world.

As humankind becomes more genetically sophisticated, identity becomes both more fluid and more fixed. Jews in particular can find threads of our ancestry literally anywhere, muddying traditional categories of nationhood, ethnicity, religious belief and “race.” But such discussions, ultimately, are subsumed by the reality of the common shared ancestry of humankind. Ostrer’s “Legacy” points out that — regardless of the pros and cons of being Jewish — we are all, genetically, in it together. And, in doing so, he gets it just right.

Jon Entine is the founder and director of the Genetic Literacy Project at George Mason University, where he is senior research fellow at the Center for Health and Risk Communication. His website is www.jonentine.com.

Read more: http://www.forward.com/articles/155742/jews-are-a-race-genes-reveal/

Bruno Latour: Love Your Monsters (Breakthrough)

Breakthrough Journal, No. 2, Fall 2011


In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.

After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,1 which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”2 Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title, Frankenstein, Or, the Modern Prometheus.3

Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.

Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”

Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.4

Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?

The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.

What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.5 This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.

1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”

Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.

Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.

If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!

The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”

The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”

Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.

The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.

And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in the 1940s.

Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!

Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back on progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”

Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.

Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shall not transgress?” Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?6

In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only a supporting force.

Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.

Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.

To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.

2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.

But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!

Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — this was the contrary, the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.

France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.

Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.

Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.7

The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”

But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, the other, unintended consequences are part and parcel of any action.

3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster has appeared in European and especially French politics after many scandals due to the misplaced belief by state authority in the certainties provided by Science.8

When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring to action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted blood catastrophe of the 1980s ensued: before agreement was produced, hundreds of patients were transfused with blood contaminated by the AIDS virus.9

The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?

Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!

But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.

4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”10 But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.

But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.

If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you the hypocrite who confesses of one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.

The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.

Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?” /

1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.

2. Shelley, Mary W. 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.

3. Ibid.

4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.

5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.

6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.

7. Descola, Philippe. 2005. Par dela nature et culture. Paris: Gallimard.

8. Sadeleer, Nicolas de. 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publ. Ltd.

9. Hermitte, Marie-Angele. 1996. Le Sang Et Le Droit. Essai Sur La Transfusion Sanguine. Paris: Le Seuil.

10. Descartes, Rene. 1637. Discourse on Method in Discourse on Method and Related Writings. Translated by Desmond M. Clark. 1999. Part 6, 44. New York: Penguin.

Brazilian Telenovelas Project the Image of a White Country, Mozambican Writer Criticizes (Agência Brasil)

April 17, 2012, 3:35 pm

Alex Rodrigues
Reporter, Agência Brasil

Brasília – “We are afraid of Brazil.” With this unexpected outburst, Mozambican novelist Paulina Chiziane caught the attention of the audience at the seminar Contemporary African Literature, part of the program of the 1st Bienal do Livro e da Leitura in Brasília (DF). She was referring to the effects of the presence in Mozambique of Brazilian churches and temples and of cultural products such as telenovelas, which, in her view, transmit a false image of the country.

“For us Mozambicans, the image of Brazil is that of a white or, at most, mixed-race country. The only successful Black Brazilian we recognize as such is Pelé. In the telenovelas, which are what define the image we have of Brazil, we only see Black people as porters or domestic servants. At the top [of the social representation] are the whites. This is the image Brazil is selling to the world,” the author said, stressing that these representations help perpetuate the racial and social inequalities that exist in her country.

“From seeing, over and over in the telenovelas, whites giving orders and Blacks sweeping and carrying, Mozambicans come to see that situation as apparently normal,” Paulina argues, pointing to the same social arrangement in her own country.

The presence of Brazilian churches on Mozambican soil also has negative impacts on the country’s culture, in the writer’s assessment. “When one or several churches arrive and tell us that our way of believing is not correct, that the best belief is the one they bring, that means destroying a cultural identity. There is no respect for local beliefs. In African culture, a healer is not only the traditional doctor but also the keeper of part of the people’s history and popular culture,” Paulina emphasized, criticizing the governments of both countries for allowing the intervention of these institutions.

The first woman to publish a book in Mozambique, Paulina tries to escape stereotypes in her work, especially those that confine women to the role of dependents, incapable of thinking for themselves, conditioned only to serve.

“I am very fond of my country’s poets, but I have never found, in the literature that men write, the portrait of a whole woman. It is always the mouth, the legs, a single aspect. Never the infinite wisdom that comes from women,” Paulina said, recalling that, until European colonization, it fell to women to perform the narrative role and to pass on knowledge.

“Before colonialism, art and literature were feminine. It was up to women to tell the stories and, in doing so, socialize the children. With the colonial system and the adoption of the imperial education system, it was men who learned to write and to tell the stories. That is precisely why, even today, there are few women writers in Mozambique,” Paulina said.

“Even after independence [in 1975], we went on writing out of the European education we had received, carrying the stereotypes and prejudices that had been passed on to us. African wisdom properly speaking, the wisdom known to women, remains excluded. Not to mention that more than half of the Mozambican population does not speak Portuguese, and few authors write in other Mozambican languages,” Paulina said.

During the biennial, the Mozambican writer’s book Niketche, uma história de poligamia (Niketche: A Story of Polygamy) was relaunched.

The U.S. Has Fallen Behind in Numerical Weather Prediction: Part I

March 28, 2012 – 05:00 AM
By Dr. Cliff Mass (Twitter @CliffMass)

It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.

What am I talking about? The third rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.

Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
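
To make “numerical solution of the equations” concrete, here is a deliberately tiny sketch, nothing like an operational model (which solves the full primitive equations on a global 3-D grid): it steps a single 1-D advection equation forward in time with an upwind finite-difference scheme. All the numbers are invented for illustration.

```python
import numpy as np

# Toy 1-D advection equation du/dt + c * du/dx = 0, stepped forward in time
# with a first-order upwind finite-difference scheme. Operational NWP models
# solve a far richer set of equations, but the idea of marching a discretized
# state forward step by step is the same.
nx, dx, dt, c = 100, 1.0, 0.5, 1.0               # grid size, spacing, time step, wind speed
u = np.exp(-0.01 * (np.arange(nx) - 50.0) ** 2)  # initial "weather feature" centered at x = 50

for _ in range(40):                              # integrate 40 time steps
    u[1:] -= c * dt / dx * (u[1:] - u[:-1])      # upwind difference (stable since c*dt/dx <= 1)

print(int(u.argmax()))                           # the feature has drifted downstream, near x = 70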

Supercomputers are used for numerical weather prediction.

U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium-Range Weather Forecasting (ECMWF). We have also fallen behind in ensembles (using many model runs to produce probabilistic predictions) and in high-resolution operational forecasting. Decades ago we were the world leader: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong, and I will talk about some of the issues here. Our nation needs to fix it.
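
A word on what “ensembles” means in practice: the model is run many times from slightly perturbed initial states (or with varied physics), and probabilities are read off from the spread of the members. A minimal sketch of the idea; the member values below are invented purely for illustration.

```python
import numpy as np

# Hypothetical 24-hour precipitation forecasts (mm) from a 10-member ensemble;
# the numbers are invented purely to illustrate the idea.
members = np.array([2.0, 0.5, 7.1, 3.3, 0.0, 5.6, 1.2, 9.4, 4.8, 0.3])

# The probabilistic forecast is simply the fraction of members crossing a threshold
prob_over_5mm = float(np.mean(members > 5.0))
print(f"P(precip > 5 mm) = {prob_over_5mm:.0%}")  # 30%
```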

But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.

In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).

The Europeans banded together decades ago to form the European Center for Medium-Range Weather Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.

The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).

There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.

The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in forecasts that are noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. The figure shows the quality of the day-5 forecast of 500 hPa heights (a level about halfway up in the troposphere, approximately 18,000 ft). The top panel is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom panel shows the difference between the U.S. model's skill and that of the other nations' models.

You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.
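
For readers curious about what such skill plots actually measure: the standard metric for 500 hPa verification is the anomaly correlation coefficient, the correlation between forecast and analysis departures from climatology. Below is a simplified sketch (operational verification also removes area means and applies latitude weighting); the fields are synthetic and purely illustrative, not NWS data.

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Anomaly correlation coefficient (ACC): the correlation between forecast
    and verifying-analysis departures from climatology. 1.0 is a perfect
    forecast; skill near 0.6 is often taken as the limit of useful forecasts."""
    f_anom = forecast - climatology
    a_anom = analysis - climatology
    return np.sum(f_anom * a_anom) / np.sqrt(np.sum(f_anom ** 2) * np.sum(a_anom ** 2))

# Synthetic 500 hPa height fields (meters) on a tiny grid, purely illustrative
rng = np.random.default_rng(0)
climo = 5500 + rng.normal(0, 50, (10, 20))   # climatological heights
truth = climo + rng.normal(0, 60, (10, 20))  # verifying analysis
fcst = truth + rng.normal(0, 30, (10, 20))   # a forecast with some error

print(round(float(anomaly_correlation(fcst, truth, climo)), 2))  # roughly 0.9
```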

Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009 to 2012, for the 120-hour forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
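
RMSE is even simpler: the square root of the mean squared difference between the forecast field and the verifying analysis over the region. A small sketch with invented numbers, just to show the calculation:

```python
import numpy as np

def rmse(forecast, analysis):
    """Root-mean-square error of a forecast field against the verifying analysis."""
    return float(np.sqrt(np.mean((forecast - analysis) ** 2)))

# Illustrative numbers only: a forecast whose errors have roughly 30 m spread
rng = np.random.default_rng(1)
analysis = 5500 + rng.normal(0, 60, (10, 20))      # verifying 500 hPa heights (m)
forecast = analysis + rng.normal(0, 30, (10, 20))  # forecast with injected error
print(round(rmse(forecast, analysis), 1))          # close to 30
```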

Let's look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie, and myself comparing various models at 24, 48, and 72 hours for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.

I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at short range but has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.

Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussions, which are available online, you will frequently see that forecasters often depend not on the U.S. model but on the ECMWF. And during the January western Washington snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate-related firm that was moving up to Seattle. I asked what model they were using: the U.S. GFS? He laughed. Of course not; they were using the ECMWF.

A lot of U.S. firms are using the ECMWF and this is very costly, because the Europeans charge a lot to gain access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models–which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs for the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.

The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:

1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous–the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning, relying on heavy-metal IBM machines at very high cost.

2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate the observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We have used an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet.
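
For readers who want a bit more detail (this is the standard textbook formulation, not the specific NCEP or ECMWF implementation): variational data assimilation finds the atmospheric state that best balances a short-range forecast (the "background") against the latest observations. In 3DVAR all observations are treated as if valid at a single analysis time; 4DVAR adds the forecast model itself as a constraint, fitting the trajectory to observations spread over a time window, which is why it demands so much more computer power.

```latex
% Standard 3D-Var cost function (textbook form): x_b is the background state,
% B and R are background- and observation-error covariances, H the observation operator.
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf{T}}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big)

% 4D-Var extends the observation term over an assimilation window, with the model
% M propagating the initial state x_0 to each observation time t_i:
J(\mathbf{x}_0) = \tfrac{1}{2}(\mathbf{x}_0-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
                + \tfrac{1}{2}\sum_{i=0}^{n}\big(\mathbf{y}_i-H_i(M_{0\to i}(\mathbf{x}_0))\big)^{\mathsf{T}}\mathbf{R}_i^{-1}\big(\mathbf{y}_i-H_i(M_{0\to i}(\mathbf{x}_0))\big)
```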

3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP's Environmental Modeling Center (EMC) is well known for its isolation and "not invented here" attitude. While the European Center has lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when their budget is under pressure, university research is the first thing they reduce. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for their needs as well. (Good news…a new building should be available soon.)

4. The NWS approach to weather-related research has been ineffective and divided. Government weather research is NOT housed in the NWS, but rather elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the people doing research in support of his mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has only marginal benefit for the NWS.

5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.

This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.

I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.

Let me note that the above is about the modeling aspects of the NWS, NOT the many people in the local forecast offices. This part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP that have lacked the resources required and suitable organization necessary to push forward effectively.

This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, gaining world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end…in weather prediction, education, and everything else… or we will see our nation sink into mediocrity.

The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.

*  *  *

Saturday, April 7, 2012

Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)

In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the UKMET office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we are not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.

In this blog, I will describe in some detail one major roadblock in giving the U.S. state-of-the-art weather prediction:  inadequate computer resources.   This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Dept of Commerce, but has not, and I am convinced will not without outside pressure.  It is time for the user community and our congressional representatives to intervene.  To quote Samuel L. Jackson, enough is enough. (…)

In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction's (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high resolution and ensemble forecasts, and seasonal climate forecasts.  UKMET office also does regional NWP (England is not a big country!) and regional air quality.  NCEP does all of this plus much, much more (high resolution rapid update modeling, hurricane modeling, etc.).   And NCEP has to deal with prediction over a continental-size country.

If you expected the U.S. to have a lot more computer power to balance all these responsibilities and tasks, you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4,992 processors (IBM Power6 processors). One computer does the operational work; the other is for backup (research and testing runs are done on the backup machine). That works out to about 70 teraflops (trillion floating-point operations per second) for each machine.

NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8,192 much faster processors that delivers 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).

The UKMET office, serving a far, far smaller country, has two newer IBM machines, each with 7680 processors for 175 teraflops per machine.

Here is a figure, produced at NCEP, that compares the relative computer power of NCEP's machine with the European Centre's. The shading indicates computational activity and the x-axis for each represents a 24-h period. The relative heights allow you to compare computer resources. Not only does the ECMWF have much more computer power, but they are more efficient in using it…packing useful computations into every available minute.

Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications were ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that perhaps this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET office systems…but that is not known at this point.

The Canadians?  They have TWO machines like the European Centre’s!

So what kind of system does NCEP require to serve the nation in a reasonable way?

To start, we need to double the resolution of our global model to bring it into line with ECMWF (they are now at 15 km globally). Such resolution allows the global model to simulate regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power. We need to use better physics (description of things like cloud processes and radiation). Double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model). Double once more. So we need 32 times more computer power for the high-resolution global runs to allow us to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in…that is what ECMWF does). There are some potential ways NCEP can work more efficiently as well. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest periods (180 hr plus) could be done twice a day. So let's begin with a computer 32 times faster than the current one.
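
To lay that arithmetic out explicitly: halving the grid spacing doubles the number of grid points in each horizontal direction and (because of the numerical stability constraint on the time step) roughly doubles the number of time steps, giving the factor of eight; the physics and data assimilation upgrades are each taken as a further doubling, per the estimates above. A back-of-the-envelope sketch using exactly those factors (mine, not an official NCEP costing):

```python
# Back-of-the-envelope cost scaling for the global model upgrade described above.
resolution_factor = 2                        # halve the grid spacing
horizontal_cost = resolution_factor ** 2     # 2x grid points in x and 2x in y
timestep_cost = resolution_factor            # ~2x more time steps (stability constraint)
physics_cost = 2                             # better cloud/radiation physics (estimate above)
assimilation_cost = 2                        # better data assimilation (estimate above)

total = horizontal_cost * timestep_cost * physics_cost * assimilation_cost
print(total)  # 32: the high-resolution global runs need a machine ~32x faster,
              # and the ensemble system needs roughly the same again
```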

Many workshops and meteorological meetings (such as one on improvements in model physics that was held at NCEP last summer—I was the chair) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution.  The current national ensemble system has a horizontal resolution about 32 km…and NWS plans to get to about 20 km in a few years…both are inadequate.   Here is an example of the ensemble output (mean of the ensemble members) for the NWS and UW (4km) ensemble systems:  the difference is huge–the NWS system does not even get close to modeling the impacts of the mountains.  It is similarly unable to simulate large convective systems.

Current NWS (NCEP) "high resolution" ensembles (32 km)
4 km ensemble mean from the UW system
Let me make one thing clear.  Probabilistic prediction based on ensemble forecasts and reforecasting (running models back for years to get statistics of performance) is the future of weather prediction.  The days of giving a single number for say temperature at day 5 are over.  We need to let people know about uncertainty and probabilities.  The NWS needs a massive increase of computer power to do this. It lacks this computer power now and does not seem destined to get it soon.
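
To make the contrast with single-number forecasts concrete, here is a toy example (invented numbers, not NWS output) of how an ensemble turns into a probability statement:

```python
# Ten hypothetical ensemble members for the day-5 high temperature (deg F).
day5_high_temps_f = [52, 55, 48, 60, 57, 49, 53, 61, 50, 54]

threshold = 55
prob_at_or_above = sum(t >= threshold for t in day5_high_temps_f) / len(day5_high_temps_f)

print(f"{prob_at_or_above:.0%} chance the day-5 high reaches {threshold}F or more")
# -> 40%, which tells a user far more than a single deterministic 54F number would.
```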

A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing.   He and colleagues have put together a compelling case for more NWS computer resources for NWP.  Read it here.

Back-of-the-envelope calculations indicate that a good first step–4 km national ensembles–would require about 20,000 processors to run in a timely manner–but it would revolutionize weather prediction in the U.S., including forecasting convection and in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.

And then there is running super-high resolution numerical weather prediction to get fine-scale details right.  Here in the NW my group runs a 1.3 km horizontal resolution forecast out twice a day for 48h.   Such capability is needed for the entire country.  It does not exist now due to inadequate computer resources.

The bottom line is that the NWS numerical modeling effort needs a huge increase of computer power to serve the needs of the country–and the potential impacts would be transformative.   We could go from having a third-place effort, which is slipping back into the pack, to a world leader.  Furthermore, the added computer power will finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems.  The fact that this is barely done today is really amazing and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.

But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that NWS's EMC should begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind my department has about 1,000 processors in our computational clusters, so this is not as large as you might think.

For a country with several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.

The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100K-core machine for 11 million dollars (and that is without any discount!). OK, this is the U.S. government and they like expensive, heavy-metal machines…let's go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors and the cost will be around 25-35 million dollars. NCEP will want two machines, so let's budget 60 million dollars. We spend this much money on a single jet fighter, but we can't invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, doing simulations of thermonuclear explosions, and simulating climate change.

Yes, a lot of money, but I suspect the cost of the machine would be paid back in a few months from improved forecasts. Last year we had quite a few (over ten) billion-dollar storms…imagine the benefits of forecasting even a few of them better. Or the benefits to the wind energy and utility industries, or U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that normal weather events cost the U.S. economy nearly half a trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.

As someone with an insider's view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local government, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction…which can clearly save lives and help the economy. Enough is enough.

Posted by Cliff Mass Weather Blog at 8:38 PM

Best Practices Are the Worst (Education Next)

SUMMER 2012 / VOL. 12, NO. 3 – http://educationnext.org/

As reviewed by Jay P. Greene

“Best practices” is the worst practice. The idea that we should examine successful organizations and then imitate what they do if we also want to be successful is something that first took hold in the business world but has now unfortunately spread to the field of education. If imitation were the path to excellence, art museums would be filled with paint-by-number works.

The fundamental flaw of a “best practices” approach, as any student in a half-decent research-design course would know, is that it suffers from what is called “selection on the dependent variable.” If you only look at successful organizations, then you have no variation in the dependent variable: they all have good outcomes. When you look at the things that successful organizations are doing, you have no idea whether each one of those things caused the good outcomes, had no effect on success, or was actually an impediment that held organizations back from being even more successful. An appropriate research design would have variation in the dependent variable; some have good outcomes and some have bad ones. To identify factors that contribute to good outcomes, you would, at a minimum, want to see those factors more likely to be present where there was success and less so where there was not.
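
The point is easy to demonstrate with a toy simulation (purely illustrative; the adoption and success rates below are invented): give organizations a "practice" that has no effect whatsoever on success, then look only at the successful ones. The practice will look widespread among the winners even though it explains nothing, and only comparing winners with losers reveals that.

```python
import random

# Toy demonstration of "selection on the dependent variable" (invented numbers).
random.seed(0)
N = 10_000

orgs = []
for _ in range(N):
    uses_practice = random.random() < 0.6   # 60% adopt the fad; it has no effect
    success = random.random() < 0.2         # success here is pure luck
    orgs.append((uses_practice, success))

winners = [o for o in orgs if o[1]]
losers = [o for o in orgs if not o[1]]

share_winners = sum(o[0] for o in winners) / len(winners)
share_losers = sum(o[0] for o in losers) / len(losers)

# Looking only at winners, the useless practice looks like a "best practice".
print(f"{share_winners:.0%} of successful orgs use the practice")   # ~60%
# Only variation in the dependent variable (comparing winners to losers)
# reveals that adoption is just as common among the unsuccessful.
print(f"{share_losers:.0%} of unsuccessful orgs use it too")        # ~60%
```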

“Best practices” lacks scientific credibility, but it has been a proven path to fame and fortune for pop-management gurus like Tom Peters, with In Search of Excellence, and Jim Collins, with Good to Great. The fact that many of the “best” companies they featured subsequently went belly-up—like Atari and Wang Computers, lauded by Peters, and Circuit City and Fannie Mae, by Collins—has done nothing to impede their high-fee lecture tours. Sometimes people just want to hear a confident person with shiny teeth tell them appealing stories about the secrets to success.

With Surpassing Shanghai, Marc Tucker hopes to join the ranks of the “best practices” gurus. He, along with a few of his colleagues at the National Center on Education and the Economy, has examined the education systems in some other countries with successful outcomes so that the U.S. can become similarly successful. Tucker coauthors the chapter on Japan, as well as an introductory and two concluding chapters. Tucker’s collaborators write chapters featuring Shanghai, Finland, Singapore, and Canada. Their approach to greatness in American education, as Linda Darling-Hammond phrases it in the foreword, is to ensure that “our strategies must emulate the best of what has been accomplished in public education both from here and abroad.”

But how do we know what those best practices are? The chapters on high-achieving countries describe some of what those countries are doing, but the characteristics they feature may have nothing to do with success or may even be a hindrance to greater success. Since the authors must pick and choose what characteristics they highlight, it is also quite possible that countries have successful education systems because of factors not mentioned at all. Since there is no scientific method to identifying the critical features of success in the best-practices approach, we simply have to trust the authority of the authors that they have correctly identified the relevant factors and have properly perceived the causal relationships.

But Surpassing Shanghai is even worse than the typical best-practices work, because Tucker’s concluding chapters, in which he summarizes the common best practices and draws policy recommendations, have almost no connection to the preceding chapters on each country. That is, the case studies of Shanghai, Finland, Japan, Singapore, and Canada attempt to identify the secrets to success in each country, a dubious-enough enterprise, and then Tucker promptly ignores all of the other chapters when making his general recommendations.

Tucker does claim to be drawing on the insights of his coauthors, but he never actually references the other chapters in detail. He never names his coauthors or specifically draws on them for his conclusions. In fact, much of what Tucker claims as common lessons of what his coauthors have observed from successful countries is contradicted in chapters that appear earlier in the book. And some of the common lessons they do identify, Tucker chooses to ignore.

For example, every country case study in Surpassing Shanghai, with the exception of the one on Japan coauthored by Marc Tucker, emphasizes the importance of decentralization in producing success. In Shanghai the local school system “received permission to create its own higher education entrance examination. This heralded a trend of exam decentralization, which was key to localized curricula.” The chapter on Finland describes the importance of the decision “to devolve increasing levels of authority and responsibility for education from the Ministry of Education to municipalities and schools…. [T]here were no central initiatives that the government was trying to push through the system.” Singapore is similarly described: “Moving away from the centralized top-down system of control, schools were organized into geographic clusters and given more autonomy…. It was felt that no single accountability model could fit all schools. Each school therefore set its own goals and annually assesses its progress toward meeting them…” And the chapter on Canada teaches us that “the most striking feature of the Canadian system is its decentralization.”

Tucker makes no mention of this common decentralization theme in his conclusions and recommendations. Instead, he claims the opposite as the common lesson of successful countries: “students must all meet a common basic education standard aligned to a national or provincial curriculum… Further, in these countries, the materials prepared by textbook publishers and the publishers of supplementary materials are aligned with the national curriculum framework.” And “every high-performing country…has a unit of government that is clearly in charge of elementary and secondary education…In such countries, the ministry has an obligation to concern itself with the design of the system as a whole…”

Conversely, Tucker emphasizes that “the dominant elements of the American education reform agenda” are noticeably absent from high-performing countries, including “the use of market mechanisms, such as charter schools and vouchers….” But if Tucker had read the chapter on Shanghai, he would have found a description of a system by which “students choose schools in other neighborhoods by paying a sponsorship fee. It is the Chinese version of school choice, a hot issue in the United States.” And although the chapter on Canada fails to make any mention of it, Canada has an extensive system of school choice, offering options that vary by language and religious denomination. According to recently published research by David Card, Martin Dooley, and Abigail Payne, competition among these options is a significant contributor to academic achievement in Canada.

There is a reason that promoters of best-practices approaches are called “gurus.” Their expertise must be derived from a mystical sphere, because it cannot be based on a scientific appraisal of the evidence. Marc Tucker makes no apology for his nonscientific approach. In fact, he denounces “the clinical research model used in medical research” when assessing education policies. The problem, he explains, is that no country would consent to “randomly assigning entire national populations to the education systems of another country or to certain features of the education system of another country.” On the contrary, countries, states, and localities can and do randomly assign “certain features of the education system,” and we have learned quite a lot from that scientific process. In the international arena, Tucker may want to familiarize himself with the excellent work being done by Michael Kremer and Karthik Muralidharan utilizing random assignment around the globe.

In addition, social scientists have developed practices to observe and control for differences in the absence of random assignment that have allowed extensive and productive analyses of the effectiveness of educational practices in different countries. In particular, the recent work of Ludger Woessmann, Martin West, and Eric Hanushek has utilized the PISA and TIMSS international test results that Tucker finds so valuable, but they have done so with the scientific methods that Tucker rejects. Even well-constructed case study research, like that done by Charles Glenn, can draw useful lessons across countries. The problem with the best-practices approach is not entirely that it depends on case studies, but that by avoiding variation in the dependent variable it prevents any scientific identification of causation.

Tucker’s hostility to scientific approaches is more understandable, given that his graduate training was in theater rather than a social science. Perhaps that is also why Tucker’s book reminds me so much of The Music Man. Tucker is like “Professor” Harold Hill come to town to sell us a bill of goods. His expertise is self-appointed, and his method, the equivalent of “the think system,” is obvious quackery. And the Gates Foundation, which has for some reason backed Tucker and his organization with millions of dollars, must be playing the residents of River City, because they have bought this pitch and are pouring their savings into a band that can never play music except in a fantasy finale.

Best practices really are the worst.

Jay P. Greene is professor of education reform at the University of Arkansas and a fellow at the George W. Bush Institute.

Surpassing Shanghai: An Agenda for American Education Built on the World’s Leading Systems
Edited by Marc Tucker
Harvard Education Press, 2011, $49.99; 288 pages.

Lead Dust Is Linked to Violence, Study Suggests (Science Daily)

ScienceDaily (Apr. 17, 2012) — Childhood exposure to lead dust has been linked to lasting physical and behavioral effects, and now lead dust from vehicles using leaded gasoline has been linked to instances of aggravated assault two decades after exposure, says Tulane toxicologist Howard W. Mielke.

Vehicles using leaded gasoline that contaminated cities’ air decades ago have increased aggravated assault in urban areas, researchers say.

The new findings are published in the journal Environment International by Mielke, a research professor in the Department of Pharmacology at the Tulane University School of Medicine, and demographer Sammy Zahran at the Center for Disaster and Risk Analysis at Colorado State University.

The researchers compared the amount of lead released in six cities: Atlanta, Chicago, Indianapolis, Minneapolis, New Orleans and San Diego, during the years 1950-1985. This period saw an increase in airborne lead dust exposure due to the use of leaded gasoline. There were correlating spikes in the rates of aggravated assault approximately two decades later, after the exposed children grew up.

After controlling for other possible causes such as community and household income, education, policing effort and incarceration rates, Mielke and Zahran found that for every one percent increase in tonnages of environmental lead released 22 years earlier, the present rate of aggravated assault was raised by 0.46 percent.
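
If the 0.46 figure is read as an elasticity (the natural reading of a "percent change per one percent change" estimate, presumably from a log-log regression; the exact functional form is not spelled out in the article), it implies, for example:

```latex
% Assuming assault rates scale as (lead tonnage)^{0.46}, a 10% rise in lead
% released 22 years earlier implies roughly
\frac{\Delta\,\text{assault rate}}{\text{assault rate}} \approx (1.10)^{0.46} - 1 \approx 0.045
% i.e., about a 4.5% higher aggravated assault rate two decades later.
```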

“Children are extremely sensitive to lead dust, and lead exposure has latent neuroanatomical effects that severely impact future societal behavior and welfare,” says Mielke. “Up to 90 per cent of the variation in aggravated assault across the cities is explained by the amount of lead dust released 22 years earlier.” Tons of lead dust were released between 1950 and 1985 in urban areas by vehicles using leaded gasoline, and improper handling of lead-based paint also has contributed to contamination.

Charting Hybridised Realities (Tactical Media Files)

Posted on April 15, 2012 by Eric Kluitenberg

This text was originally written for the Re-Public on-line journal, which focuses on innovative developments in contemporary political theory and practice, and is published from Greece. As the journal has ground to a (hopefully just temporary) halt under severe austerity pressures we decided to post the current first draft of the text on the Tactical Media Files blog. This posting is one of two, the second of which will follow shortly. Both texts build on my recent Network Notebook on the ‘Legacies of Tactical Media‘.

The second text is a collection of preliminary notes that expand on recent discussions following Marco Deseriis and Jodi Dean’s essay “A Movement Without Demands”. It is conceivable that both texts will merge into a more substantive essay in the future, but I haven’t made up my mind about that as yet.

Hope this will be of interest,
Eric

Charting Hybridised Realities

Tactical Cartographies for a densified present

In the midst of an enquiry into the legacies of Tactical Media – the fusion of art, politics, and media which had been recognised in the middle 1990s as a particularly productive mix for cultural, social and political activism [1] – the year 2011 unfolded. The enquiry had started as an extension of the work on the Tactical Media Files, an on-line documentation resource for tactical media practices worldwide [2], which grew out of the physical archives of the infamous Next 5 Minutes festival series on tactical media (1993 – 2003) housed at the International Institute of Social History in Amsterdam. After making much of tactical media's history accessible again on-line, our question, as editors of the resource, had been what the current significance of the term and the thinking and practices around it might be.

Prior to 2011 this was something emphatically under question. The Next 5 Minutes festival series had ended with the 2003 edition, following a year-long process that had started on September 11, 2002, convening local activist gatherings named Tactical Media Labs across six continents. [3] Two questions were at the heart of the fourth and last edition of the Next 5 Minutes: How has the field of media activism diversified since it was first named 'tactical media' in the middle 1990s? And what could be the significance and efficacy of tactical media's symbolic interventions in the midst of the semiotic corruption of the media landscape after the 9/11 terrorist attacks?

This ‘crash of symbols’ for obvious reasons took centre stage during this fourth and last edition of the festival. Naomi Klein had famously claimed in her speedy response to the horrific events of 9/11 that the activist lever of symbolic intervention had been contaminated and rendered useless in the face of the overpowering symbolic power of the terrorist attacks and their real-time mediation on a global scale. [4] The attacks left behind an “utterly transformed semiotic landscape” (Klein) in which the accustomed tactics of culture jammers had been ‘blown away’ by the symbolic power of the terrorist atrocities. Instead ‘we’ (Klein appealing to an imaginary community of social activists) should move from symbols to substance. What Klein overlooked in this response in ‘shock and awe’, however, was that while the semiotic landscape had indeed been dramatically transformed (and corrupted) in the wake of the 9/11 attacks, it still remained a semiotic landscape – symbols were still the only lever and entry point into the wider real-time mediated public domain.

Therefore, as unlikely as it may have seemed at the time, the question about the diversification of the terrain and the practices of media activism(s) was ultimately of far greater importance. What the 9/11 crash of symbols and the semiotic corruption debate contributed here was ‘merely’ an added layer of complexity. In a society permeated by media flows, social activism necessarily had to become media activism, and thus had to operate in a significantly more complex and contested environment. The diversification of the media and information landscape, however, also implied that a radical diversification of activist strategies was needed to address these increasingly hybridised conditions.

To name but a few of the emerging concerns: Witnessing of human rights abuses around the world, and creating public visibility and debate around them, remained a pivotal concern for many tactical media practitioners, as it had been right from the early days of camcorder activism. But now new concerns over privacy in networked media environments, coupled with security and secrecy regimes of information control, entered the scene. Critical media arts spread in different directions, claiming new terrains as diverse as the life sciences and bio-engineering, as well as 'contestational robotics', interventions into the space of computer games, and even on-line role playing environments. Meanwhile the free software movement made strides in developing more autonomous toolsets and infrastructures for a variety of social and cultural needs – adding a more strategic dimension to what had hitherto been a mostly interventionist practice. In a parallel movement, on-line discussion groups, mailing lists, and activity on various social media platforms started to coalesce slowly into what media theorist Geert Lovink has described as 'organised networks'. [5] And finally, the rapid development of wireless transmission technologies, smart phones and other wireless network clients introduced a paradoxical superimposition of mediated and embodied spatial logics, best captured in the multilayered concept of Hybrid Space. [6]

It was therefore entirely justified to ask how the term 'tactical media' could possibly bring together such a diversified, heterogeneous, and hybridised set of practices in a meaningful way. It had become clear that more sophisticated cartographies would be necessary to begin charting this intensely hybridised landscape.

A digital conversion of public space

If the events in 2011 have made one thing clear, it is that the ominous claim of Critical Art Ensemble that "the streets are dead capital" [7] has been declared null and void by an astounding resurgence of street protest, whatever their longer-term political significance and fallout might be. These protests, staged in the streets and squares and ranging from anti-austerity protests in Southern Europe to the various uprisings in Arab countries in North Africa and the Middle East to the Occupy protests in the US and Northern Europe, have by no means been staged in physical spaces out of a rejection of the semiotic corruption of the media space. Rather, the streets and squares have acted as a platform for the digital and networked multiplication of protest across a plethora of distribution channels, cutting right across the spectrum of alternative and mainstream, broadcast and networked media outlets.

What remained true to the origin of the term ‘tactical media’ was to build on Michel de Certeau’s insight that the ‘tactics of the weak’ operate on the terrain of strategic power through highly agile displacements and temporary interventions [8], creating a continuous nomadic movement, giving voice to the voiceless by means of ‘any media necessary’ (Critical Art Ensemble). However, the radical dispersal of wireless and mobile media technologies meant that mediated and embodied public spaces increasingly started to coincide, creating a new hybridised logic for social contestation. As witnessed in the remarkable series of public square occupations in 2011, through the digital conversion of public space the streets have become networks and the squares the medium for collective expression in a transnationally interconnected but still highly discontinuous media network.

Horizontal networks / lateral connections

One of the remarkable characteristics of the various protests is not simply the adoption of similar tactics (most notably occupations of public city squares), but the conscious interlinking of events as they unfold. Italian activists of the Unicommons movement physically linked up with revolting students in Tunisia, Egyptian bloggers and occupiers of Tahrir Square linked up with the 'take the square' activists in Spain, who in turn expressed solidarity and even co-initiated transnational actions with #occupy activists in the United States and elsewhere. It is the first time that the new organisational logic of transnational horizontal networks, theorised for instance in the seminal work "Territory, Authority, Rights" by sociologist Saskia Sassen, has become so evidently visible in activist practices across a set of radically dispersed geographic assemblages.

Horizontal networks by-pass traditional vertically integrated hierarchies of the local / national / international to create specific spatio-temporal transnational linkages around common interests, but also around affective ties. By and large these ties and linkages are still extra-institutional, largely informal and, because of their radically dispersed make-up and their 'affective' constitution, highly unstable. Political institutions have not even begun assembling an adequate response to these new emergent political constellations (other than traditional repressive instruments of strategic power, i.e. evictions, arrests, prohibitions). Given the structural inequalities that fuel the different strands of protest, the longer-term effectiveness of these measures remains highly uncertain. The institutional linkages at the moment seem mostly limited to anti-institutional contestation on the part of protestors and repressive gestures of strategic authority. The truly challenging proposition these new transnational linkages suggest, however, is their movement to bypass the nested hierarchies of vertically integrated power structures in a horizontal configuration of social organisation. They link up a bewildering array of local groups, sites, networks, geographies, and cultural contexts and sensitivities, taking seriously for the first time the networked space as a new 'frontier zone' (Sassen) where the new constellations of lateral transnational politics are going to be constructed.

Charting the layered densities of hybrid space

Hybrid Space is discontinuous. Its density is always variable, from place to place, from moment to moment. The presence of carrier signals can be interrupted or restored at any moment. Coverage is never guaranteed. The economics of the wireless network space is a matter of continuous contestation, and transmitters are always accompanied by their own forms of electromagnetic pollution (electrosmog). Charting and navigating this discontinuous and unstable space, certainly for social and political activists, is therefore always a challenge. Some prominent elements of this cartography are emerging more clearly, however:

– connectivity: presence or absence of the signal carrier wave is becoming an increasingly important factor in staging and mediating protest. Exclusive reliance on state and corporate controlled infrastructures thus becomes increasingly perilous.

– censorship: censorship these days comes in many guises. Besides the continued forms of overt repression (arrests, confiscations, closures) of media outlets, new forms include the excessive application of intellectual property rights regimes to weed out unwarranted voices from the media landscape, but also highly effective forms of disinformation and information overflow, something that has called the political efficacy of a project like WikiLeaks emphatically into question.

– circumvention: Great Information Fire Walls and information blockages are obvious forms of censorship, widely used during the Arab protests and common practice in China, now also spreading throughout the EU (under the guise of anti-piracy laws). These necessitate an ever more sophisticated understanding and deployment of internet censorship circumvention techniques, an understanding that should become common practice for contemporary activists. [9]

– attention economies: attention is a sought-after commodity in the informational society. It is also fleeting. (Media-) Activists need to become masters at seizing and displacing public attention. Agility and mobility are indispensable here.

– public imagination management: Strategic operators try to manage public opinion. Activists cannot rely on this strategy. They do not have the means to keep and maintain public opinion in favour of their temporary goals. Instead activists should focus on ‘public imagination management’ – the continuous remembrance that another world is possible.

Beyond semiotic corruption: A perverse subjectivity

The immersion in extended networks of affect that now permeate both embodied and mediated spaces introduces a new and inescapable corruption of subjectivity. Critical theory already taught us that we cannot trust subjectivity. However, the excessive self-mediation of protestors on the public square has shown that a deep desire for subjective articulation drives the manifestation in public. The dynamic is underscored further by the upload statistics of video platforms such as YouTube, which continue to outpace the possibility for the global population to actually see and witness these materials.

Rather than being dismissed, subjectivity should be embraced. This requires a new attitude 'beyond good and evil', beyond critique and submission. A new perverse subjectivity is able to straddle the seemingly impossible divide between willing submission to various forms of corporate, state and social coercion, and vital social and political critique and contestation. Its maxim: Relish your own commodification, embrace your perverse subjectivity, in order to escape the perversion of subjectivity.

Eric Kluitenberg
Amsterdam, April 15, 2012.

References:

1 – See: David Garcia & Geert Lovink, The ABC of Tactical Media, May 1997, a.o.:
www.tacticalmediafiles.net/article.jsp?objectnumber=37996

2 – www.tacticalmediafiles.net

3 – Documentation of the Tactical Media Labs events can be found at:
www.n5m4.org

4 – Naomi Klein – Signs of the Times, in The Nation, October 5, 2001.
Archived at: www.tacticalmediafiles.net/article.jsp?objectnumber=46632

5 – Geert Lovink and Ned Rossiter, Dawn of the Organised Networks, in: Fibreculture Journal, Issue 5, 2005.
http://five.fibreculturejournal.org/fcj-029-dawn-of-the-organised-networks/

6 – See my article The Network of Waves, and the theme issue Hybrid Space of Open – Journal for Art and the Public Domain, Amsterdam, 2006;
www.tacticalmediafiles.net/article.jsp?objectnumber=48405
(the complete issue is linked as pdf file to the article).

7 – Critical Art Ensemble, Digital Resistance, Autonomedia, New York, 2001.
www.critical-art.net/books/digital/

8 – Michel de Certeau, The Practice of Everyday Life, University of California Press, 1984.

9 – A useful manual can be found here: www.flossmanuals.net/bypassing-censorship/

The internet is increasingly political (Folha de S.Paulo)

JC e-mail 4464, March 27, 2012.

source: http://www.jornaldaciencia.org.br/Detalhe.jsp?id=81741

The lawyer Marcel Leonardi was one of the main contributors to the public consultation that drafted the Marco Civil da Internet, a bill proposed by the Ministry of Justice to lay down principles such as neutrality and privacy for the Brazilian internet. Some time later, Leonardi was invited to take up the post of director of public policy at Google in Brazil.

In other words, he is the person responsible for talking to the government, organizing the defense of users in cases such as the Escritório Central de Arrecadação e Distribuição (Ecad) trying to charge royalties on YouTube videos embedded in blogs, and bringing basic principles of the internet into the public sphere.

So much so that he shuttles back and forth to Brasília and takes part in public hearings to present Google's opinion – and his own – on bills under discussion that affect the way people use the internet, such as the Consumer Protection Code, the Copyright Law and the Marco Civil da Internet itself.

The lawyer also answers inquiries on Google's behalf. Recently, the Ministry of Justice demanded explanations about the changes to the privacy rules. The company, after all, is funded by advertising – and in this model users' personal data is very valuable. This is the point where the interests of the company and those of the users diverge. Leonardi says it is a matter of making users aware of the new rules.

Wearing a T-shirt and jeans, without the usual suit, Google's point man makes it clear: today companies also do politics. More and more so.

The Ministry of Justice questioned the changes to Google's privacy policy. What was your answer?

We are willing to work with the authorities. There is a lot of apprehension about what we do with respect to privacy, but little understanding. Google used to have separate policies for each product. But all of them, with two exceptions, already said that data from one service could be used in other services. So the unification did not change anything. The data we collect is the same. The exceptions were YouTube, which had its own policy, and search history, which today can expressly be used in other Google products.

Which is worrying.

We do not consider it frightening, because we give users the tools to control it. The user goes to the dashboard and chooses whether or not to keep their search history. They can disable it completely. It would be frightening if it happened without the user knowing what was going on. Every company in the sector adopts this model.

Personal data is valuable, and people have no idea what is done with the information.

The change was accompanied by the largest notification effort in Google's history. We announced it on January 24, and the rules only came into force on March 1. Throughout that period there was a notice on every page. The idea was to cut down on the legalese, because the internet industry has always been told that its policies and terms of use had to be clearer. We trimmed them radically, but then you run into this problem: at what point can you force someone to read? People always say they are concerned about privacy, but they act differently.

Google was recently found liable over a post on Orkut. Is holding companies responsible for user content a recurring issue?

It is an old debate. Worldwide there is the concept that the platform is not responsible. In the U.S. and in Europe the law says so expressly. Brazil does not yet have a specific law. One of the proposals is the Marco Civil da Internet, which says that liability only arises from failure to comply with a court order. In the absence of laws, the courts analyze it case by case. Google always appeals in order to show that, by logic and common sense, the platform bears no liability.

How does the content removal process work, for example for a blog post?

In copyright cases, Google receives a notification from someone who demonstrates that they hold the right in question and that the use was not authorized, and there is a check of whether or not it is an infringement. But there are certain requirements. Under American law, there are the requirements of the DMCA (Digital Millennium Copyright Act, the copyright law enacted in 1998). In Brazil, those of the copyright law.

Does Google itself do the checking?

There are internal teams that assess it. If there is an infringement, removal happens without judicial intervention, because that is in line with our policy of not allowing copyright violations.

Do you agree with the Ministry of Culture's proposal, in the new Copyright Law, to institutionalize a notification mechanism?

It is still controversial. They intended to include a mechanism that turns into law a practice many companies already adopt. The problem with that model is that it leaves room for a lot of abuse. We see a lot of that in the U.S. Everybody tries to frame their own situation as an infringement to justify a removal.

Why did you take a stand against Ecad charging royalties on YouTube videos?

We saw a distortion in Ecad's position. We thought it was extremely important to make public our view that we did not go along with it, that the interpretation of the law was wrong. The big problem is that new business models want to flourish, but they run up against an outdated interpretation of copyright law and that keeps them from growing. Spotify is an example. Someone pays 10 euros and has access to millions of songs. Often piracy is nothing more than pent-up demand that the market is not meeting.

Is the reform of the copyright law an advance?

It is an open question. My impression is that the intermediate version is a bit more open and friendly to these models. It had compulsory licensing, which was interesting, and language that would allow more flexible use.

Did you weigh in on that text?

We take part in the debates, but after the public consultation the process becomes closed. In Congress there is room to talk, which is important. In fact, if it weren't for the activists, a lot of internet regulation in Brazil would have turned out differently. All the opposition to the Azeredo bill, all the pressure for the Marco Civil, is the fruit of that engagement. In the U.S., the SOPA case was interesting. The fact that Wikipedia went offline frightened a lot of people. Only then did awareness of the law's risks take hold.

That U.S. law sparked a movement in defense of internet principles. Are companies taking on a political role?

There is no way for us not to think politically today. You cannot just stare at your own navel and think that as long as business is going well there is no need to talk, because there are larger issues at stake. Thinking politically means exactly that: all the companies in the sector tend to talk and to better understand how this works.

Is there a need for an updated cybercrime law?

There is a need for judges, and for those who work in criminal law, to understand the internet better, because most of what is in the law already works. We cannot run the risk of adopting a text so generic that you could be poking around on your phone, accidentally get into a system, and be told you have committed a crime.

Is Brazil still the leader in content removal requests?

Yes. Our transparency report lists every request from the government or the courts for content removal. Brazil leads in removals because here it is easy. You can go to a small claims court, at no cost and without a lawyer, and ask for an injunction to take a blog offline. On top of that, many people are used to the culture of "when in doubt, ask for it to be removed."

Which can amount to censorship.

Yes. We have already come across frightening cases. There is a growing number of companies that, when criticized by consumers, file suits to remove any negative reference.

(Folha de São Paulo)

FORGOTTEN STORIES OF BUENOS AIRES: A MAN CLAIMED TO HAVE INVENTED THE RAIN MACHINE

It happened on January 2, 1939, when an engineer named Juan Baigorri assured the director of the Meteorology office that he would make it rain over the city. And it rained.

Héctor Gambini. FROM THE CLARÍN NEWSROOM.

Monday, 17.06.2002

"In response to the censure of my procedure, I give Buenos Aires – through Crítica – the gift of a rainfall on January 2, 1939." The statement ran in the newspaper at the end of 1938 and was a public challenge to the director of the National Meteorology office, for whom its author was nothing more than a fraud: a provocative engineer who claimed to have invented the rain-making machine.

When January 1 arrived, the people of Buenos Aires had the challenge so much in mind that they clinked glasses in the early hours with their eyes fixed on the clear sky. The day was so hot and humid that even sitting under the vine arbor to watch the scrawny clouds drifting over Buenos Aires was a tiring pastime. But night came, and nothing.

On the morning of the 2nd, the city went back to work. And nothing. Not a trace of rain. But there was not even enough wind to stir a rose petal. And the sickly little white clouds of the previous afternoon were gaining body and color. First leaden gray, then shading toward black, more and more, until a whisper of a breeze appeared out of nowhere with a breath of suspended moisture. Droplets too light even to reach the ground. Then finer droplets behind them, already touching the asphalt. Then fat ones, plump as gnocchi, drawing patterns in the forming puddles. Right away, an electrical storm and a violent downpour: a cataract falling from the sky while Crítica stopped the presses to come out at noon with the main headline of its fifth edition, in catastrophe-size type, "As Baigorri forecast, today it rained," beneath a kicker summing up what had just happened in Buenos Aires: "Baigorri got three million people to turn their eyes to the sky."

This Baigorri had been born in Entre Ríos at the end of the previous century. The son of a military officer who was a friend of General Roca, he came to Buenos Aires to attend secondary school at the Colegio Nacional. After graduating he traveled to Italy to study geophysics and earned his engineering degree at the University of Milan.

In those years, the early 1930s, he began traveling the world under contract to various oil companies. He worked in several countries in Europe, Asia and Africa, and also in the United States, from where he returned under contract to YPF.

He settled in Caballito with his wife and son. Along with the family luggage he had a device with expandable antennas brought over from the airport, which he kept jealously guarded in a closet. "I have more or less adapted to Buenos Aires, but there is a lot of humidity," he complained.

One morning he made up his mind. He took some instruments and went around measuring the humidity in the city's neighborhoods. He stopped in front of a house at Araujo and Falcón, in Villa Luro. The needles told him it was the highest spot of all the ground he had covered. He bought that house, which had an attic perfect for a laboratory.

There the workings of the strange machine were gradually "developed": a device that, according to Baigorri, made the sky break into rain every time he switched it on. In his telling, it worked through an electromagnetic mechanism that concentrated clouds within the apparatus's area of influence.

It was 1938, and the newspapers were full of the recent suicides of Leopoldo Lugones and Alfonsina Storni, and of the fraud in the parliamentary elections that had President Roberto Ortiz on the verge of resignation. River was inaugurating the Monumental.

Baigorri wanted to prove he could control the rain and sought the sponsorship of the Ferrocarril Central Argentino. The English manager heard the proposal and smiled slyly. "And you could do it anywhere?" he asked, stumbling over his Spanish. Baigorri answered yes, and the Englishman challenged him, sarcastically: "Well then, make it rain in Santiago del Estero."

Off the engineer went, with his strange machine and an agronomist as companion, sent along to keep an eye on him. A few days later they returned and the expert certified that, on a ranch in a place called Estación Pinto, Baigorri had set to work and eight hours later it rained.

Su fama comenzó a crecer y llegó con él, en tren, a Buenos Aires. Hasta viajaron dos periodistas de The Times, de Londres, para entrevistarlo. En el otro rincón, el ingeniero Calmarini, director de Meteorología, salió a decir que todo era un invento infame o, a lo sumo, obra de la casualidad.

Aprovechando la polémica y con el tema instalado en la calle, Crítica fue a entrevistar a Baigorri. De allí salió el desafío para el 2 de enero. Ante el silencio de Meteorología, el ingeniero subió la apuesta: le mandó al funcionario nacional un paraguas de regalo . Junto al bulto, una tarjeta:“Para que lo use el 2 de enero”. Fue el día en que los porteños se desvelaron para mirar el cielo, esperando la lluvia.

Baigorri began traveling through the provinces, "making it rain" with his machine in different towns, with mixed luck.

In 1951 he served as an unpaid adviser to the Ministry of Technical Affairs. The following year he dusted off his old invention and traveled to La Pampa. He arrived, switched on the battery, and it began to rain, though by then people doubted his merits: "It was going to rain anyway," they said.

Baigorri withdrew into a long silence. By then a widower, he spent hours in the Villa Luro attic. Leonor, the woman who lives in that house today, told Clarín: "Every time it rained, people would surround the house and stare up at the attic." It was there that Baigorri refused to receive an emissary who claimed to come on behalf of an American businessman wanting to buy the formula. "My invention is Argentine and will be for the exclusive benefit of Argentines," he answered.

Old and alone, he sold the house and moved in with a French friend, who lent him a room in an apartment. He died in the autumn of 1972, exactly 30 years ago. He was 81 and had arrived at the hospital alone, with bronchial problems.

Nothing more was ever heard of the strange machine with the antennas. Nor whether Baigorri left a secret successor to switch it on as a tribute at his own funeral: as he was being buried, in the Chacarita cemetery, it started to rain.

MIT Predicts That World Economy Will Collapse By 2030 (POPSCI)

By Rebecca Boyle – Posted 04.05.2012 at 4:30 pm

Crowds and Haze in Shanghai Jeremy Vandel via Flickr

Forty years after its initial publication, a study called The Limits to Growth is looking depressingly prescient. Commissioned by an international think tank called the Club of Rome, the 1972 report found that if civilization continued on its path toward increasing consumption, the global economy would collapse by 2030. Population losses would ensue, and things would generally fall apart.

The study was — and remains — nothing if not controversial, with economists doubting its predictions and decrying the notion of imposing limits on economic growth. Australian researcher Graham Turner has examined its assumptions in great detail during the past several years, and apparently his latest research falls in line with the report’s predictions, according to Smithsonian Magazine. The world is on track for disaster, the magazine says.

The study, initially completed at MIT, relied on several computer models of economic trends and estimated that if things didn’t change much, and humans continued to consume natural resources apace, the world would run out of resources at some point. Oil would peak (some argue it already has) before sliding down the other side of the bell curve, while demand for food and services would only continue to rise. Turner says real-world data from 1970 to 2000 tracks with the study’s draconian predictions: “There is a very clear warning bell being rung here. We are not on a sustainable trajectory,” he tells Smithsonian.
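
To make the logic of such projections concrete, here is a deliberately simplified sketch in Python (not the World3 model the MIT team actually used, and with invented numbers) showing why a fixed resource stock drawn down at an exponentially growing consumption rate runs out on a timescale that is only weakly sensitive to the size of the stock.

```python
# Toy illustration only -- not the World3 model; stock, rate, and growth are hypothetical.
def years_until_depletion(stock, rate, growth):
    """Deplete a fixed resource stock at a consumption rate that grows by `growth` per year."""
    years = 0
    while stock > 0:
        stock -= rate          # consume this year's share
        rate *= (1 + growth)   # consumption keeps growing
        years += 1
    return years

print(years_until_depletion(1000, 10, 0.03))  # ~47 years
print(years_until_depletion(2000, 10, 0.03))  # ~66 years: doubling the stock adds only ~19 years
```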

Is this impossible to fix? No, according to both Turner and the original study. If governments enact stricter policies and technologies can be improved to reduce our environmental footprint, economic growth doesn’t have to become a market white dwarf, marching toward inevitable implosion. But just how to do that is another thing entirely.

[Smithsonian]

The Languages of Psychosis (Revista Fapesp)

A mathematical approach highlights the differences between the speech of people with mania and people with schizophrenia

CARLOS FIORAVANTI | Issue 194 – April 2012

How the study was done: interviewees recounted a dream, and the interviewer converted the most important words into points and the sentences into arrows in order to examine the structure of the language

For psychiatrists, and for most people, it is relatively easy to distinguish a person with psychosis from someone with no previously diagnosed mental disorder: those in the first group report delusions and hallucinations and sometimes present themselves as messiahs who will save the world. Distinguishing between the two types of psychosis, mania and schizophrenia, is not so simple, however, and demands a good deal of personal experience, knowledge, and intuition from specialists. A mathematical approach developed at the Brain Institute of the Federal University of Rio Grande do Norte (UFRN) may make that differentiation, which is fundamental for establishing the most appropriate treatment for each illness, easier, by quantitatively assessing the differences in the verbal language structures used by people with mania or schizophrenia.

The analysis strategy, based on graph theory (words represented as points, and the sequence between them in the sentences as arrows), indicated that people with mania are much more verbose and repetitive than those with schizophrenia, who are generally laconic and focused on a single subject, without letting their thoughts wander. "Recurrence is a hallmark of the speech of a patient with mania, who tells the same thing three or four times, whereas a patient with schizophrenia says objectively what he has to say, without digressing, and has speech that is poor in meaning," says psychiatrist Natália Mota, a researcher at the institute. "In each group," says Sidarta Ribeiro, the institute's director, "the number of words, the structure of the language, and other indicators are completely distinct."

They believe they have taken the first steps toward an objective way of differentiating the two forms of psychosis, much as a blood count is used to attest to an infectious disease, provided that the next tests, with a larger sample of participants, confirm the consistency of the approach and that physicians agree to work with an assistant of this kind. The comparative tests described in an article recently published in the journal PLoS One indicated that the new approach yields diagnostic accuracy of around 93%, whereas the psychometric scales currently in use, based on symptom-assessment questionnaires, reach only 67%. "They are complementary methods," says Natália. "Psychometric scales and the physicians' experience remain indispensable."

"The result is quite simple, even for those who do not understand mathematics," says physicist Mauro Copelli of the Federal University of Pernambuco (UFPE), who took part in the work. The speech of people with mania appears as a tangle of points and lines, while that of people with schizophrenia appears as a straight line with few points. Graph theory, which produced these diagrams, has been used for centuries to examine, for example, the routes by which a traveler could visit all the cities in a region. More recently it has been used to optimize air traffic, treating airports as a set of points, or nodes, connected to one another by the planes.

"The first time I ran the graph program, the differences in language leapt out," Natália recounts. In 2007, on finishing medical school and beginning her psychiatry residency at the UFRN hospital, Natália noticed that many differential diagnoses of mania and schizophrenia depended on physicians' personal experience and subjective judgments (those who worked more with schizophrenia patients tended to find more cases of schizophrenia and fewer of mania), and that there was often no consensus. It was already known that people with mania talk more and stray from the central topic far more easily than people with schizophrenia, but that seemed too generic to her.

At a scientific conference in Fortaleza in 2008 she spoke with Copelli, who was already collaborating with Ribeiro and encouraged her to work with graphs. At first she resisted, because of her limited familiarity with mathematics, but the new theory soon came to seem simple and practical.

To carry the work forward, she recorded and, with the help of Nathália Lemos and Ana Cardina Pieretti, transcribed interviews with 24 people (eight with mania, eight with schizophrenia, and eight with no diagnosed mental disorder), whom she asked to recount a dream; any comment outside that theme was considered a flight of imagination, quite common among people with mania.

"Already in the transcription, the accounts of the patients with mania were clearly longer than those of the patients with schizophrenia," she says. She then removed less important elements such as articles and prepositions, divided each sentence into subject, verb, and objects, represented as points or nodes, with the sequence between them in the sentence represented by arrows joining two nodes, and flagged the nodes that did not refer to the central theme of the account (the recent dream she had asked the interviewees to relate) and thus marked a deviation of thought, common among people with mania.

A graph program downloaded free from the internet indicated the characteristics relevant to the analysis (the attributes) and represented the main differences in speech between the participants, such as the number of nodes, the extent and density of the connections between points, recurrence, verbosity (or logorrhea), and deviation from the central topic. "It's super simple," Natália assures. For the validation and analysis of the results she also had the collaboration of Osame Kinouchi of the University of São Paulo (USP) in Ribeirão Preto and Guillermo Cecchi of IBM's Computational Biology Center in the United States.
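
As a rough illustration of the kind of analysis described above (a minimal sketch, not the group's actual program, which the article only says was downloaded free from the internet), a speech graph can be built in a few lines of Python with the networkx library: each word becomes a node, each transition between consecutive words becomes a directed edge, and attributes such as node count, edge density, and loops (recurrence) fall out directly. The sample transcript is invented.

```python
# Minimal sketch, assuming the networkx library; the transcript below is hypothetical.
import networkx as nx

def speech_graph(words):
    """Each distinct word becomes a node; each pair of consecutive words becomes a directed edge."""
    g = nx.DiGraph()
    for a, b in zip(words, words[1:]):
        g.add_edge(a, b)
    return g

# A made-up, already-simplified dream report (articles and prepositions removed)
transcript = "dreamed house old house had garden garden had flowers dreamed house".split()
g = speech_graph(transcript)

print("nodes:", g.number_of_nodes())             # vocabulary size
print("edges:", g.number_of_edges())             # distinct word-to-word transitions
print("density:", nx.density(g))                 # how interconnected the discourse is
print("loops:", len(list(nx.simple_cycles(g))))  # recurrence: paths that return to an earlier word
```

A verbose, repetitive account produces a dense, loop-rich graph; a terse one produces something close to a straight chain, which is the visual contrast the researchers describe.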

The result: people with mania scored higher than those with schizophrenia on nearly every item assessed. "The logorrhea typical of patients with mania does not result only from an excess of words, but from speech that keeps returning to the same topic, compared with the schizophrenia group," she observed. Curiously, the participants in the control group, with no diagnosed mental disorder, showed discourse structures of two kinds, sometimes redundant like the participants with mania, sometimes terse like those with schizophrenia, reflecting differences in their personalities or in their motivation, at that moment, to speak more or less. "That pathology shapes speech is nothing new," she says. "Psychiatrists are trained to recognize these differences, but they can hardly say that a mania patient's recurrence is 28% lower, no matter how experienced they are."

"The institute's interdisciplinary environment was essential to this study, because every day I was trading ideas with people from other fields. Nivaldo Vasconcelos, a computer engineer, helped me a lot," she says. The Brain Institute, in operation since 2007, currently has 13 professors, 22 undergraduate and 42 graduate students, 8 postdoctoral researchers, and 30 technicians. "Once the initial difficulties were overcome, we managed to form a group of young, talented researchers," Ribeiro celebrates. "The house we are in now has a large garden, and many nights we stay there until two or three in the morning, talking about science and drinking maté."

Scientific article
MOTA, N.B. et al. Speech graphs provide a quantitative measure of thought disorder in psychosis. PLoS ONE (in press).

New Understanding to Past Global Warming Events: Hyperthermal Events May Be Triggered by Warming (Science Daily)

These geological deposits make the Bighorn Basin area of Wyoming ideal for studying the PETM. (Credit: Aaron Diefendorf)

ScienceDaily (Apr. 2, 2012) — A series of global warming events called hyperthermals that occurred more than 50 million years ago had an origin similar to that of a much larger hyperthermal of the period, the Paleocene-Eocene Thermal Maximum (PETM), new research has found. The findings, published online in Nature Geoscience on April 1, 2012, represent a breakthrough in understanding the major “burp” of carbon, equivalent to burning the entire reservoir of fossil fuels on Earth, that occurred during the PETM.

“As geologists, it unnerves us that we don’t know where this huge amount of carbon released in the PETM comes from,” says Will Clyde, associate professor of Earth sciences at the University of New Hampshire and a co-author on the paper. “This is the first breakthrough we’ve had in a long time. It gives us a new understanding of the PETM.” The work confirms that the PETM was not a unique event – the result, perhaps, of a meteorite strike – but a natural part of Earth’s carbon cycle.

Working in the Bighorn Basin region of Wyoming, a 100-mile-wide area with a semi-arid climate and stratified rocks that make it ideal for studying the PETM, Clyde and lead author Hemmo Abels of Utrecht University in the Netherlands found the first evidence of the smaller hyperthermal events on land. Previously, the only evidence of such events was from marine records.

“By finding these smaller hyperthermal events in continental records, it secures their status as global events, not just an ocean process. It means they are atmospheric events,” Clyde says.

Their findings confirm that the carbon released during the PETM had the same kind of origin as the carbon released during the smaller hyperthermals of the era. In addition, the warming scales with the carbon release in a similar way for the PETM and the other hyperthermals, which the authors interpret as an indication of a similar mechanism of carbon release during all hyperthermals, including the PETM.

“It points toward the fact that we’re dealing with the same source of carbon,” Clyde says.

Working in two areas of the Bighorn Basin just east of Yellowstone National Park – Gilmore Hill and Upper Deer Creek – Clyde and Abels sampled rock and soil to measure carbon isotope records. They then compared these continental records of carbon release to equivalent marine records already in existence.

During the PETM, temperatures rose between five and seven degrees Celsius in approximately 10,000 years — “a geological instant,” Clyde calls it. This rise in temperature coincided exactly with a massive global change in mammals, as land bridges opened up connecting the continents. Prior to the PETM, North America had no primates, ancient horses, or split-hoofed mammals like deer or cows.

Scientists look to the PETM for clues about the current warming of Earth, although Clyde cautions that “Earth 50 million years ago was very different than it is today, so it’s not a perfect analog.” While scientists still don’t fully understand the causes of these hyperthermal events, “they seem to be triggered by warming,” Clyde says. It’s possible, he says, that less dramatic warming events destabilized these large amounts of carbon, releasing them into the atmosphere where they, in turn, warmed the Earth even more.

“This work indicates that there is some part of the carbon cycle that we don’t understand, and it could accentuate global warming,” Clyde says.

Why The Future Is Better Than You Think (Reason.com)

Sharif Christopher Matar | March 15, 2012

Can a Masai Warrior in Africa today communicate better than Ronald Reagan could? If he’s on a cell phone, Peter Diamandis says he can.

Peter Diamandis is the founder and chairman of the X Prize Foundation, which offers big cash prizes “to bring about radical breakthroughs for the benefit of humanity.” Reason’s Tim Cavanaugh sat down with Peter to talk about his new book Abundance and why he thinks we live in an “incredible time” even though no one realizes it. Peter thinks that powerful human forces, combined with technological advances, are transforming the world for the better.

“The challenge is that the rate of innovation is so fast…” Peter says, “the government can’t keep up with it.” If the government tries to play “catch up” with regulations and policy, the technology will just go overseas. Thanks to certain innovations, “food, water, housing, health, education is getting better and better.” Peter “hopes we are not going to be in a situation where entrenched interests are preventing the consumer from having better health care.”

Filmed by Sharif Matar and Tracy Oppenheimer. Edited by Sharif Matar

The Inside Story on Climate Scientists Under Siege (Wired/The Guardian)

By Suzanne Goldenberg, The Guardian
February 17, 2012 |

It is almost possible to dismiss Michael Mann’s account of a vast conspiracy by the fossil fuel industry to harass scientists and befuddle the public. His story of that campaign, and his own journey from naive computer geek to battle-hardened climate ninja, seems overwrought, maybe even paranoid.

But now comes the unauthorized release of documents showing how a libertarian thinktank, the Heartland Institute, which has in the past been supported by Exxon, spent millions on lavish conferences attacking scientists and concocting projects to counter science teaching for kindergarteners.

Mann’s story of what he calls the climate wars, the fight by powerful entrenched interests to undermine and twist the science meant to guide government policy, starts to seem pretty much on the money. He’s telling it in a book out on March 6, The Hockey Stick and the Climate Wars: Dispatches From the Front Lines.

“They see scientists like me who are trying to communicate the potential dangers of continued fossil fuel burning to the public as a threat. That means we are subject to attacks, some of them quite personal, some of them dishonest,” Mann said in an interview conducted in and around State College, home of Pennsylvania State University, where he is a professor.

It’s a brilliantly sunny day, and the light snowfall of the evening before is rapidly melting.

Mann, who seems fairly relaxed, has just spoken to a full-capacity, and uniformly respectful and supportive crowd at the university.

It’s hard to square the surroundings with the description in the book of how an entire academic discipline has been made to feel under siege, but Mann insists that it is a given.

“It is now part of the job description if you are going to be a scientist working in a socially relevant area like human-caused climate change,” he said.

He should know. For most of his professional life he has been at the center of those wars, thanks to a paper he published with colleagues in the late 1990s showing a sharp upward movement in global temperatures in the last half of the 20th century. The graph became known as the “hockey stick”.

If the graph was the stick, then its publication made Mann the puck. Though other prominent scientists, such as Nasa’s James Hansen and more recently Texas Tech University’s Katharine Hayhoe, have also been targeted by contrarian bloggers and thinktanks demanding their institutions turn over their email record, it’s Mann who’s been the favorite target.

He has been regularly vilified on Fox News and contrarian blogs, and by Republican members of Congress. The attorney general of Virginia has been fighting in the courts to get access to Mann’s email from his earlier work at the University of Virginia. And then there is the high volume of hate mail, the threats to him and his family.

“A day doesn’t go by when I don’t have to fend off some attack, some specious criticism or personal attack,” he said. “Literally a day doesn’t go by where I don’t have to deal with some of the nastiness that comes out of a campaign that tries to discredit me, and thereby in the view of our detractors to discredit the entire science of climate change.”

By now he and other climate scientists have been in the trenches longer than the U.S. army has been in Afghanistan.

And Mann has proved a willing combatant. He has not gone so far as Hansen, who has been arrested at the White House protesting against tar sands oil and in West Virginia protesting against coal mining. But he spends a significant part of his working life now blogging and tweeting in his efforts to engage with the public – and fending off attacks.

On the eve of his talk at Penn State, a coal industry lobby group calling itself the Common Sense Movement/Secure Energy for America put up a Facebook page demanding the university disinvite its own professor from speaking, and denouncing Mann as a “disgraced academic” pursuing a radical environmental agenda. The university refused. Common Sense has since apparently dismantled the Facebook page.

But Mann’s attackers were merely regrouping. A hostile blogger published a link to Mann’s Amazon page, and his opponents swung into action, denouncing the book as a “fairy tale” and climate change as “the greatest scam in human history.”

It was not the life Mann envisaged when he began work on his post-graduate degree at Yale. All Mann knew then was that he wanted to work on big problems that resonated outside academia. At heart, he said, he was like one of the amiable nerds on the television show The Big Bang Theory.

“At that time I wanted nothing more than just to bury my head in my computer and study data and write papers and write programs,” he said. “That is the way I was raised. That is the culture I came from.”

What happened instead was that the “hockey stick” graph, because it so clearly represented what had happened to the climate over the course of hundreds of years, itself became a proxy in the climate wars. (Mann’s reconstruction of temperatures over the last millennium itself used proxy records from tree rings and coral).

“I think because the hockey stick became an icon, it’s been subject to the fiercest of attacks really in the whole science of climate change,” he said.

The U.N.’s Intergovernmental Panel on Climate Change produced a poster-sized graph for the launch of its climate change report in 2001.

Those opposed to action on climate change began accusing Mann of overlooking important data or even manipulating the records. None of the allegations were ever found to have substance. The hockey stick would eventually be confirmed by more than 10 other studies.

Mann, like other scientists, was just not equipped to deal with the media barrage. “It took the scientific community some time I think to realize that the scientific community is in a street fight with climate change deniers and they are not playing by the rules of engagement of science. The scientific community needed some time to wake up to that.”

By 2005, when Hurricane Katrina drew Americans’ attention to the connection between climate change and coastal flooding, scientists were getting better at making their case to the public. George Bush, whose White House in 2003 deleted Mann’s hockey stick graph from an environmental report, began talking about the need for biofuels. Then Barack Obama was elected on a promise to save a planet in peril.

But as Mann lays out in the book, the campaign to discredit climate change continued to operate, largely below the radar until November 2009 when a huge cache of email from the University of East Anglia’s Climatic Research Unit was released online without authorization.

Right-wing media and bloggers used the emails to discredit an entire body of climate science. They got an extra boost when an embarrassing error about melting of Himalayan glaciers appeared in the U.N.’s IPCC report.

Mann now admits the climate community took far too long to realize the extent of the public relations debacle. Aside from the glacier error, the science remained sound. But Mann said now: “There may have been an undue amount of complacency among many in the scientific community.”

Mann, who had been at the center of so many debates in America, was at the heart of the East Anglia emails battle too.

Though he has been cleared of any wrongdoing, Mann does not always come off well in those highly selective exchanges of email released by the hackers. In some of the correspondence with fellow scientists, he is abrupt, dismissive of some critics. In our time at State College, he mentions more than once how climate scientists are a “cantankerous” bunch. He has zero patience, for example, for the polite label “climate skeptic” for the network of bloggers and talking heads who try to discredit climate change.

“When it comes to climate change, true skepticism is two-sided. One-sided skepticism is no skepticism at all,” he said. “I will call people who deny the science deniers … I guess I won’t be deterred by the fact that they don’t like the use of that term and no doubt that just endears me to them further.”

“It’s frustrating of course because a lot of us would like to get past this nonsensical debate and on to the real debate to be had about what to do,” he said.

But he said there are compensations in the support he gets from the public. He moves over to his computer to show off a web page: I ❤ climate scientists. He’s one of three featured scientists. “It only takes one thoughtful email of support to offset a thousand thoughtless attacks,” Mann said.

And although there are bad days, he still seems to believe he is on the winning side.

Across America, this is the third successive year of weird weather. The U.S. department of agriculture has just revised its plant hardiness map, reflecting warming trends. That is going to reinforce scientists’ efforts to cut through the disinformation campaign, Mann said.

“I think increasingly the campaign to deny the reality of climate change is going to come up against that brick wall of the evidence being so plain to people whether they are hunters, fishermen, gardeners,” he said.

And if that doesn’t work then Mann is going to fight to convince them.

“Whether I like it or not I am out there on the battlefield,” he said. But he believes the experiences of the last decade have made him, and other scientists, far better fighters.

“Those of us who have had to go through this are battle-hardened and hopefully the better for it,” he said. “I think you are now going to see the scientific community almost uniformly fighting back against this assault on science. I don’t know what’s going to happen in the future, but I do know that my fellow scientists and I are very ready to engage in this battle.”

Video: James West, The Climate Desk

Original story at The Guardian.

Newly Discovered Space Rock Is Headed Toward Earth, Estimated Time of Arrival 2040 (POPSCI.com)

The UN is figuring out how to ward off a potential collision

By Clay Dillow
Posted 02.27.2012 at 1:34 pm

Earth, and the Near-Earth Objects that Threaten It ESA – P.Carril

All eyes are on the asteroid Apophis, but a new threat–just 460 feet wide–dominated the conversation at a recent meeting of the UN Action Team on near-Earth objects (NEOs). Known as 2011 AG5, the asteroid could well be on a collision course with Earth in 2040, and some are already calling on scientists to figure out how to deflect it.

Discovered early last year, 2011 AG5 is still somewhat of a mystery to astronomers: they have a pretty good idea how big it is but have only been able to observe it for roughly half an orbit. That makes it difficult to project the object’s path over time–and to verify whether it may be a threat in 2040. Ideally, researchers would like to observe at least two full orbits before making projections about an NEO’s path, but that hasn’t stopped several in the astronomy community from fixing odds on an impact in 2040.

Specifically, those odds are currently 1 in 625 for an impact on Feb. 5, 2040. But like most odds, these are fluid. From 2013 to 2016, the asteroid will be observable from the ground, and that will give NEO watchers a better idea of its orbit and future trajectory. If those observations don’t vastly diminish the odds of an impact, there should still be time to do something about it before its 2023 keyhole pass. Like Apophis, which may or may not impact Earth in 2036, 2011 AG5 has a keyhole–a region in space near Earth through which it would travel if indeed it is going to impact us on its next pass. It will make its keyhole pass on its approach near Earth in February 2023, when it comes within just 0.02 astronomical units of Earth (that’s roughly 1.86 million miles). NASA’s Jet Propulsion Lab estimates 2011 AG5’s keyhole is about 62 miles wide–not big at all by astronomical standards, but bigger than Apophis’s.
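
Since the article quotes both the 2023 approach distance in astronomical units and its equivalent in miles, a quick back-of-the-envelope check (a hypothetical helper, not NASA/JPL code, using an approximate AU-to-mile constant) confirms the conversion and shows how narrow the keyhole is relative to the flyby distance.

```python
# Rough unit-conversion check of the figures quoted above; constants are approximate.
AU_IN_MILES = 92_955_807          # one astronomical unit, in miles (approx.)

closest_approach_au = 0.02        # 2011 AG5's February 2023 approach distance
keyhole_width_miles = 62          # JPL's estimated keyhole width

approach_miles = closest_approach_au * AU_IN_MILES
print(f"2023 approach distance: {approach_miles / 1e6:.2f} million miles")               # ~1.86 million
print(f"keyhole width / approach distance: {keyhole_width_miles / approach_miles:.1e}")  # ~3e-05
```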

If 2011 AG5 does look like it is going to pass through that keyhole after the 2013-2016 observations, scientists will have a few years to figure out how to alter its orbit and push it outside of the keyhole in 2023, thus averting disaster 17 years later. Such a deflection mission could be good practice. Apophis will make a run at its keyhole in 2029.

 

The Sick Planet (culturaebarbarie.org)

by Guy Debord

"Pollution" is in fashion today in exactly the same way as revolution: it takes hold of the whole life of society and is represented illusorily in the spectacle. It is tedious chatter in a plethora of erroneous and mystifying writings and speeches and, in fact, it has everyone by the throat. It displays itself everywhere as ideology and gains ground everywhere as a real process. These two antagonistic movements, the supreme stage of commodity production and the project of its total negation, equally rich in contradictions within themselves, grow together. They are the two sides through which a single historical moment manifests itself, long awaited and often foreseen under inadequate partial figures: the impossibility of capitalism continuing to function.

The era that has all the technical means to alter the conditions of life on Earth is equally the era that, through the same separate technical and scientific development, possesses all the means of control and of mathematically indubitable forecasting to measure in advance, and with precision, where the automatic growth of the alienated productive forces of class society is leading, and by what date: that is, to measure the rapid degradation of the very conditions of survival, in the most general and most trivial sense of the word.

While backward-looking imbeciles still hold forth on, and against, an aesthetic critique of all this, and believe they prove themselves lucid and modern, wedded to their century, by proclaiming that the motorway or Sarcelles has a beauty one ought to prefer to the discomfort of the "picturesque" old neighborhoods, or by gravely pointing out that the population as a whole eats better despite nostalgia for good cooking, the problem of the degradation of the totality of the natural and human environment has already ceased entirely to be posed in terms of a supposed old-fashioned quality, aesthetic or otherwise, and has become radically the very problem of the material possibility of existence of a world that pursues such a movement. That impossibility is in fact already perfectly demonstrated by the whole of separate scientific knowledge, which now debates only its expiry date, and the palliatives that, if applied firmly, could regulate it superficially. Such a science can only accompany toward destruction the world that produced it and sustains it; but it is obliged to do so with its eyes open. It thus shows, at a caricatural level, the uselessness of knowledge without use.

We can measure and extrapolate with excellent precision the rapid increase in the chemical pollution of the breathable atmosphere, of the water of rivers, lakes, and even the oceans; the irreversible increase of radioactivity accumulated by the peaceful development of nuclear energy; the effects of noise; the invasion of space by plastic products that may demand an eternity of universal dumping; the runaway birth rate; the senseless adulteration of food; the urbanistic leprosy spreading ever further over what were once city and countryside; as well as mental illness (including the neurotic phobias and hallucinations that will not fail to multiply soon enough around the theme of pollution itself, whose alarming image is displayed everywhere) and suicide, whose rates of expansion already intersect exactly with those of the construction of such an environment (not to mention the effects of atomic or bacteriological war, whose means hang in place like the sword of Damocles, yet remain, obviously, avoidable).

Thus, while the scale and the very reality of the "terrors of the Year One Thousand" are still a matter of controversy among historians, the terror of the Year Two Thousand is as patent as it is well founded; it is already, from the present moment, a scientific certainty. Yet what is happening is in itself nothing new: it is only the necessary end of the old process. A society ever more sick, but ever more powerful, has everywhere concretely recreated the world as the environment and backdrop of its sickness: a sick planet. A society that has not yet become homogeneous and that is no longer determined by itself, but ever more by a part of itself standing above it, has developed a movement of domination of nature that has not, however, dominated itself. Capitalism has finally furnished the proof, by its own movement, that it can no longer develop the productive forces; and this not quantitatively, as many believed they understood, but qualitatively.

For bourgeois thought, however, methodologically, only the quantitative is serious, measurable, effective; the qualitative is merely the uncertain subjective or artistic decoration of the truly real, weighed at its true weight. For dialectical thought, on the contrary, and therefore for history and for the proletariat, the qualitative is the most decisive dimension of real development. This is what capitalism, and we ourselves, have ended up demonstrating.

The masters of society are now obliged to speak of pollution, both in order to combat it (since they live, after all, on the same planet as we do; this is the only sense in which one can admit that the development of capitalism has actually brought about a certain fusion of the classes) and in order to conceal it, for the simple truth of the present damage and dangers is enough to constitute an immense factor of revolt, a materialist demand of the exploited, just as vital as the struggle of the nineteenth-century proletarians for the possibility of eating. After the fundamental failure of all the reformisms of the past (all of which aspired to a definitive solution of the problem of classes), a new reformism is taking shape, obeying the same needs as its predecessors: to lubricate the machine and to open new profit opportunities for the leading firms. The most modern sector of industry throws itself into the various palliatives for pollution as into a new market niche, all the more profitable in that a good share of the capital monopolized by the State is there to be employed and maneuvered. But if this new reformism is guaranteed in advance to fail, for exactly the same reasons as the old reformisms, it differs radically from them in that it has no time left before it.

Up to now the development of production has proved itself entirely as the fulfillment of political economy: the development of poverty, which has invaded and spoiled the very milieu of life. The society in which the producers kill themselves with work, and can only contemplate its result, lets them clearly see, and breathe, the general result of alienated labor as a result of death. In the society of the overdeveloped economy, everything has entered the sphere of economic goods, even spring water and the air of the cities; which is to say that everything has become the economic evil, the "complete negation of man," which now reaches its perfect material conclusion. The conflict between the modern productive forces and the relations of production, bourgeois or bureaucratic, of capitalist society has entered its final phase. The production of non-life has pursued its linear and cumulative process ever further; having now crossed a final threshold in its progress, it directly produces death.

The ultimate, avowed, essential function of today's developed economy, throughout a world ruled by commodity-labor, which secures all power for its masters, is the production of jobs. We are very far from the "progressive" ideas of the previous century [the nineteenth] about the possible reduction of human labor through the scientific and technical multiplication of productivity, which was supposed to secure, ever more easily, the satisfaction of needs previously acknowledged by all as real, without any fundamental alteration in the very quality of the goods that would be available. It is now in order to produce jobs, even in a countryside emptied of peasants (that is, in order to use human labor as alienated labor, as wage labor), that everything else is done; and therefore that the foundations of the life of the species, today more fragile even than the thought of a Kennedy or a Brezhnev, are stupidly threatened.

The old ocean is in itself indifferent to pollution; but history is not. History can be saved only by the abolition of commodity-labor. And never has historical consciousness so needed to master its world with such urgency, for the enemy at its door is no longer illusion but its own death.

When the poor masters of a society whose deplorable conclusion we are now witnessing, far worse than all the condemnations once hurled by the most radical of the utopians, must now acknowledge that our environment has become social, that the management of everything has become a directly political matter, down to the grass in the fields and the possibility of drinking, down to the possibility of sleeping without too many sleeping pills or of bathing without suffering allergies, at such a moment it must also be seen that the old specialized politics is obliged to acknowledge that it is completely finished.

It is finished in the supreme form of its voluntarism: the totalitarian bureaucratic power of the so-called socialist regimes, because the bureaucrats in power have shown themselves incapable of managing even the earlier stage of the capitalist economy. If they pollute far less (the United States alone produces 50% of the world's pollution), it is because they are far poorer. They can only, like China for example, by pooling a disproportionate share of their accounting of poverty, buy the prestige pollution available to poor powers: a few discoveries and refinements in the techniques of thermonuclear war, or more exactly of the threatening spectacle. So much poverty, material and mental, sustained by so much terrorism, condemns the bureaucracies in power. And what condemns the most modernized bourgeois power is the unbearable result of so much effectively poisoned wealth. The so-called democratic management of capitalism, in whatever country, offers only its elections-cum-dismissals which, as has always been seen, never changed anything in the whole, and very little even in the details, in a class society that imagined it could last indefinitely. They change nothing much even now, at the moment when management itself is going mad and pretends to want, in order to settle certain secondary though urgent problems, a few vague directives from an alienated and cretinized electorate (U.S.A., Italy, England, France). All the specialized observers have always pointed out, without troubling to explain it, the fact that the voter never changes his "opinion": this is precisely because he is a voter, someone who takes on, for a brief instant, the abstract role designed precisely to prevent him from being himself and from changing (the mechanism has been demonstrated hundreds of times, by demystified political analysis as much as by the explanations of revolutionary psychoanalysis). The voter does not change even as the world changes ever more precipitously around him; as a voter, he would not change even on the eve of the end of the world. Every representative system is essentially conservative, even though the conditions of existence of capitalist society have never been able to be conserved: they modify themselves without interruption, and ever faster, but the decision (which in the end is always the decision to let the process of capitalist production run its course) is left entirely to the specialists of publicity, whether they stand alone in the race or compete with others who will do the same thing, and indeed say so openly. Nevertheless, the man who votes "freely" for the Gaullists or for the P.C.F., just like the man who votes, under constraint and duress, for a Gomulka, is capable of showing what he really is the following week, by taking part in a wildcat strike or an insurrection.

The self-proclaimed "struggle against pollution," in its statist and legalistic aspect, will at first create new specializations, ministerial departments, posts, bureaucratic promotion. And its effectiveness will be entirely in keeping with such means. It can become a real will only by transforming the present productive system at its very roots. And it can be applied firmly only from the moment when all its decisions, taken democratically and in full knowledge of the facts by the producers, are at every moment controlled and executed by the producers themselves (ships, for example, will inevitably go on dumping their oil into the sea as long as they are not under the authority of genuine sailors' soviets). To decide and to carry all this out, the producers must become adults: they must all seize power.

The scientific optimism of the nineteenth century collapsed on three essential points. First, the claim to guarantee revolution as the happy resolution of existing conflicts (this was the Hegelian-leftist and Marxist illusion; the least noticed in the bourgeois intelligentsia, but the richest and, in the end, the least illusory). Second, the coherent vision of the universe, or even simply of matter. Third, the euphoric, linear feeling about the development of the productive forces. If we master the first point, we will have resolved the third; and much later we will know how to make the second our occupation and our game. It is not the symptoms that must be treated, but the disease itself. Today fear is everywhere; we will escape it only by trusting in our own strength, in our capacity to destroy every existing alienation and every image of the power that has escaped us: by handing everything over, except ourselves, to the sole power of the Workers' Councils, possessing and reconstructing the totality of the world at every moment, that is, to true rationality, to a new legitimacy.

In the matter of the "natural" and built environment, of the birth rate, of biology, of production, of "madness," and so on, the choice will not be between festivity and unhappiness but, consciously and at every crossroads, between, on the one hand, a thousand happy or disastrous possibilities, relatively correctable, and, on the other hand, nothingness. The terrible choices of the near future leave only this alternative: total democracy or total bureaucracy. Those who doubt total democracy must make the effort to prove it for themselves, giving it the chance to prove itself in motion; or else nothing remains for them but to buy their tomb in installments, for "authority, we have seen it at work, and its works condemn it" (Jacques Déjacque).

"Revolution or death": this slogan is no longer the lyrical expression of rebellious consciousness; it is the last word of the scientific thought of our century [the twentieth]. It applies to the dangers facing the species as much as to the impossibility, for individuals, of giving their assent. In this society where, as we know, suicide keeps advancing, the specialists had to admit, with a certain vexation, that it fell to almost nothing in May 1968. That spring also won itself, without exactly taking it by storm, a fine sky, because a few cars were burned and all the others lacked the fuel to pollute. When it rains, when there are clouds over Paris, never forget that it is the government's fault. Alienated industrial production makes the rain. Revolution makes the fine weather.

Written in 1971 by Guy Debord, intended for issue no. 13 of the journal Internationale Situationniste, this article remained unpublished until recently, when it appeared, together with two other texts by the same author, in La Planète malade (Paris, Gallimard, 2004, pp. 77-94). The Portuguese translation of "O planeta doente" reproduced here first appeared at http://juralibertaire.over-blog.com/article-13908597.html. Translation by Emiliano Aquino (http://emilianoaquino.blogspot.com/).

Source: http://culturaebarbarie.org/sopro/arquivo/planetadoente.html

19 Climate Games that Could Change the Future (Climate Interactive Blog)

March 9, 2012 – 10:13 a.m.

The prevalence of games in our culture provides an opportunity to increase understanding of our global challenges. In 2008 the Pew Research Center estimated that over half of American adults played video games and that 80% of young Americans did. The vast majority of these games serve purely to entertain, but a growing number aim to make a difference. They range from games that show players the complexity of assembling adequate aid packages and delivering them to places in need, to games that require people to get out and work to improve their communities in order to do well in the game.

Looking at the climate change challenge, there are a number of games and interactive tools that broaden our understanding of the dynamics involved. Climate Interactive, for one, has led the development of the role-playing game World Climate, which simulates the UN climate change negotiations and is being adopted from middle school all the way up to executive management-level classrooms. Many are recognizing the power of games, and everyone from government agencies to NGOs to a group of teenagers is trying to launch a game to help address climate change. Below are some of the climate and sustainability-related games we’ve found. Let us know if you’ve found others.

Computer Games:

1. Climate Challenge: The player acts as a European leader who must make decisions for their nation to reduce CO2 emissions, but must also keep in mind public and international approval, energy, food, and financial needs.

2. Fate of the World: A PC game that challenges players to solve the crises facing the Earth from natural disasters and climate change to political uprisings and international relations.

3. CEO2: A game that puts players at the head of a company in one of four industries. The player must then make decisions to reduce CO2 emissions while maintaining (and increasing) the company’s value.

4. VGas: Users build a house and select the best furnishing and lifestyle choices to have the lowest carbon footprint.

5. CO2FX: A multi-player educational game, designed for students in high school, which explores the relationship of climate change to economic, political, and science policy decisions.

6. “Operation: Climate Control” Game: A multi-player computer game where the player’s role is to decide on local environmental policy for Europe through the 21st century.

7. My2050: An interactive game to work out a scenario for the UK to lower its CO2 emissions to 80% below 1990 levels by 2050. The user can select from adjustments in sectors from energy to transit.

8. Plan it Green: Gamers act as the planners of a city to revitalize it to become a greener town through energy retrofits, clean energy jobs, and green building.

9. Logicity: A game that challenges players to reduce their carbon footprints by making decisions in a virtual city.

10. Electrocity: A game designed for school children in New Zealand to plan a city that balances the needs of energy, development, and the environment.

11. Climate Culture: A virtual social networking game based on players’ actual carbon footprints and lifestyle choices. Players compete to earn badges and awards for their decisions.

12. World Without Oil: An alternate reality game that was played out on blogs and other social media platforms for 32 weeks in 2007 by thousands of players to simulate what might happen if there was an oil crisis and oil became inaccessible. Participants wrote blogs and made videos about their experience as if it was real.

13. SimCity 5 (coming 2013): With over 20 years of experience and millions of players the SimCity series has captured imaginations by putting players in control of developing cities. Recently announced, SimCity 5 will add among other things the need to face sustainability challenges like climate change, limited natural resources, and urban walkability.

Role-playing Games:

14. World Climate Exercise: A role-playing game for groups that simulates the UN climate change negotiations, dividing the group into regional and national negotiating teams that must negotiate a treaty holding warming to 2 degrees or less.

15. “Stabilization Wedge” Game: A game to show participants the different ways to cut carbon emissions, through the concept of wedges.

Board Games:

16. Climate Catan: Building on the widely popular board game Settlers of Catan, this version adds oil as a resource that spurs development; but if too much of it is used, it triggers a climate-related disaster that can ruin that development.

17. Climate-Poker: A card game in which the aim is to assemble the largest climate conference in order to address climate change.

18. Keep Cool – Gambling with the Climate: Players take on the roles of national political leaders trying to address climate change; they must make decisions about the type of growth they pursue while balancing the demands of lobby groups and the challenges of natural disasters.

19. Polar Eclipse Game: A game in which players navigate different decisions in order to chart a path to a future that avoids the worst temperature rise.

Lessons from Gaming for Climate Wonks and Leaders — Video

Games can help us ensure that climate and energy analysis gets used to make a difference. Last week at the Climate Prediction Applications Science Workshop in Miami, Climate Interactive co-director Drew Jones gave a keynote presentation to an audience of climate analysts, many of whom are working to communicate the massive amount of climate data to the public.

In his speech below, Drew draws out the key things we are learning from games like Angry Birds, Farmville, and World of Warcraft, and from existing efforts to integrate climate change into games. Also included in the presentation, but left out of the video, was a condensed version of the World Climate Exercise, a game Climate Interactive has developed to help people explore the complex dynamics encountered at the international climate change negotiations.