Tag archive: science

Despite Awareness of Global Warming, Americans More Concerned About Local Environment (Science Daily)

ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.

Potomac River near Washington, D.C. The top three issues that the U.S. public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and reducing urban air pollution such as smog. (Credit: Michele Hogan)

“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”

Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.

A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and reducing urban air pollution such as smog. In the survey, global warming ranks eighth in importance.

“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.

Additionally, Konisky and his colleagues found that the best predictors of individuals’ environmental preferences are their political attributes. They examined the relationship between party identification, political ideology, and support for action to address environmental problems.

“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”

Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.

Support for Climate Change Action Drops, Poll Finds (Science Daily)

ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.

The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.

In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.

The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.

Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”

The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.

The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.

Jews Are a ‘Race,’ Genes Reveal (The Jewish Daily Forward)

By Jon Entine

Published May 04, 2012, issue of May 11, 2012.

In his new book, “Legacy: A Genetic History of the Jewish People,” Harry Ostrer, a medical geneticist and professor at Albert Einstein College of Medicine in New York, claims that Jews are different, and the differences are not just skin deep. Jews exhibit, he writes, a distinctive genetic signature. Considering that the Nazis tried to exterminate Jews based on their supposed racial distinctiveness, such a conclusion might be a cause for concern. But Ostrer sees it as central to Jewish identity.

“Who is a Jew?” has been a poignant question for Jews throughout our history. It evokes a complex tapestry of Jewish identity made up of different strains of religious beliefs, cultural practices and blood ties to ancient Palestine and modern Israel. But the question, with its echoes of genetic determinism, also has a dark side.

Geneticists have long been aware that certain diseases, from breast cancer to Tay-Sachs, disproportionately affect Jews. Ostrer, who is also director of genetic and genomic testing at Montefiore Medical Center, goes further, maintaining that Jews are a homogeneous group with all the scientific trappings of what we used to call a “race.”

For most of the 3,000-year history of the Jewish people, the notion of what came to be known as “Jewish exceptionalism” was hardly controversial. Because of our history of inmarriage and cultural isolation, imposed or self-selected, Jews were considered by gentiles (and usually referred to themselves) as a “race.” Scholars from Josephus to Disraeli proudly proclaimed their membership in “the tribe.”


Legacy: A Genetic History of the Jewish People
By Harry Ostrer
Oxford University Press, 288 Pages, $24.95

Ostrer explains how this concept took on special meaning in the 20th century, as genetics emerged as a viable scientific enterprise. Jewish distinctiveness might actually be measurable empirically. In “Legacy,” he first introduces us to Maurice Fishberg, an upwardly mobile Russian-Jewish immigrant to New York at the fin de siècle. Fishberg fervently embraced the anthropological fashion of the era, measuring skull sizes to explain why Jews seemed to be afflicted with more diseases than other groups — what he called the “peculiarities of the comparative pathology of the Jews.” It turns out that Fishberg and his contemporary phrenologists were wrong: Skull shape provides limited information about human differences. But his studies ushered in a century of research linking Jews to genetics.

Ostrer divides his book into six chapters representing the various aspects of Jewishness: Looking Jewish, Founders, Genealogies, Tribes, Traits and Identity. Each chapter features a prominent scientist or historical figure who dramatically advanced our understanding of Jewishness. The snippets of biography lighten a dense forest of sometimes-obscure science. The narrative, which consists of a lot of potboiler history, is a slog at times. But for the specialist and anyone touched by the enduring debate over Jewish identity, this book is indispensable.

“Legacy” may cause its readers discomfort. To some Jews, the notion of a genetically related people is an embarrassing remnant of early Zionism that came into vogue at the height of the Western obsession with race, in the late 19th century. Celebrating blood ancestry is divisive, they claim: The authors of “The Bell Curve” were vilified 15 years ago for suggesting that genes play a major role in IQ differences among racial groups.

Furthermore, sociologists and cultural anthropologists, a disproportionate number of whom are Jewish, ridicule the term “race,” claiming there are no meaningful differences between ethnic groups. For Jews, the word still carries the especially odious historical association with Nazism and the Nuremberg Laws. They argue that Judaism has morphed from a tribal cult into a worldwide religion enhanced by thousands of years of cultural traditions.

Is Judaism a people or a religion? Or both? The belief that Jews may be psychologically or physically distinct remains a controversial fixture in the gentile and Jewish consciousness, and Ostrer places himself directly in the line of fire. Yes, he writes, the term “race” carries nefarious associations of inferiority and ranking of people. Anything that marks Jews as essentially different runs the risk of stirring either anti- or philo-Semitism. But that doesn’t mean we can ignore the factual reality of what he calls the “biological basis of Jewishness” and “Jewish genetics.” Acknowledging the distinctiveness of Jews is “fraught with peril,” but we must grapple with the hard evidence of “human differences” if we seek to understand the new age of genetics.

Although he readily acknowledges the formative role of culture and environment, Ostrer believes that Jewish identity has multiple threads, including DNA. He offers a cogent, scientifically based review of the evidence, which serves as a model of scientific restraint.

“On the one hand, the study of Jewish genetics might be viewed as an elitist effort, promoting a certain genetic view of Jewish superiority,” he writes. “On the other, it might provide fodder for anti-Semitism by providing evidence of a genetic basis for undesirable traits that are present among some Jews. These issues will newly challenge the liberal view that humans are created equal but with genetic liabilities.”

Jews, he notes, are one of the most distinctive population groups in the world because of our history of endogamy. Jews — Ashkenazim in particular — are relatively homogeneous despite the fact that they are spread throughout Europe and have since immigrated to the Americas and back to Israel. The Inquisition shattered Sephardi Jewry, leading to a far higher incidence of intermarriage and to a less distinctive DNA.

In traversing this minefield of the genetics of human differences, Ostrer bolsters his analysis with volumes of genetic data, which are both the book’s greatest strength and its weakness. Two complementary books on this subject — my own “Abraham’s Children: Race, Identity, and the DNA of the Chosen People” and “Jacob’s Legacy: A Genetic View of Jewish History” by Duke University geneticist David Goldstein, who is quoted extensively in both “Abraham’s Children” and “Legacy” — are more narrative driven, weaving history and genetics, and are consequently much more congenial reads.

The concept of the “Jewish people” remains controversial. The Law of Return, which establishes the right of Jews to come to Israel, is a central tenet of Zionism and a founding legal principle of the State of Israel. The DNA that tightly links Ashkenazi, Sephardi and Mizrahi, three prominent culturally and geographically distinct Jewish groups, could be used to support Zionist territorial claims — except, as Ostrer points out, some of the same markers can be found in Palestinians, our distant genetic cousins, as well. Palestinians, understandably, want their own right of return.

That disagreement over the meaning of DNA also pits Jewish traditionalists against a particular strain of secular Jewish liberals that has joined with Arabs and many non-Jews to argue for an end to Israel as a Jewish nation. Their hero is Shlomo Sand, an Austrian-born Israeli historian who reignited this complex controversy with the 2008 publication of “The Invention of the Jewish People.”

Sand contends that Zionists who claim an ancestral link to ancient Palestine are manipulating history. But he has taken his thesis from novelist Arthur Koestler’s 1976 book, “The Thirteenth Tribe,” which was part of an attempt by post-World War II Jewish liberals to reconfigure Jews not as a biological group, but as a religious ideology and ethnic identity.

The majority of the Ashkenazi Jewish population, as Koestler, and now Sand, writes, are not the children of Abraham but descendants of pagan Eastern Europeans and Eurasians, concentrated mostly in the ancient Kingdom of Khazaria in what is now Ukraine and Western Russia. The Khazarian nobility converted during the early Middle Ages, when European Jewry was forming.

Although scholars challenged Koestler’s and now Sand’s selective manipulation of the facts — the conversion was almost certainly limited to the tiny ruling class and not to the vast pagan population — the historical record has been just fragmentary enough to titillate determined critics of Israel, who turned both Koestler’s and Sand’s books into roaring best-sellers.

Fortunately, re-creating history now depends not only on pottery shards, flaking manuscripts and faded coins, but on something far less ambiguous: DNA. Ostrer’s book is an impressive counterpoint to the dubious historical methodology of Sand and his admirers. And, as a co-founder of the Jewish HapMap — the study of haplotypes, or blocks of genetic markers, that are common to Jews around the world — he is well positioned to write the definitive response.

In accord with most geneticists, Ostrer firmly rejects the fashionable postmodernist dismissal of the concept of race as genetically naive, opting for a more nuanced perspective.

When the human genome was first mapped a decade ago, Francis Collins, then head of the National Human Genome Research Institute, said: “Americans, regardless of ethnic group, are 99.9% genetically identical.” Added J. Craig Venter, who at the time was chief scientist at Celera Genomics, the private firm that helped sequence the genome: “Race has no genetic or scientific basis.” Those declarations appeared to suggest that “race,” or the notion of distinct but overlapping genetic groups, is “meaningless.”

But Collins and Venter have issued clarifications of their much-misrepresented comments. Almost every minority group has faced, at one time or another, being branded as racially inferior based on a superficial understanding of how genes peculiar to its population work. The inclination by politicians, educators and even some scientists to underplay our separateness is certainly understandable. But it’s also misleading. DNA ensures that we differ not only as individuals, but also as groups.

However slight the differences (and geneticists now believe that they are significantly greater than 0.1%), they are defining. That 0.1% contains some 3 million nucleotide pairs in the human genome, and these determine such things as skin or hair color and susceptibility to certain diseases. They contain the map of our family trees back to the first modern humans.
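
As a rough check on that figure (assuming the conventional estimate of roughly three billion nucleotide pairs in the human genome):

    0.1\% \times 3\times10^{9}\ \text{pairs} = 3\times10^{6}\ \text{pairs}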

Both the human genome project and disease research rest on the premise of finding distinguishable differences between individuals and often among populations. Scientists have ditched the term “race,” with all its normative baggage, and adopted more neutral terms, such as “population” and “cline,” which have much of the same meaning. Boiled down to its essence, race equates to “region of ancestral origin.”

Ostrer has devoted his career to investigating these extended family trees, which help explain the genetic basis of common and rare disorders. Today, Jews remain identifiable in large measure by the 40 or so diseases we disproportionately carry, the inescapable consequence of inbreeding. He traces the fascinating history of numerous “Jewish diseases,” such as Tay-Sachs, Gaucher, Niemann-Pick and Mucolipidosis IV, as well as breast and ovarian cancer. Indeed, 10 years ago I was diagnosed as carrying one of the three genetic mutations for breast and ovarian cancer that mark my family and me as indelibly Jewish, prompting me to write “Abraham’s Children.”

Like East Asians, the Amish, Icelanders, Aboriginals, the Basque people, African tribes and other groups, Jews have remained isolated for centuries because of geography, religion or cultural practices. It’s stamped on our DNA. As Ostrer explains in fascinating detail, threads of Jewish ancestry link the sizable Jewish communities of North America and Europe to Yemenite and other Middle Eastern Jews who have relocated to Israel, as well as to the black Lemba of southern Africa and to India’s Cochin Jews. But, in a twist, the links include neither the Bene Israel of India nor Ethiopian Jews. Genetic tests show that both groups are converts, contradicting their founding myths.

Why, then, are Jews so different looking, usually sharing the characteristics of the surrounding populations? Think of red-haired Jews, Jews with blue eyes or the black Jews of Africa. Like any cluster — a genetic term Ostrer uses in place of the more inflammatory “race” — Jews throughout history moved around and fooled around, although mixing occurred comparatively infrequently until recent decades. Although there are identifiable gene variations that are common among Jews, we are not a “pure” race. The time machine of our genes may show that most Jews have a shared ancestry that traces back to ancient Palestine but, like all of humanity, Jews are mutts.

About 80% of Jewish males and 50% of Jewish females trace their ancestry back to the Middle East. The rest entered the “Jewish gene pool” through conversion or intermarriage. Those who did intermarry often left the faith in a generation or two, in effect pruning the Jewish genetic tree. But many converts became interwoven into the Jewish genealogical line. Reflect on the iconic convert, the biblical Ruth, who married Boaz and became the great-grandmother of King David. She began as an outsider, but you don’t get much more Jewish than the bloodline of King David!

To his credit, Ostrer also addresses the third rail of discussions about Jewishness and race: the issue of intelligence. Jews were latecomers to the age of freethinking. While the Enlightenment swept through Christian Europe in the 17th century, the Haskalah did not gather strength until the early 19th century. By the beginning of the new millennium, however, Jews were thought of as among the smartest people on earth. The trend is most prominent in America, which has the largest concentration of Jews outside Israel and a history of tolerance.

Although Jews make up less than 3% of the population, they have won more than 25% of the Nobel Prizes awarded to American scientists since 1950. Jews also account for 20% of this country’s chief executives and make up 22% of Ivy League students. Psychologists and educational researchers have pegged their average IQ at 107.5 to 115, with their verbal IQ at more than 120, a stunning standard deviation above the average of 100 found in those of European ancestry. Like it or not, the IQ debate will become an increasingly important issue going forward, as medical geneticists focus on unlocking the mysteries of the brain.

Many liberal Jews maintain, at least in public, that the plethora of Jewish lawyers, doctors and comedians is the product of our cultural heritage, but the science tells a more complex story. Jewish success is a product of Jewish genes as much as of Jewish moms.

Is it “good for the Jews” to be exploring such controversial subjects? We can’t avoid engaging the most challenging questions in the age of genetics. Because of our history of endogamy, Jews are a goldmine for geneticists studying human differences in the quest to cure disease. Because of our cultural commitment to education, Jews are among the top genetic researchers in the world.

As humankind becomes more genetically sophisticated, identity becomes both more fluid and more fixed. Jews in particular can find threads of our ancestry literally anywhere, muddying traditional categories of nationhood, ethnicity, religious belief and “race.” But such discussions, ultimately, are subsumed by the reality of the common shared ancestry of humankind. Ostrer’s “Legacy” points out that — regardless of the pros and cons of being Jewish — we are all, genetically, in it together. And, in doing so, he gets it just right.

Jon Entine is the founder and director of the Genetic Literacy Project at George Mason University, where he is senior research fellow at the Center for Health and Risk Communication. His website is www.jonentine.com.

Read more: http://www.forward.com/articles/155742/jews-are-a-race-genes-reveal/?p=all

Bruno Latour: Love Your Monsters (Breakthrough)

Breakthrough Journal, No. 2, Fall 2011

In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.

After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,[1] which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”[2] Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title Frankenstein, Or, the Modern Prometheus.[3]

Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.

Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”

Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.[4]

Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?

The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.

What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.[5] This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.

1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”

Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.

Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.

If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!

The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”

The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”

Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.

The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.

And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in the 1940s.

Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!

Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back to progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”

Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.

Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shalt not transgress”? Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?[6]

In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only as a supporting force.

Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.

Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.

To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.

2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.

But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!

Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — that was, on the contrary, the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.

France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.

Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.

Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.[7]

The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”

But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, the other, unintended consequences are part and parcel of any action.

3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster appeared in European and especially French politics after many scandals caused by the misplaced belief of state authorities in the certainties provided by Science.[8]

When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring into action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted-blood catastrophe of the 1980s ensued: before agreement was reached, hundreds of patients were transfused with blood contaminated by the AIDS virus.[9]

The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?

Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!

But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.

4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”[10] But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.

But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.

If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you the hypocrite who confesses of one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.

The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.

Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?” /

1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.

2. Shelley, Mary W. 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.

3. Ibid.

4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.

5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.

6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.

7. Descola, Philippe. 2005. Par-delà nature et culture. Paris: Gallimard.

8. Sadeleer, Nicolas de. 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publications Ltd.

9. Hermitte, Marie-Angèle. 1996. Le Sang et le Droit: Essai sur la transfusion sanguine. Paris: Le Seuil.

10. Descartes, René. 1637. Discourse on Method, in Discourse on Method and Related Writings. Translated by Desmond M. Clarke. 1999. Part 6, 44. New York: Penguin.

The U.S. Has Fallen Behind in Numerical Weather Prediction: Part I

March 28, 2012 – 05:00 AM
By Dr. Cliff Mass (Twitter @CliffMass)

It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.

What am I talking about? The third-rate status of numerical weather prediction in the U.S. It is a huge story, an important one, but one the media has not touched, probably because of unfamiliarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.

Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
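
To make “numerical solution of the equations” concrete, here is a minimal, purely illustrative sketch (not any operational model’s code) of the simplest such equation: one-dimensional advection of a quantity q by a constant wind u, stepped forward with a first-order upwind finite difference:

    import numpy as np

    # Toy version of the idea behind NWP: discretize an equation of motion
    # in space, then march it forward in time. Here: dq/dt + u*dq/dx = 0.
    nx, dx, dt, u = 100, 1.0e4, 60.0, 10.0       # grid points, spacing (m), time step (s), wind (m/s)
    x = np.arange(nx)
    q = np.exp(-0.5 * ((x - 50) / 5.0) ** 2)     # initial "weather feature"

    for _ in range(600):                         # 600 steps of 60 s = 10 simulated hours
        q[1:] -= u * dt / dx * (q[1:] - q[:-1])  # first-order upwind difference

Operational models solve a far richer coupled system (momentum, thermodynamics, moisture, radiation) on a global three-dimensional grid, which is why the computing demands discussed below are so large.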

Supercomputers are used for numerical weather prediction.

U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Centre for Medium-Range Weather Forecasts (ECMWF). We have also fallen behind in ensembles (using many model runs to give probabilistic predictions) and in high-resolution operational forecasting. Decades ago we were the world leader: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong; I will talk about some of the issues here. And our nation needs to fix it.

But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.

In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).

The Europeans banded together decades ago to form the European Centre for Medium-Range Weather Forecasts (ECMWF), which runs a very good global model. Several European countries run regional models as well.

The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).

There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.

The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in our forecasts being noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. The figure shows the quality of the 500 hPa forecast (about halfway up in the troposphere, at approximately 18,000 feet) at day 5. The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (GFS, U.S.: black; ECMWF: red; CMC, Canada: blue; UKMET: green; U.S. Navy FNMOC: orange). The bottom graph shows the difference between the skill of the U.S. model and that of the other nations’ models.

You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.

Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009 to 2012 for the 120-hour forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Centre (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
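
For readers who want the two verification scores pinned down, here is a short sketch. It assumes the skill measure in the NWS figure is the anomaly correlation coefficient conventionally used for 500 hPa verification (the plot itself does not say), and the arrays stand in for hypothetical gridded fields:

    import numpy as np

    def rmse(forecast, analysis):
        """Root mean square error: lower is better (the Canadian plot)."""
        return np.sqrt(np.mean((forecast - analysis) ** 2))

    def anomaly_correlation(forecast, analysis, climatology):
        """Correlation of forecast and analyzed anomalies from climatology:
        closer to 1 is better (the NWS skill plot)."""
        fa = forecast - climatology          # forecast anomaly
        aa = analysis - climatology          # analyzed ("truth") anomaly
        return (fa * aa).sum() / np.sqrt((fa ** 2).sum() * (aa ** 2).sum())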

Let’s look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie and myself comparing various models at 24, 48, and 72 hours for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.

I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at short range, but has about one day of additional predictability: their 8-day forecast is about as skillful as our 7-day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.

Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussions, which are available online, you will see how forecasters often depend not on the U.S. model, but on the ECMWF. And during the January western Washington snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate-related firm that was moving up to Seattle. I asked him what model they were using: the U.S. GFS? He laughed; of course not. They were using the ECMWF.

A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models, which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the cost to the nation has to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.

The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:

1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous: the U.S. can afford the processors and disk space it would take. We are talking about millions, or at most tens of millions, of dollars to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning and has relied on heavy-metal IBM machines at very high cost.

2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate observations to create a good description of the atmosphere. The European Centre, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power; the Europeans have been using 4DVAR for 20 years! We have used an older, inferior approach (3DVAR). Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet.
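
For orientation (this is the standard textbook formulation, not EMC’s specific implementation): variational assimilation finds the model state x that minimizes a cost function balancing a prior “background” state x_b against the observations y,

    J(x) = (x - x_b)^T B^{-1} (x - x_b) + (y - H(x))^T R^{-1} (y - H(x)),

where B and R are the background- and observation-error covariance matrices and H maps the model state to the observed quantities. 3DVAR minimizes this at a single analysis time; 4DVAR compares an entire model trajectory against observations spread through a time window, which is both why it extracts more from the data and why it demands so much more computing.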

3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP’s Environmental Modeling Center (EMC) is well known for its isolation and “not invented here” attitude. While the European Centre hosts lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when its budget is under pressure, university research is the first thing it reduces. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news: a new building should be available soon.)

4. The NWS approach to weather-related research has been ineffective and divided. Government weather research is NOT in the NWS, but rather elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the folks doing research in support of their mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has marginal benefit for the NWS.

5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.

This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.

I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.

Let me note that the above is about the modeling aspects of the NWS, NOT the many people in the local forecast offices. That part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and the suitable organization necessary to push forward effectively.

This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end…in weather prediction, education, and everything else…or we will see our nation sink into mediocrity.

The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.

*  *  *

Saturday, April 7, 2012

Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)

In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers, such as the European Centre for Medium-Range Weather Forecasts (ECMWF), the UKMET Office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and to the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research, and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we are not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.

In this blog, I will describe in some detail one major roadblock in giving the U.S. state-of-the-art weather prediction:  inadequate computer resources.   This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Dept of Commerce, but has not, and I am convinced will not without outside pressure.  It is time for the user community and our congressional representatives to intervene.  To quote Samuel L. Jackson, enough is enough. (…)

In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction’s (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high-resolution and ensemble forecasts, and seasonal climate forecasts. The UKMET Office also does regional NWP (England is not a big country!) and regional air quality. NCEP does all of this plus much, much more (high-resolution rapid-update modeling, hurricane modeling, etc.). And NCEP has to deal with prediction over a continental-size country.

If you expected the U.S. to have a lot more computer power to balance all these responsibilities and tasks, you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4,992 IBM Power6 processors. One computer does the operational work; the other is for backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.

NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8,192 much faster processors that reaches 182 teraflops (yes, more than twice as fast, and with far fewer tasks to do).

The UKMET Office, serving a far, far smaller country, has two newer IBM machines, each with 7,680 processors delivering 175 teraflops per machine.

Here is a figure, produced at NCEP, that compares the relative computer power of NCEP's machine with the European Centre's.  The shading indicates computational activity, and the x-axis for each represents a 24-hour period.  The relative heights allow you to compare computer resources.  Not only does ECMWF have much more computer power, but they are more efficient in using it, packing useful computations into every available minute.

Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system.  You may not believe this, but the specifications were ONLY for a system at least equal to the one they have.    A report in a computer magazine suggests that this new system (IBM got the contract) might be slightly less powerful (around 150 teraflops) than one of the UKMET Office systems…but that is not known at this point.

The Canadians?  They have TWO machines like the European Centre’s!
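Pulling the post's numbers together (a rough tally using only the peak speeds quoted above; the Canadian figure assumes "like the European Centre's" means roughly 182 teraflops per machine):

# Rough tally of the peak speeds quoted above, in teraflops (per the post's figures).
systems_tf = {
    "NCEP (two IBM Power6 machines)": [70, 70],            # operational + backup
    "ECMWF (one newer IBM machine)": [182],
    "UKMET Office (two newer IBM machines)": [175, 175],
    "Canada (two ECMWF-class machines)": [182, 182],       # assumed ~182 TF each
}
ncep_single = 70
for name, machines in systems_tf.items():
    total = sum(machines)
    print(f"{name}: {total} TF total, {total / ncep_single:.1f}x one NCEP machine")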

So what kind of system does NCEP require to serve the nation in a reasonable way?

To start, we need to double the resolution of our global model to bring it into line with ECMWF (they are now at 15 km globally).   Such resolution allows the global model to capture regional features (such as our mountains).  Doubling horizontal resolution requires 8 times more computer power.  We need to use better physics (descriptions of things like cloud processes and radiation).  Double again.  And we need better data assimilation (better use of observations to provide an improved starting point for the model).  Double once more.  So we need 32 times more computer power for the high-resolution global runs to catch up with ECMWF.  Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in…that is what ECMWF does).   There are some ways NCEP could work more efficiently as well.  Right now NCEP runs our global model out to 384 hours four times a day (every six hours).  To many of us this seems excessive; perhaps the longest ranges (180 hours and beyond) could be run only twice a day.  So let's begin with a computer 32 times faster than the current one.
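A back-of-the-envelope restatement of that arithmetic (the factors are the post's, not mine):

# Doubling horizontal resolution costs roughly 2x in each horizontal dimension plus
# about 2x more time steps (8x total); better physics and better data assimilation
# each roughly double the cost, per the estimates above.
resolution   = 2 * 2 * 2   # 8x for doubling horizontal resolution
physics      = 2           # improved model physics
assimilation = 2           # improved data assimilation
print(resolution * physics * assimilation)   # 32, i.e. "32 times more computer power"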

Many workshops and meteorological meetings (such as one on improvements in model physics held at NCEP last summer—I was the chair) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution.  The current national ensemble system has a horizontal resolution of about 32 km…and the NWS plans to get to about 20 km in a few years…both are inadequate.   Here is an example of the ensemble output (the mean of the ensemble members) for the NWS and UW (4-km) ensemble systems:  the difference is huge–the NWS system does not even come close to modeling the impacts of the mountains.  It is similarly unable to simulate large convective systems.

Current NWS (NCEP) “high resolution” ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear.  Probabilistic prediction based on ensemble forecasts and reforecasting (running models back over past years to get statistics of performance) is the future of weather prediction.  The days of giving a single number for, say, temperature at day 5 are over.  We need to let people know about uncertainty and probabilities.  The NWS needs a massive increase in computer power to do this. It lacks this computer power now and does not seem destined to get it soon.
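As a minimal illustration of what probabilities instead of a single number look like in practice (the member values below are invented, not output from any NWS system):

# From an ensemble of day-5 temperature forecasts (invented values, deg F),
# report the mean, the spread, and the probability of exceeding a threshold.
import statistics

members = [52, 55, 49, 61, 58, 53, 50, 57, 54, 60]   # one forecast per ensemble member
threshold = 55

mean = statistics.mean(members)
spread = statistics.stdev(members)
prob_above = sum(m > threshold for m in members) / len(members)

print(f"mean {mean:.1f} F, spread {spread:.1f} F, P(T > {threshold} F) = {prob_above:.0%}")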

A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing.   He and colleagues have put together a compelling case for more NWS computer resources for NWP.  Read it here.

Back-of-the-envelope calculations indicate that a good first step–4-km national ensembles–would require about 20,000 processors to run in a timely manner, but it would revolutionize weather prediction in the U.S., including forecasts of convection and of mountainous areas.  This high-resolution ensemble effort would meld with data assimilation over the long term.

And then there is super-high-resolution numerical weather prediction to get fine-scale details right.  Here in the Northwest my group runs a 1.3-km horizontal resolution forecast twice a day out to 48 hours.   Such capability is needed for the entire country.  It does not exist now because of inadequate computer resources.

The bottom line is that the NWS numerical modeling effort needs a huge increase in computer power to serve the needs of the country–and the potential impacts would be transformative.   We could go from a third-place effort, one that is slipping back into the pack, to being the world leader.  Furthermore, the added computer power would finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems.  That this is barely done today is really amazing, and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.

But to do so will require a major jump in computational power, a jump our nation can easily afford.   I would suggest that the NWS's EMC begin by securing at least a 100,000-processor machine, and down the road something considerably larger.  Keep in mind that my department has about 1,000 processors in our computational clusters, so this is not as large a jump as you might think.

For a country that suffers several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.
The cost?   Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote:  using fast AMD chips, you could have such a 100K-core machine for 11 million dollars (and this is without any discount!).  OK, this is the U.S. government and they like expensive, heavy-metal machines…let's go for 25 million dollars.  The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors, and the cost will be around 25-35 million dollars.   NCEP will want two machines, so let's budget 60 million dollars. We spend this much money on a single jet fighter, but we can't invest this amount to greatly improve forecasts and public safety in the U.S.?  We have machines far larger than this for breaking codes, simulating thermonuclear explosions, and simulating climate change.
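Restating that budget arithmetic in one place (all figures are the post's own round numbers):

# The post's budget arithmetic, restated.
vendor_quote = 11e6        # rough vendor quote for a 100K-core AMD cluster, no discount
per_machine_gov = 25e6     # padded figure for government procurement
machines = 2               # operational plus backup, as NCEP runs today
print(f"${machines * per_machine_gov / 1e6:.0f} million")   # $50M; the post rounds the request up to $60M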

Yes, a lot of money, but I suspect the cost of the machine would be paid back in a few months from improved forecasts.   Last year we had quite a few (over ten) billion-dollar storms….imagine the benefits of forecasting even a few of them better.  Or the benefits to the wind energy and utility industries, or U.S. aviation, of even modestly improved forecasts.   And there is no doubt such computer resources would improve weather prediction.  The list of benefits is nearly endless.   Recent estimates suggest that  normal weather events cost the U.S. economy nearly 1/2 trillion dollars a year.  Add to that hurricanes, tornadoes, floods, and other extreme weather.  The business case is there.

As someone with an insider's view of the process, I find it clear that the current players are not going to move effectively without some external pressure.  In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research).  I believe it is time for weather-sensitive industries and local governments, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon.   We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction…which can clearly save lives and help the economy.  Enough is enough.

Posted by Cliff Mass Weather Blog at 8:38 PM

Relationship between scientists and journalists debated at seminar (FAPESP)

Science communication is gaining weight in academia and the relationship between the two professional groups is growing closer, experts say at a meeting held by FAPESP

April 18, 2012

By Karina Toledo

Agência FAPESP – With science communication activities carrying ever more weight in academia, the relationship between journalists and researchers seems to be changing for the better. But it must be kept in mind that eminent scientists are not authorities on every subject.

The warning came from biologist Thomas Lewinsohn, a professor at the Universidade Estadual de Campinas (Unicamp), during his participation in the Ciência na Mídia (Science in the Media) seminar, held by FAPESP on April 16.

“In the past, researchers gave great weight to publication in scientific journals, which guaranteed them academic prestige and funding, and almost no attention to science communication, which served only to increase popularity. Today we are close to a balance between the two,” he said.

It became clear that, beyond popularity, media exposure also affected influence and decision-making power within academia, increasing the chances of having a project funded and, consequently, raising academic prestige.

A clear example of the new paradigm, according to Lewinsohn, is the change in how graduate programs are evaluated by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (Capes). “Today greater weight is given to the visibility of the work of the scientists on a program's faculty,” he said.

Another sign is the transformation that the most important scientific journals, among them Science and Nature, have undergone in recent years, gaining new sections with news content and more accessible language.

“It is becoming impossible for scientists to ignore the media. Many now court journalists, and that opens the door to distortions. There is an idea that a scientist will always have a rational, well-grounded opinion about everything, and that is not true,” the biologist said.

For that reason, he recommended, journalists should resist the urge, in the rush of the newsroom, to always turn to the one source who has answers on every topic. “Some have a personal agenda, which does not always have to do with science.”

During his presentation, physician Paulo Saldiva, of the Faculdade de Medicina da USP, complained that most of the journalists who seek him out want to talk about topics unrelated to his area of study: the effects of air pollution on health.

Another problem he raised was the little time devoted to each topic and the resulting risk of superficiality. “You talk for half an hour and only ten seconds make it to air. That is scientists' greatest fear,” Saldiva added.

For biologist Fernando Reinach, who became well known after taking part in the FAPESP-funded Genome Project and who today writes a science column in the newspaper O Estado de S. Paulo, the big problem with science journalism is “telling the miracle without telling the saint.”

“A lot of emphasis is placed on the discovery, and the methods used are not explored well. That makes it hard to judge whether what is being said is true,” he argued.

Reinach said that after leaving academic life he kept the habit of reading scientific papers and conceived the newspaper column because he felt there were many interesting topics hidden behind obscure titles. “The scientist is my character. I try to give research a human dimension,” he said.

The science editor of the newspaper Folha de S. Paulo, Reinaldo José Lopes, spoke about the shrinking space in newspapers for news in general and for science in particular. “How do you package the news, the methodology and the human side in half a page? We sense a reader impatience that is frightening, and that ends up leading to superficiality,” he said.

The meeting also included Roberto Wertman, editor of Globonews's program Espaço Aberto Ciência & Tecnologia, who commented on the limitations of science coverage on TV, which is extremely dependent on the availability of images, and Sonia López, former editor of AlphaGalileu, one of the largest academic news portals.

The opening talk was given by Clive Cookson, science editor of the Financial Times, who listed the three main problems that, in his opinion, affect the quality of science journalism.

First, Cookson mentioned the tendency to report research results in an exaggerated, sensationalist way. “The reporter needs to convince the editor that the data are worth publishing, and the scientific truth sometimes ends up in the background. And when the subeditor writes the headline, the story becomes even more exaggerated,” he commented.

Another problem is the tendency to frame data negatively, which can cause distortions. “The idea is that bad news sells more,” he said.

Finally, Cookson mentioned the publication of non-objective news permeated by political interests. “Scientists should stick to the science. But even in controversial situations they should take the opportunity to get their message across. If they leave a vacuum, politically motivated sources can take advantage of it.”

A Sharp Rise in Retractions Prompts Calls for Reform (N.Y. Times)

PLEA Dr. Ferric Fang argues that science has changed in worrying ways. Matthew Ryan Williams for The New York Times
By CARL ZIMMER – Published: April 16, 2012

In the fall of 2010, Dr. Ferric C. Fang made an unsettling discovery. Dr. Fang, who is editor in chief of the journal Infection and Immunity, found that one of his authors had doctored several papers.

It was a new experience for him. “Prior to that time,” he said in an interview, “Infection and Immunity had only retracted nine articles over a 40-year period.”

The journal wound up retracting six of the papers from the author, Naoki Mori of the University of the Ryukyus in Japan. And it soon became clear that Infection and Immunity was hardly the only victim of Dr. Mori’s misconduct. Since then, other scientific journals have retracted two dozen of his papers, according to the watchdog blog Retraction Watch.

“Nobody had noticed the whole thing was rotten,” said Dr. Fang, who is a professor at the University of Washington School of Medicine.

Dr. Fang became curious how far the rot extended. To find out, he teamed up with a fellow editor at the journal, Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York. And before long they reached a troubling conclusion: not only that retractions were rising at an alarming rate, but that retractions were just a manifestation of a much more profound problem — “a symptom of a dysfunctional scientific climate,” as Dr. Fang put it.

Dr. Casadevall, now editor in chief of the journal mBio, said he feared that science had turned into a winner-take-all game with perverse incentives that lead scientists to cut corners and, in some cases, commit acts of misconduct.

“This is a tremendous threat,” he said.

WATCHDOG  Dr. Arturo Casadevall of the Albert Einstein College of Medicine in New York teamed up with Dr. Ferric C. Fang to study a raft of retractions. Ángel Franco/The New York Times

Last month, in a pair of editorials in Infection and Immunity, the two editors issued a plea for fundamental reforms. They also presented their concerns at the March 27 meeting of the National Academies of Sciences committee on science, technology and the law.

Members of the committee agreed with their assessment. “I think this is really coming to a head,” said Dr. Roberta B. Ness, dean of the University of Texas School of Public Health. And Dr. David Korn of Harvard Medical School agreed that “there are problems all through the system.”

No one claims that science was ever free of misconduct or bad research. Indeed, the scientific method itself is intended to overcome mistakes and misdeeds. When scientists make a new discovery, others review the research skeptically before it is published. And once it is, the scientific community can try to replicate the results to see if they hold up.

Source: Journal of Medical Ethics

But critics like Dr. Fang and Dr. Casadevall argue that science has changed in some worrying ways in recent decades — especially biomedical research, which consumes a larger and larger share of government science spending.

In October 2011, for example, the journal Nature reported that published retractions had increased tenfold over the past decade, while the number of published papers had increased by just 44 percent. In 2010 The Journal of Medical Ethics published a study finding that the recent raft of retractions was a mix of misconduct and honest scientific mistakes.

Several factors are at play here, scientists say. One may be that because journals are now online, bad papers are simply reaching a wider audience, making it more likely that errors will be spotted. “You can sit at your laptop and pull a lot of different papers together,” Dr. Fang said.

But other forces are more pernicious. To survive professionally, scientists feel the need to publish as many papers as possible, and to get them into high-profile journals. And sometimes they cut corners or even commit misconduct to get there.

To measure this claim, Dr. Fang and Dr. Casadevall looked at the rate of retractions in 17 journals from 2001 to 2010 and compared it with the journals’ “impact factor,” a score based on how often their papers are cited by scientists. The higher a journal’s impact factor, the two editors found, the higher its retraction rate.
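For intuition only, the comparison they describe amounts to asking whether journals with higher impact factors also show higher retraction rates; a rank correlation over a handful of invented numbers (these are not the Fang and Casadevall data) shows the shape of the test:

# Illustrative sketch: invented impact factors and retraction indices for eight
# hypothetical journals, checked for the positive association the editors report.
from scipy.stats import spearmanr

impact_factor    = [53.5, 31.4, 30.0, 14.1, 9.7, 5.1, 3.9, 2.8]
retraction_index = [4.0, 2.1, 1.8, 0.9, 0.6, 0.3, 0.2, 0.1]

rho, p = spearmanr(impact_factor, retraction_index)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")   # a positive rho mirrors their finding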

The highest “retraction index” in the study went to one of the world’s leading medical journals, The New England Journal of Medicine. In a statement for this article, it questioned the study’s methodology, noting that it considered only papers with abstracts, which are included in a small fraction of studies published in each issue. “Because our denominator was low, the index was high,” the statement said.

Monica M. Bradford, executive editor of the journal Science, suggested that the extra attention high-impact journals get might be part of the reason for their higher rate of retraction. “Papers making the most dramatic advances will be subject to the most scrutiny,” she said.

Dr. Fang says that may well be true, but adds that it cuts both ways — that the scramble to publish in high-impact journals may be leading to more and more errors. Each year, every laboratory produces a new crop of Ph.D.’s, who must compete for a small number of jobs, and the competition is getting fiercer. In 1973, more than half of biologists had a tenure-track job within six years of getting a Ph.D. By 2006 the figure was down to 15 percent.

Yet labs continue to have an incentive to take on lots of graduate students to produce more research. “I refer to it as a pyramid scheme,” said Paula Stephan, a Georgia State University economist and author of “How Economics Shapes Science,” published in January by Harvard University Press.

In such an environment, a high-profile paper can mean the difference between a career in science or leaving the field. “It’s becoming the price of admission,” Dr. Fang said.

The scramble isn’t over once young scientists get a job. “Everyone feels nervous even when they’re successful,” he continued. “They ask, ‘Will this be the beginning of the decline?’ ”

University laboratories count on a steady stream of grants from the government and other sources. The National Institutes of Health accepts a much lower percentage of grant applications today than in earlier decades. At the same time, many universities expect scientists to draw an increasing part of their salaries from grants, and these pressures have influenced how scientists are promoted.

“What people do is they count papers, and they look at the prestige of the journal in which the research is published, and they see how many grant dollars scientists have, and if they don’t have funding, they don’t get promoted,” Dr. Fang said. “It’s not about the quality of the research.”

Dr. Ness likens scientists today to small-business owners, rather than people trying to satisfy their curiosity about how the world works. “You’re marketing and selling to other scientists,” she said. “To the degree you can market and sell your products better, you’re creating the revenue stream to fund your enterprise.”

Universities want to attract successful scientists, and so they have erected a glut of science buildings, Dr. Stephan said. Some universities have gone into debt, betting that the flow of grant money will eventually pay off the loans. “It’s really going to bite them,” she said.

With all this pressure on scientists, they may lack the extra time to check their own research — to figure out why some of their data doesn’t fit their hypothesis, for example. Instead, they have to be concerned about publishing papers before someone else publishes the same results.

“You can’t afford to fail, to have your hypothesis disproven,” Dr. Fang said. “It’s a small minority of scientists who engage in frank misconduct. It’s a much more insidious thing that you feel compelled to put the best face on everything.”

Adding to the pressure, thousands of new Ph.D. scientists are coming out of countries like China and India. Writing in the April 5 issue of Nature, Dr. Stephan points out that a number of countries — including China, South Korea and Turkey — now offer cash rewards to scientists who get papers into high-profile journals. She has found these incentives set off a flood of extra papers submitted to those journals, with few actually being published in them. “It clearly burdens the system,” she said.

To change the system, Dr. Fang and Dr. Casadevall say, start by giving graduate students a better understanding of science’s ground rules — what Dr. Casadevall calls “the science of how you know what you know.”

They would also move away from the winner-take-all system, in which grants are concentrated among a small fraction of scientists. One way to do that may be to put a cap on the grants any one lab can receive.

Such a shift would require scientists to surrender some of their most cherished practices — the priority rule, for example, which gives all the credit for a scientific discovery to whoever publishes results first. (Three centuries ago, Isaac Newton and Gottfried Leibniz were bickering about who invented calculus.) Dr. Casadevall thinks it leads to rival research teams’ obsessing over secrecy, and rushing out their papers to beat their competitors. “And that can’t be good,” he said.

To ease such cutthroat competition, the two editors would also change the rules for scientific prizes and would have universities take collaboration into account when they decide on promotions.

Ms. Bradford, of Science magazine, agreed. “I would agree that a scientist’s career advancement should not depend solely on the publications listed on his or her C.V.,” she said, “and that there is much room for improvement in how scientific talent in all its diversity can be nurtured.”

Even scientists who are sympathetic to the idea of fundamental change are skeptical that it will happen any time soon. “I don’t think they have much chance of changing what they’re talking about,” said Dr. Korn, of Harvard.

But Dr. Fang worries that the situation could become much more dire if nothing happens soon. “When our generation goes away, where is the new generation going to be?” he asked. “All the scientists I know are so anxious about their funding that they don’t make inspiring role models. I heard it from my own kids, who went into art and music respectively. They said, ‘You know, we see you, and you don’t look very happy.’ ”

Detribalising Economics (World Economics Association)

By Rob Garnett [r.garnett@tcu.edu]
World Economics Association Newsletter 2(2), April.2012, page 4

In “Why Pluralism?” (2011), Stuart Birks calls for “greater discussion, deliberation, and cross-fertilization of ideas” among schools of economic thought as an antidote to each school’s autarkic tendency to “see itself as owning the ‘truth’ for its area.” As a philosophical postscript, I want to underscore the catholic reach of Birks’s remarks — his genial reminder, properly addressed to all economists, of the minimal requirements for academic inquiry.

The case for academic pluralism in economics is motivated by the ubiquity of “myside bias” (Klein 2011). Whether methodological, ideological, paradigmatic, or all of the above, such groupthink fuels intellectual segregation and bigotry. It turns schools into echo chambers, sealed off from the critical feedback loops that check hubris and propel scholarly progress.

Pluralists know that “The causes of faction cannot be removed . . . Relief is only to be sought in the means of containing its effects” (Hamilton, Madison, and Jay [1788] 2001, 45). So even as they celebrate paradigmatic diversity, they insist that scholars observe two liberal precepts:

1. academic discourse is a commons, no ‘area’ of which can be owned by any school; and

2. within these spaces of inquiry, scholars bear certain ethical duties as academic citizens.

Academic pluralism is the duty to practice “methodological awareness and toleration” (Backhouse 2001, 163) and “to constantly [seek] to learn from those who [do] not share [one’s] ideological or methodological perspective” (Boettke 2004, 379). It is “academic” because it coincides with the epistemological and ethical norms of modern academic freedom (American Association of University Professors 1940). It is “pluralist” because it entails a commitment to conduct one’s scholarly business in a non-sectarian manner.

Could a critical mass of economists ever be persuaded to enact these scholarly virtues? Yes! But admirers of these virtues must be prepared to teach by example. When Warren Samuels passed away last August, he was eulogized as a first-rate scholar who advanced pluralism by enacting it consistently over his long career. As the Austrian economist Peter Boettke recalls:

Prior to meeting Warren, I think it would be accurate to say that I divided the world neatly into those who are stupid, those who are evil, and those who are smart and good enough to agree with me. . . . Warren destroyed that simple intellectual picture of the world. . . . He didn’t overturn my intellectual commitments . . . but he made [me] more self-critical and less self-satisfied, and hopefully a better scholar [and] teacher (Boettke 2011).

The pluralism Warren Samuels personified can be achieved by most economic scholars, teachers, and students to a reasonable degree. If we want economics to regain its standing as a serious and humane social science, we must find more ways to activate these dormant capabilities.

References

American Association of University Professors (1940) Statement of Principles on Academic Freedom and Tenure. Washington, DC.

Backhouse, R. E. (2001) On the Credentials of Methodological Pluralism. In J. E. Biddle, J. B. Davis, and S. G. Medema (Eds.), Economics Broadly Considered: Essays in Honor of Warren J. Samuels, 161-181. London: Routledge.

Boettke, P. J. (2011) “Warren Samuels (1933-2011)”, http://www.coordinationproblem.org/2011/08/warren-samuels-1933-2011.html Accessed August 18, 2011.

Boettke, P. J. (2004) Obituary: Don Lavoie (1950-2001). Journal of Economic Methodology 11 (3): 377-379.

Birks, S. (2011) “Why Pluralism?” World Economics Association Newsletter, vol. 1, no. 1.

Hamilton, A., Madison, J., and Jay, J. (2001) [1788] The Federalist. Gideon edition. G. W. Carey and J. McClellan (eds.) Indianapolis, IN: Liberty Fund.

Klein, D. B. (2011) “I Was Wrong, and So Are You.” The Atlantic, December.

[Editor’s note: Readers may also be interested in Garnett, R. F. (Ed.). (1999). What do economists know? London: Routledge]

Q&A: The Anthropology of Searching for Aliens (Wired)

By – April 4, 2012 | 2:50 pm

The Allen Telescope Array, an interferometry project dedicated to SETI and radio astronomy in Hat Creek, California, at sunset.

Before we can understand an alien civilization, it might be useful to understand our own.

To help in this task, anthropologist Kathryn Denning of York University in Toronto, Canada studies the very human way that scientists, engineers and members of the public think about space exploration and the search for alien life.

From Star Trek to SETI, our modern world is constantly imagining possible futures where we dart around the galaxy engaging with bizarre alien races. Denning points out that when people talk about these futures, they often invoke the past. But they frequently seem to have a poor understanding of history.

For instance, in September at the 100 Year Starship Conference — a symposium created by DARPA for thinking about long-term spaceflight goals — Denning noted that the conference was framed as an extension of old traditions of exploration, for example mentioning Ferdinand Magellan as an exemplary hero who circumnavigated the globe. Not only did Magellan not circumnavigate the globe (he was dismembered in the Philippines before finishing the task), his mission was not entirely laudable.

Anthropologist Kathryn Denning studies the very human way that scientists, engineers, and members of the public think about space exploration and the search for alien life.

“It’s easy to forget that it’s also a story of slavery, war, betrayal, hardship, violence, and death — not just to those who signed up for the journey, but a lot of innocent bystanders,” Denning said during a talk March 30 at the Contact Conference, an annual meeting dedicated to speculation about SETI and space exploration. The misuse of the past matters when thinking about the future, she added, because it deludes people, giving them a poor understanding of how history actually moves.

Wired spoke to Denning about contact with extraterrestrials, the rhetoric of the Space Age, and what it means to be human in the universe.

Wired: What does the field of anthropology bring to thinking about space exploration and SETI?

Kathryn Denning: Anthropologists are good at looking at discourses, and the stories that people tell to structure their lives and their behavior. So there are anthropologists working on the discourse surrounding interstellar flight. And anthropologists have always worked on the phenomenon of UFO abductions and aliens on Earth and that sort of stuff.

With respect to SETI, one of the main contributions is just grounding all of that speculation about other civilizations in actual physical data. In terms of civilization or civilizations, we only have one example — Earth.

And there’s a lot of data here, which has been very poorly mined so far. If people are drawing generalizations about civilizations elsewhere in the universe that don’t even hold here on Earth, then maybe we should throw them out.

Wired: What are some instances of wrong ideas about civilization that get invoked in talking about extraterrestrials?

Denning: I think one good example is the variable of L, the lifetime of civilizations, which dominates the Drake equation. [An estimate of the number of intelligent extraterrestrials that could exist in our galaxy.]
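(For reference, and not something Denning spells out in the interview: the Drake equation in its standard textbook form multiplies a chain of factors, with L as the final term.

\[ N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L \]

Here N is the number of detectable civilizations in the galaxy, R* is the rate of star formation, the f and n terms are the fraction of stars with planets, the number of habitable planets per such star, and the fractions of those on which life, intelligence, and detectable communication arise, and L is the average lifetime of a communicating civilization.)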

The speculation on this has been frankly goofy sometimes. I mean you can make up basically any value of L that you like and justify it in some way. So people say we should try to use Earth’s data to look at it. We should ask what really does cause civilizations to collapse or revert to a lower order of complexity or technological regime.

And, well, we’re still working that one out actually. We have so much work to do and I think that’s important for people to understand that our models of civilization here on Earth are not as solid as popular culture frequently assumes them to be.

Similarly, many people hold outdated ideas regarding scenarios of contact. We have our iconic case studies, such as Columbus landing in the Americas or Cortez and the Aztecs. But most of those have been revamped with additional historical work in even just the last 30 or 40 years.

So when I hear that standard model of Columbus or Cortez, frankly I want to roll my eyes. For example [Stephen] Hawking says — interminably and repeatedly — that when Columbus showed up in the Americas, well, that didn’t turn out very well for the Native Americans. And therefore we should similarly be worried about trying to attract the attention of an alien civilization.

The problem is that it tends to misrepresent Earth’s history. These stories get invoked in models of contact with an alien society, but it’s a biased retelling of Earth’s history and it’s usually not a very good one.

The underlying narrative there is that it went poorly for the Native Americans because they were the inferior civilization. And, by extension, it would go poorly for us because the other party would be the superior civilization. But that simply wasn’t the case for the Native Americans.

One of the reasons I do the work I do is to try and have people get the history a little bit straighter.

Wired: There is an oft-heard narrative for alien contact: after we find a signal, it would revolutionize everything, and humanity would put aside their differences and come together as one. How do you take that narrative as an anthropologist?

Denning: One way to read that, in the most general sense, is that it’s a narrative that makes us feel better.

One of the things that astronomy and space exploration in the 20th century have done is force us to confront the universe in a way that we never did before. We had to start understanding that, yeah, asteroids impact the Earth and can wipe out a vast proportion of life, and that our planet is a fragile spaceship Earth.

I think this has given us a sort of cosmic anxiety. And it would make us feel a whole lot better if we had neighbors and they were friendly and they could enlighten us.

One of the things that runs through the whole SETI discussion is our problems with technology. There is an inherent assumption that the equipment needed for communication across interstellar space would necessarily evolve in tandem with weapons of mass destruction. Therefore any society that survived long enough to make contact with us would have solved their technological problems.

I think that’s a very hopeful take on it. These stories of contact and what it would do for us, they’ve emerged in concert with these anxieties about the universe and questions about our technology. I think in some way it’s almost like a coping mechanism.

Wired: In terms of space exploration, you’ve said that it’s like we’re entering a new Space Age. Why do you say that and what does it mean?

Denning: I think the biggest difference from the past is the role of corporations. Obviously nation-states have always used contractors, but they’re now achieving a degree of independence that is unprecedented.

When you have private companies that are planning on flying not just to the moon but also to Mars, that’s new and that’s different. We don’t have the government systems in place to deal with that sort of stuff because the outer space treaty and all our international agreements are geared toward nation states.

There are new legal discourses emerging but nothing moves as fast as private enterprise. It’s been specifically set up to move quickly, so nothing moves as fast as, say, the X prize.

Wired: The 1950s/60s Space Age often invoked the rhetoric of colonization or frontierism in thinking about their goals. How do these ideas play out in modern space exploration?

Denning: The ideological stages of colonization are still well underway. As soon as you have technology on another world, that constitutes a de facto claim of some kind. So, in a way, everyone watching Spirit and Opportunity is watching Mars through these robots’ eyes.

That’s not just an interesting kind of little jaunt; it’s a way of making Mars not only human but also American. When you’re naming features on other worlds after people here, these things constitute claims.

For example, NASA renamed the Mars Pathfinder lander the “Carl Sagan Memorial Station.” Any archeologist or anthropologist will tell you that one of the most effective ways of colonizing territory, at least ideologically, is through your dead.

Wired: Is there something you’d like to see as the narrative of the new Space Age?

Denning: I’m going to borrow a term here from a scholar named Bill Kramer. He spoke at the 100-Year Starship Conference and he suggested that instead of boldly going, we humbly go.

To me that really encapsulates it. Instead of getting out there as quickly as possible and using the systems that we used here on Earth, like extracting resources as quickly as possible in order to fuel whatever it is that we’re trying to do. What if we went instead with a collaborative, conservationist stewardship in mind?

What if instead of making messes that we don’t know how to clean up, what if we slowed down a little bit? Because the urgency is manufactured. I mean, I want to see space continue to be explored. It’s cool, and there’s stuff out there that we would like to know.

It doesn’t have to be the answer to all of our needs. Sure, we can harvest sunlight from solar arrays in orbit around the Earth but that’s going to have its own technological problems and geopolitical implications.

But the main problem with energy and resources here on Earth isn’t always that we don’t have enough: it’s that the distribution is unequal, and simply harvesting more is not going to resolve that. Chances are it’s just going to continue to increase inequity, and that doesn’t work well for anyone.

I think what everybody should be learning is that these immense disparities cause profound instabilities, which you then have to continue to deal with. So I just don’t see it as the answer.

Space colonization is held up as being the natural next stage in our social evolution. Not only that, it’s an absolute necessity for the survival of the species. But if we are our own existential threat, then how does that follow? Wherever we go, there we are.

So the suggestion that ever increasing technology is the solution to problems that have been created by our technology is barking mad.

Wired: In some sense, we have a deterministic view of history when it comes to space exploration: We will go from airplanes to spaceships to conquering the galaxy. Where does that narrative come from and what do you see as some of the downsides of it?

Denning: I think it comes from two places. One is a specific version of history that’s quite progressivist and techno-philic. It’s a version of history that says we just increase in our energy consumption, we increase in our complexity and we increase our goodness. It all ratchets up together, and it’s a kind of Singularity argument.

But it’s combined with this fundamentally apocalyptic view that the current order of things will one day be superseded by another. That’s kind of a Judeo-Christian thing. And it’s sort of a funny coincidence that the future is up there [points skyward]. In many popular space narratives, the heavens and Heaven really swap out. It sounds pretty glib but it’s so frequently suggested that it’s hard to dismiss.

The idea is that longevity – immortality, in fact — the future and our destiny are all up there. And there’s simply no logical reason that should be the case. We have no evidence suggesting we can live anywhere for long periods of time other than on this planet. In fact, the evidence is steadily accumulating that it’s going to be really hard to do anything else.

We have problems with bone loss and blindness. Plus we have no evidence that we can reproduce safely in space. These are fairly big stumbling blocks and so this vision of a happy shiny future in space, it’s just so mythic.

Wired: Do you see that as changing, do you think people are coming to understand the problems with the previous narratives?

Denning: I think some are and this is one of the glories of humanity. But we’ll always have a tremendous diversity of opinion.

You’re always going to have these people who think Heaven and the heavens are interchangeable. And they’re going to be looking toward the stars for all kinds of religious or quasi-religious purposes.

Then you’re going to have the extension of the planetary protection mode of thinking. The people who are fundamentally thinking about environmentalism and stewardship and inequity. And then you’re going to have the people interested in militarization, and so on.

You’re always going to have this diversity of viewpoints, of motivations, and behaviors, and I mean: Welcome to Earth.

Wired: You write in a paper (.pdf) that someone in “the physical sciences might say ‘aha, here you have X which, by analogy, means that you must have Y, which means you have Z.’” On the other hand, “a scholar in the human sciences will often not venture past X.”

Denning: Right, we rarely get as far as Z. Most of the time, anthropology is not working as explicitly with a predictive model, it’s a much more descriptive model.

Wired: How do you see that difference between the physical and social sciences play out in the SETI discourse?

Denning: I think there’s been a lot of interesting discussion around the question of whether or not decipherment of an extraterrestrial signal would be possible.

Anthropologists tend to assume the answer is, basically, no. Unless you’re in direct contact, it would be very difficult to establish enough common language. Whereas the physicists and mathematicians tend to say, ‘Well all you need is math.’

And then the anthropologists laugh and it goes on. Maybe that tells you more about the various disciplines than about whether or not contact is possible, but that’s an entertaining and interesting problem.

Wired: What do anthropologists say when they look at the enterprise of SETI? That is, what does it say about us as humans that we are searching for others like ourselves in the universe?

Denning: It’s an interesting question and you can look at it in different ways. In one sense, it’s just the extension of a long tradition of thinking about what might be out there, which has just gone through a new technological manifestation.

Some people ask me: When did we first start thinking that there might be extraterrestrial life? And my reply is: When did we start thinking that there might not be? The sky has always been very busy, and the default position has always been that it’s populated. That doesn’t mean anything but that ideological substrate has always been there.

Only 200 years ago, we thought there could be people on the moon. Then, we got a good look at the moon and saw, well there’s no Lunarians there. And then there were the Martians — Lowell and all that — and it wasn’t very long ago, less than 100 years ago. As our range of vision keeps on moving outwards, the aliens keep on moving outwards too. And that’s one way you can look at SETI; it’s the logical trajectory of an idea that’s always been around.

And, of course, you can look at it within a religious framework. Our 20th century western culture includes Christianity and beings populating the Heavens. But anthropologically speaking, SETI also could be seen as being a reaction to the collapse of traditional religion.

In a universe where you’re no longer expecting God to provide the order, we are forced to ask: where is the order? Where’s the sense to it all and what are we then a part of?

Image: Diana Goss

MIT Predicts That World Economy Will Collapse By 2030 (POPSCI)

By Rebecca Boyle – Posted 04.05.2012 at 4:30 pm

Crowds and Haze in Shanghai Jeremy Vandel via Flickr

Forty years after its initial publication, a study called The Limits to Growth is looking depressingly prescient. Commissioned by an international think tank called the Club of Rome, the 1972 report found that if civilization continued on its path toward increasing consumption, the global economy would collapse by 2030. Population losses would ensue, and things would generally fall apart.

The study was — and remains — nothing if not controversial, with economists doubting its predictions and decrying the notion of imposing limits on economic growth. Australian researcher Graham Turner has examined its assumptions in great detail during the past several years, and apparently his latest research falls in line with the report’s predictions, according to Smithsonian Magazine. The world is on track for disaster, the magazine says.

The study, initially completed at MIT, relied on several computer models of economic trends and estimated that if things didn’t change much, and humans continued to consume natural resources apace, the world would run out of them at some point. Oil would peak (some argue it already has) before dropping down the other side of the bell curve, yet demand for food and services would only continue to rise. Turner says real-world data from 1970 to 2000 tracks with the study’s draconian predictions: “There is a very clear warning bell being rung here. We are not on a sustainable trajectory,” he tells Smithsonian.

Is this impossible to fix? No, according to both Turner and the original study. If governments enact stricter policies and technologies can be improved to reduce our environmental footprint, economic growth doesn’t have to become a market white dwarf, marching toward inevitable implosion. But just how to do that is another thing entirely.

[Smithsonian]

The languages of psychosis (Revista Fapesp)

A mathematical approach highlights the differences between the speech of people with mania and people with schizophrenia

CARLOS FIORAVANTI | Issue 194 – April 2012

How the study was done: interviewees recounted a dream, and the interviewer converted the most important words into nodes and the sentences into arrows in order to examine the structure of the language

For psychiatrists, and for most people, it is relatively easy to distinguish a person with psychosis from someone with no previously diagnosed mental disorder: those in the first group report delusions and hallucinations and sometimes present themselves as messiahs who will save the world. Distinguishing the two types of psychosis – mania and schizophrenia – is not so simple, however, and demands a good deal of personal experience, knowledge and intuition from specialists. A mathematical approach developed at the Brain Institute of the Universidade Federal do Rio Grande do Norte (UFRN) may make this differentiation – essential for choosing the most appropriate treatment for each illness – easier, by quantitatively assessing the differences in the verbal language structures used by people with mania or schizophrenia.

The analysis strategy – based on graph theory, representing words as nodes and the sequence between them within sentences as arrows – indicated that people with mania are much more verbose and repetitive than people with schizophrenia, who are generally laconic and focused on a single subject, without letting their thoughts wander. “Recurrence is a hallmark of the speech of patients with mania, who tell the same thing three or four times, whereas patients with schizophrenia say objectively what they have to say, without digressing, and their speech is poor in meanings,” says psychiatrist Natália Mota, a researcher at the institute. “In each group,” says Sidarta Ribeiro, the institute’s director, “the number of words, the structure of the language and other indicators are completely distinct.”

They believe they have taken the first steps toward an objective way of differentiating the two forms of psychosis, much as a blood count is used to attest to an infectious disease – provided that the next tests, with a larger sample of participants, reinforce the consistency of the approach and that physicians agree to work with an assistant of this kind. The comparative tests described in an article recently published in the journal PLoS One indicated that the new approach yields diagnostic accuracy rates on the order of 93%, whereas the psychometric scales in use today, based on symptom-assessment questionnaires, reach only 67%. “They are complementary methods,” says Natália. “Psychometric scales and physicians’ experience remain indispensable.”

“The result is quite simple, even for someone who does not understand mathematics,” says physicist Mauro Copelli, of the Universidade Federal de Pernambuco (UFPE), who took part in the work. The speech of people with mania appears as a tangle of points and lines, while that of people with schizophrenia appears as a nearly straight line with few points. Graph theory, which led to these diagrams, has been used for centuries to examine, for example, the routes by which a traveler could visit all the cities of a region. More recently, it has served to optimize air traffic, treating airports as a set of points, or nodes, connected to one another by the aircraft flying between them.

“The first time I ran the graph program, the differences in language leapt out,” Natália recounts. In 2007, on finishing medical school and beginning her psychiatry residency at the UFRN hospital, Natália noticed that many differential diagnoses of mania and schizophrenia depended on physicians’ personal experience and subjective judgment – those who worked more with schizophrenia patients tended to find more cases of schizophrenia and fewer of mania – and there was often no consensus. It was already known that people with mania talk more and stray from the central topic much more easily than people with schizophrenia, but that seemed too generic to her.

At a scientific conference in Fortaleza in 2008 she talked with Copelli, who was already collaborating with Ribeiro and who encouraged her to work with graphs. At first she resisted, because of her limited familiarity with mathematics, but the new theory soon struck her as simple and practical.

To carry the work forward, she recorded and, with the help of Nathália Lemos and Ana Cardina Pieretti, transcribed interviews with 24 people (eight with mania, eight with schizophrenia and eight with no diagnosed mental disorder), each of whom was asked to recount a dream; any comment outside that theme was considered a flight of imagination, quite common among people with mania.

“Already in the transcripts, the accounts of the patients with mania were clearly longer than those of the patients with schizophrenia,” she says. She then removed less important elements such as articles and prepositions, divided each sentence into subject, verb and objects, represented as points or nodes, with the sequence between them in the sentence represented as arrows joining two nodes, and flagged the sentences that did not refer to the central theme of the account – the recent dream she had asked the interviewees to recount – since those marked a digression of thought, common among people with mania.

A graph program downloaded free from the internet indicated the features relevant for analysis – or attributes – and represented the main differences in speech among the participants, such as the number of nodes, the extent and density of the connections between nodes, recurrence, verbosity (or logorrhea) and deviation from the central topic. “It is super simple,” Natália assures. In validating and analyzing the results she also counted on the collaboration of Osame Kinouchi, of the Universidade de São Paulo (USP) in Ribeirão Preto, and Guillermo Cecchi, of IBM’s Computational Biology Center in the United States.
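As an illustration of the bookkeeping described above (the word lists below are invented and the study used its own tools, so this is only a sketch): words become nodes and word-to-word transitions become directed edges, which can be done in a few lines with the networkx library.

# Minimal speech-graph sketch: words are nodes, consecutive words are directed edges;
# node counts, edge counts and recurrence (repeated edges) are then compared.
# The two "reports" below are invented examples, not data from the study.
import networkx as nx

def speech_graph(words):
    g = nx.MultiDiGraph()
    g.add_nodes_from(words)
    g.add_edges_from(zip(words, words[1:]))   # arrow from each word to the next
    return g

mania_like = "dream house dog house dog street house dream dog street dream".split()
schizophrenia_like = "dream house dog street".split()

for label, words in [("mania-like", mania_like), ("schizophrenia-like", schizophrenia_like)]:
    g = speech_graph(words)
    repeated = g.number_of_edges() - nx.DiGraph(g).number_of_edges()   # recurring transitions
    print(label, "nodes:", g.number_of_nodes(), "edges:", g.number_of_edges(),
          "repeated edges:", repeated)

The tangled, repetitive graph corresponds to the mania-like pattern described in the article, the short chain to the schizophrenia-like one.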

The result: people with mania scored higher than people with schizophrenia on almost all the items evaluated. “The logorrhea typical of patients with mania results not only from an excess of words but from speech that keeps returning to the same topic, in comparison with the schizophrenia group,” she observed. Curiously, the participants in the control group, with no diagnosed mental disorder, presented discourse structures of two types, at times redundant like the participants with mania, at times lean like those with schizophrenia, reflecting differences in their personalities or in their motivation, at that moment, to talk more or less. “Pathology shapes speech; that is nothing new,” she says. “Psychiatrists are trained to recognize these differences, but however experienced they are, they can hardly say that a mania patient’s recurrence is 28% lower.”

“The institute’s interdisciplinary environment was essential for carrying out this study, because every day I was exchanging ideas with people from other fields. Nivaldo Vasconcelos, a computer engineer, helped me a lot,” she says. The Brain Institute, in operation since 2007, currently has 13 professors, 22 undergraduate students, 42 graduate students, 8 postdoctoral researchers and 30 technicians. “Having overcome the initial difficulties, we managed to build a group of young, talented researchers,” Ribeiro celebrates. “The house we are in now has a large garden, and many nights we stay there until two or three in the morning, talking about science and drinking chimarrão.”

Scientific article
MOTA, N. B. et al. Speech graphs provide a quantitative measure of thought disorder in psychosis. PLoS ONE (in press).

The ‘perfect chaos’ of π (The Guardian)

One of the most important numbers is irrational

GRRLSCIENTIST, by The Guardian

π has fascinated mathematicians, engineers and other people for centuries. It is a mathematical constant, the ratio of a circle’s circumference (C) to its diameter (d): π = C/d.

This also explains why and how this number got its name: the lowercase Greek letter π was first adopted in 1706 as an abbreviation for this number because it is the first letter of the Greek word for “perimeter”, specifically of a circle. The symbol is convenient because π is an irrational number, meaning that it cannot be written as a ratio a/b of two integers: its decimal expansion never terminates and never settles into an infinitely repeating pattern.

Even though we know that the decimal expansion of π begins approximately 3.14159, we do not know all of its digits: as of October 2011, π had been computed to more than 10 trillion decimal places, and the occurrence of those digits appears to be statistically random. It is also widely conjectured, though not proven, that any digit sequence of finite length occurs somewhere in π — which is the premise of this fun little π search engine. For example, my 8-digit university student ID number pops up after 3.24 million decimal places. My mobile number pops up after 9.69 million decimal places, although it does not show up within the first 200 million digits of π when I add the country and area codes. Where do your digits pop up in π?
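A toy version of such a search can be put together with the mpmath library (an assumed dependency; the digit string below is only an example):

# Compute pi to 100,000 decimal places and look for a chosen digit string.
from mpmath import mp

mp.dps = 100_002                                  # decimal places of working precision
digits = mp.nstr(mp.pi, mp.dps).split(".")[1]     # fractional digits of pi as a string

target = "1903"                                   # any digit string you like (example only)
pos = digits.find(target)
print(f"'{target}' first appears {pos + 1} digits after the decimal point"
      if pos >= 0 else f"'{target}' not found in the first 100,000 digits")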

Many formulae in mathematics, science, and engineering involve π, which makes it one of the most important mathematical constants. But who first rigorously calculated the value for this irrational number and how was it done? This interesting video explores those questions in more detail:

Those of you who enjoy music probably already know that there’s a song about π by the amazing British singer and songwriter, Kate Bush, where she sings its digits.

New Understanding to Past Global Warming Events: Hyperthermal Events May Be Triggered by Warming (Science Daily)

These geological deposits make the Bighorn Basin area of Wyoming ideal for studying the PETM. (Credit: Aaron Diefendorf)

ScienceDaily (Apr. 2, 2012) — A series of global warming events called hyperthermals that occurred more than 50 million years ago had a similar origin to a much larger hyperthermal of the period, the Palaeocene-Eocene Thermal Maximum (PETM), new research has found. The findings, published in Nature Geoscience online on April 1, 2012, represent a breakthrough in understanding the major “burp” of carbon, equivalent to burning the entire reservoir of fossil fuels on Earth, that occurred during the PETM.

“As geologists, it unnerves us that we don’t know where this huge amount of carbon released in the PETM comes from,” says Will Clyde, associate professor of Earth sciences at the University of New Hampshire and a co-author on the paper. “This is the first breakthrough we’ve had in a long time. It gives us a new understanding of the PETM.” The work confirms that the PETM was not a unique event – the result, perhaps, of a meteorite strike – but a natural part of Earth’s carbon cycle.

Working in the Bighorn Basin region of Wyoming, a 100-mile-wide area with a semi-arid climate and stratified rocks that make it ideal for studying the PETM, Clyde and lead author Hemmo Abels of Utrecht University in the Netherlands found the first evidence of the smaller hyperthermal events on land. Previously, the only evidence of such events came from marine records.

“By finding these smaller hyperthermal events in continental records, it secures their status as global events, not just an ocean process. It means they are atmospheric events,” Clyde says.

Their findings confirm that the carbon released during the PETM had the same origin as the carbon released during the era’s smaller hyperthermals. In addition, the ratio of warming to carbon release is similar across the PETM and the other hyperthermals, which the authors interpret as an indication of a common mechanism of carbon release during all hyperthermals, including the PETM.

“It points toward the fact that we’re dealing with the same source of carbon,” Clyde says.

Working in two areas of the Bighorn Basin just east of Yellowstone National Park – Gilmore Hill and Upper Deer Creek – Clyde and Abels sampled rock and soil to measure carbon isotope records. They then compared these continental records of carbon release to equivalent marine records already in existence.

During the PETM, temperatures rose between five and seven degrees Celsius in approximately 10,000 years — “a geological instant,” Clyde calls it. This rise in temperature coincided exactly with a massive global change in mammals, as land bridges opened up connecting the continents. Prior to the PETM, North America had no primates, ancient horses, or split-hoofed mammals like deer or cows.

Scientists look to the PETM for clues about the current warming of Earth, although Clyde cautions that “Earth 50 million years ago was very different than it is today, so it’s not a perfect analog.” While scientists still don’t fully understand the causes of these hyperthermal events, “they seem to be triggered by warming,” Clyde says. It’s possible, he says, that less dramatic warming events destabilized these large amounts of carbon, releasing them into the atmosphere where they, in turn, warmed the Earth even more.

“This work indicates that there is some part of the carbon cycle that we don’t understand, and it could accentuate global warming,” Clyde says.

The Social Sciences’ ‘Physics Envy’ (N.Y.Times)

OPINION – GRAY MATTER

Jessica Hagy

By KEVIN A. CLARKE AND DAVID M. PRIMO

Published: April 01, 2012

HOW scientific are the social sciences?

Economists, political scientists and sociologists have long suffered from an academic inferiority complex: physics envy. They often feel that their disciplines should be on a par with the “real” sciences and self-consciously model their work on them, using language (“theory,” “experiment,” “law”) evocative of physics and chemistry.

This might seem like a worthy aspiration. Many social scientists contend that science has a method, and if you want to be scientific, you should adopt it. The method requires you to devise a theoretical model, deduce a testable hypothesis from the model and then test the hypothesis against the world. If the hypothesis is confirmed, the theoretical model holds; if the hypothesis is not confirmed, the theoretical model does not hold. If your discipline does not operate by this method – known as hypothetico-deductivism – then in the minds of many, it’s not scientific.

Such reasoning dominates the social sciences today. Over the last decade, the National Science Foundation has spent many millions of dollars supporting an initiative called Empirical Implications of Theoretical Models, which espouses the importance of hypothetico-deductivism in political science research. For a time, The American Journal of Political Science explicitly refused to review theoretical models that weren’t tested. In some of our own published work, we have invoked the language of model testing, yielding to the pressure of this way of thinking.

But we believe that this way of thinking is badly mistaken and detrimental to social research. For the sake of everyone who stands to gain from a better knowledge of politics, economics and society, the social sciences need to overcome their inferiority complex, reject hypothetico-deductivism and embrace the fact that they are mature disciplines with no need to emulate other sciences.

The ideal of hypothetico-deductivism is flawed for many reasons. For one thing, it’s not even a good description of how the “hard” sciences work. It’s a high school textbook version of science, with everything messy and chaotic about scientific inquiry safely ignored.

A more important criticism is that theoretical models can be of great value even if they are never supported by empirical testing. In the 1950s, for instance, the economist Anthony Downs offered an elegant explanation for why rival political parties might adopt identical platforms during an election campaign. His model relied on the same strategic logic that explains why two competing gas stations or fast-food restaurants locate across the street from each other – if you don’t move to a central location but your opponent does, your opponent will nab those voters (customers). The best move is for competitors to mimic each other.

This framework has proven useful to generations of political scientists even though Mr. Downs did not empirically test it and despite the fact that its main prediction, that candidates will take identical positions in elections, is clearly false. The model offered insight into why candidates move toward the center in competitive elections, and it proved easily adaptable to studying other aspects of candidate strategies. But Mr. Downs would have had a hard time publishing this model today.
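As a toy illustration of that strategic logic (my own sketch, not anything from the article or from Downs’s book), one can simulate two office-seekers on a line of uniformly distributed voters, each repeatedly moving to the position that maximizes its vote share against the other; both converge on the center:

```python
# Toy Hotelling/Downs simulation: two competitors on [0, 1], with voters
# distributed uniformly and each voter picking the nearer competitor.
def vote_share(a: float, b: float) -> float:
    """Vote share of the competitor at position `a` against one at `b`."""
    if a == b:
        return 0.5                       # identical positions split the electorate
    midpoint = (a + b) / 2               # voters left of the midpoint prefer the leftmost
    return midpoint if a < b else 1 - midpoint

def best_response(opponent: float, grid: int = 101) -> float:
    """Best position against a fixed opponent, searched on a grid."""
    positions = [i / (grid - 1) for i in range(grid)]
    return max(positions, key=lambda p: vote_share(p, opponent))

a, b = 0.1, 0.9                          # start at the extremes
for _ in range(30):                      # alternate best responses
    a = best_response(b)
    b = best_response(a)
print(a, b)                              # -> 0.5 0.5: both end at the median voter
```

The same leapfrogging dynamic drives the gas-station analogy: undercutting toward the center always steals customers, so the only stable outcome is both competitors at the median.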

Or consider the famous “impossibility theorem,” developed by the economist Kenneth Arrow, which shows that no single voting system can simultaneously satisfy several important principles of fairness. There is no need to test this model with data – in fact, there is no way to test it – and yet the result offers policy makers a powerful lesson: there are unavoidable trade-offs in the design of voting systems.
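For readers who want the precise content of that claim, here is a compact textbook statement of the theorem (a standard formulation, not taken from the op-ed):

```latex
% Arrow's impossibility theorem, standard compact statement.
\begin{theorem}[Arrow, 1951]
Let $A$ be a set of at least three alternatives, and let
$F \colon \mathcal{L}(A)^n \to \mathcal{L}(A)$ map the strict preference
rankings of $n \ge 2$ voters to a social ranking. If $F$ satisfies
(i) \emph{unanimity}: whenever every voter ranks $x$ above $y$, so does
society; and (ii) \emph{independence of irrelevant alternatives}: the
social ranking of $x$ versus $y$ depends only on how the voters rank
$x$ versus $y$; then $F$ is a \emph{dictatorship}: there is a voter $i$
whose ranking always coincides with the social ranking.
\end{theorem}
```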

To borrow a metaphor from the philosopher of science Ronald Giere, theories are like maps: the test of a map lies not in arbitrarily checking random points but in whether people find it useful to get somewhere.

Likewise, the analysis of empirical data can be valuable even in the absence of a grand theoretical model. Did the welfare reform championed by Bill Clinton in the 1990s reduce poverty? Are teenage employees adversely affected by increases in the minimum wage? Do voter identification laws disproportionately reduce turnout among the poor and minorities? Answering such questions about the effects of public policies does not require sweeping theoretical claims, just careful attention to the data.

Unfortunately, the belief that every theory must have its empirical support (and vice versa) now constrains the kinds of social science projects that are undertaken, alters the trajectory of academic careers and drives graduate training. Rather than attempt to imitate the hard sciences, social scientists would be better off doing what they do best: thinking deeply about what prompts human beings to behave the way they do.

Kevin A. Clarke and David M. Primo, associate professors of political science at the University of Rochester, are the authors of “A Model Discipline: Political Science and the Logic of Representations.”

Conservatives’ Trust in Science at All-Time Low (Slate/L.A.Times)

A new study suggests a growing partisan divide as science plays an increasing role in policy debates. Posted Thursday, March 29, 2012, at 1:29 PM ET

A new report suggests the number of conservatives who trust science is at an all-time low. Photo by Aude Guerrucci-Pool/Getty Images.

This may explain some of the rhetoric we’ve been hearing in GOP stump speeches of late: The number of conservatives who say they have a “great deal” of trust in science has fallen to 35 percent, down 28 points from the mid-1970s, according to a new academic paper.

The study, which was published Thursday in the American Sociological Review, found that liberal and moderate attitudes toward the topic have remained mostly unchanged since national pollsters first began posing the question in 1974, back when roughly half of all liberals and conservatives expressed significant trust in science.

The peer-reviewed research paper explains: “These results are quite profound because they imply that conservative discontent with science was not attributable to the uneducated but to rising distrust among educated conservatives.”

The man behind the study, UNC Chapel Hill’s Gordon Gauchat, says the change comes as conservatives have rebelled against the so-called “elite.”

“It kind of began with the loss of Barry Goldwater and the construction of Fox News and all these [conservative] think tanks. The perception among conservatives is that they’re at a disadvantage, a minority,” Gauchat explained in an interview with U.S. News. “It’s not surprising that the conservative subculture would challenge what’s viewed as the dominant knowledge production groups in society—science and the media.”

The sociologist suggested that the shift is also likely tied to science’s changing role in the national dialogue. In the middle of the 20th century, science was tied closely with NASA and the Department of Defense, but now it more frequently comes up when the conversation shifts to the environment and government regulations.

“Science has become autonomous from the government—it develops knowledge that helps regulate policy, and in the case of the EPA, it develops policy,” he said. “Science is charged with what religion used to be charged with—answering questions about who we are and what we came from, what the world is about. We’re using it in American society to weigh in on political debates, and people are coming down on a specific side.”

You can read more of the interview at U.S. News, find a more detailed recap of the study over at the Los Angeles Times, or check out the full paper here.

Conservatives’ trust in science has declined sharply

Since 1974, when conservatives had the highest trust in science, their confidence has dropped precipitously, an American Sociological Review study concludes.

By John Hoeffel, Los Angeles Times, March 29, 2012

As the Republican presidential race has shown, the conservatives who dominate the primaries are deeply skeptical of science — making Newt Gingrich, for one, regret he ever settled onto a couch with Nancy Pelosi to chat about global warming.

A study released Thursday in the American Sociological Review concludes that trust in science among conservatives and frequent churchgoers has declined precipitously since 1974, when a national survey first asked people how much confidence they had in the scientific community. At that time, conservatives had the highest level of trust in scientists.

Confidence in scientists has declined the most among the most educated conservatives, the peer-reviewed research paper found, concluding: “These results are quite profound because they imply that conservative discontent with science was not attributable to the uneducated but to rising distrust among educated conservatives.”

“That’s a surprising finding,” said the report’s author, Gordon Gauchat, in an interview. He has a doctorate in sociology and is a postdoctoral fellow at the University of North Carolina at Chapel Hill.

To highlight the dramatic impact conservative views of science have had on public opinion, Gauchat pointed to results from Gallup, which found in 2010 that just 30% of conservatives believed the Earth was warming as a result of greenhouse gases, versus 50% two years earlier. In contrast, the poll showed almost no change in the opinion of liberals, with 74% believing in global warming in 2010 versus 72% in 2008.

Gauchat suggested that the most educated conservatives are most acquainted with views that question the credibility of scientists and their conclusions. “I think those people are most fluent with the conservative ideology,” he said. “They have stronger ideological dispositions than people who are less educated.”

Chris Mooney, who wrote “The Republican War on Science,” which Gauchat cites, agreed. “If you think of the reasons behind this as nature versus nurture, all this would be nurture, that it was the product of the conservative movement,” he said. “I think being educated is a proxy for people paying attention to politics, and when they do, they tune in to Fox News and blogs.”

Gauchat also noted the conservative movement had expanded substantially in power and influence, particularly during the presidencies of Ronald Reagan and George W. Bush, creating an extensive apparatus of think tanks and media outlets. “There’s a whole enterprise,” he said.

Science has also increasingly come under fire, Gauchat said, because its cultural authority and its impact on government have grown. For years, he said, the role science played was mostly behind the scenes, creating better military equipment and sending rockets into space.

But with the emergence of the Environmental Protection Agency, for example, scientists began to play a crucial and visible role in developing regulations.

Jim DiPeso, policy director of Republicans for Environmental Protection, has been trying to move his party to the center on issues such as climate change, but he said many Republicans were wary of science because they believed it was “serving the agenda of the regulatory state.”

“There has been more and more resistance to accepting scientific conclusions,” he said. “There is concern about what those conclusions could lead to in terms of bigger government and more onerous regulation.”

The study also found that Americans with moderate political views have long been the most distrustful of scientists, but that conservatives are now likely to outstrip them.

Moderates are typically less educated than either liberals or conservatives, Gauchat said. “These folks are just generally alienated from science,” he said, describing them as the “least engaged and least knowledgeable about basic scientific facts.”

The study was based on results from the General Social Survey, administered between 1974 and 2010 by the National Opinion Research Center at the University of Chicago.

Gauchat, who has been studying public attitudes toward science for about eight years, has applied for a National Science Foundation grant to investigate why trust in science has waned. He plans to ask a battery of questions, including some focused on scientific controversies, such as those over vaccines and genetically modified foods, to try to understand what makes conservatives and moderates so distrustful.

“It’s not one simple thing,” he said.

john.hoeffel@latimes.com

Neela Banerjee in the Washington bureau contributed to this report.

Why The Future Is Better Than You Think (Reason.com)

Sharif Christopher Matar | March 15, 2012

Can a Masai Warrior in Africa today communicate better than Ronald Reagan could? If he’s on a cell phone, Peter Diamandis says he can.

Peter Diamandis is the founder and chairman of the X Prize Foundation, which offers big cash prizes “to bring about radical breakthroughs for the benefit of humanity.” Reason’s Tim Cavanaugh sat down to talk with Peter about his new book Abundance and why he thinks we live in an “incredible time” that hardly anyone recognizes. Peter thinks that powerful human forces, combined with technological advancements, are transforming the world for the better.

“The challenge is that the rate of innovation is so fast…” Peter says, “the government can’t keep up with it.” If the government tries to play “catch up” with regulations and policy, the technology will just go overseas. Certain innovations mean that “food, water, housing, health, education is getting better and better.” Peter “hopes we are not going to be in a situation where entrenched interests are preventing the consumer from having better health care.”

Filmed by Sharif Matar and Tracy Oppenheimer. Edited by Sharif Matar

Americans Listening to Politicians, Not Climate Scientists (Ars Technica/Wired)

By Scott K. Johnson, Ars Technica
February 27, 2012

US public opinion about climate change has been riding a roller coaster over the past decade. After signs of growing acceptance and emphasis around 2006 and 2007, a precipitous decline brought us back to where we started, with fully a quarter of the public not even thinking that the planet has warmed up. It’s not shocking that concerns about climate change would take a back seat to the economic recession, but that doesn’t explain why some are skeptical that global warming is even real.

Since economic turmoil does not extend to past temperature measurements, it seems clear that public acceptance of the data depends at least partly on something other than the data itself. So the natural question is — what’s driving public opinion? Why the big shifts? The answer to that question may hold the key to the US’ response to the changing climate.

A recent study published in Climatic Change evaluates the impact of several potential opinion drivers: extreme weather events, public access to scientific information, media coverage, advocacy efforts, and the influence of political leaders. These are compared to a compilation of 74 surveys performed by six different organizations. The polls took place between 2002 and 2010, and provide a total of 84,000 responses. The researchers used all the questions that asked respondents to rate their concern about climate change to calculate a “climate change threat index” that could be tracked through time.

For extreme weather events, the researchers used NOAA’s Climate Extremes Index, which includes things like unusually high temperatures and precipitation events, as well as severe droughts. To evaluate public access to scientific information, they tracked the number of climate change papers published in Science, major assessments like the 2007 IPCC report, and climate change articles published in popular science magazines.

Similarly, media coverage was tracked with a simple count of stories appearing on broadcast evening news shows and in several leading periodicals. Advocacy was measured using content from a number of “major environmental” and “conservative magazines.” In addition, they captured the influence of Al Gore’s An Inconvenient Truth (a favorite target of climate contrarians) using the number of times it was mentioned in the New York Times.

Finally, they counted up congressional press releases, hearings, and votes on bills related to climate change. For comparison, they also looked at the influence of unemployment, GDP, oil prices, and the number of deaths associated with the wars in Iraq and Afghanistan.

The researchers compared each time series to their climate change threat index. They found no statistically significant correlation with extreme weather events, papers in Science (hardly shocking — when was the last time you found Science in the waiting room at the dentist’s?), or oil prices. There was a minor correlation with major scientific assessments.
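To make that comparison concrete, here is an illustrative sketch in Python; the data are synthetic and a plain Pearson correlation stands in for the paper’s more elaborate time-series modeling:

```python
# Illustrative screening of candidate opinion drivers against a quarterly
# "climate change threat index". All data here are synthetic; the actual
# study drew on 74 real surveys and used time-series regression.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
quarters = 36                                    # roughly 2002-2010

threat_index = rng.normal(50, 5, quarters)       # stand-in for the poll-based index
drivers = {
    "extreme_weather": rng.normal(0, 1, quarters),           # unrelated noise
    "science_papers": rng.normal(0, 1, quarters),            # unrelated noise
    "elite_cues": threat_index + rng.normal(0, 2, quarters), # built to correlate
}

for name, series in drivers.items():
    r, p = stats.pearsonr(threat_index, series)
    print(f"{name:16s} r = {r:+.2f}  p = {p:.3f}")
```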

While articles in popular science magazines and advocacy efforts (especially An Inconvenient Truth) appeared to have an effect, the impact of news media coverage arose chiefly because it transmits statements from political leaders, what the researchers refer to as “elite cues.” That’s where the meat of this story lies. Those elite cues were the most significant driver of public opinion, followed by economic factors.

The researchers note that around the time when public acceptance of climate change reached its peak, political bipartisanship on the subject also hit a high point. Republican Senator and (then) presidential candidate John McCain was pushing for climate legislation, and current presidential candidate Newt Gingrich filmed a commercial together with an unlikely partner — Democratic Congresswoman Nancy Pelosi — urging action.

And then things changed. The economy went pear-shaped and Republican rhetoric shifted into attack mode on climate science. Gingrich’s commercial with Pelosi offers one example — opposing candidates in the presidential race have used its mere existence as a weapon against him, and Gingrich has tried to distance himself, calling it “the dumbest thing I’ve done in the last four years.”

Flipping this around, it suggests that serious action on climate change depends on a healthy economy and bipartisan agreement among politicians. If that leaves you pondering a future connection between global warming legislation and icy conditions in hell, the cooperation in 2007 indicates it isn’t totally unthinkable.

In addition, recent polling has shown that acceptance of climate change is, once again, climbing among those who identify as moderate Republicans. It’s unclear how to interpret that in terms of this study’s conclusions. Is economic optimism having an impact, have Republican presidential candidates alienated moderates in the party, or is something totally different responsible?

While it’s certainly not surprising, it’s discouraging to see how little effect scientific outreach efforts and reports have had on public opinion. Even on simple questions like “Is there solid evidence that the Earth has warmed?” — it’s politicians that are driving public opinion, not scientists or the data they produce.

Image: Hurricane Ike in 2008. (NOAA)

What the World Is Made Of (Discover Magazine)

by Sean Carroll

I know you’re all following the Minute Physics videos (that we talked about here), but just in case my knowledge is somehow fallible you really should start following them. After taking care of why stones are round, and why there is no pink light, Henry Reich is now explaining the fundamental nature of our everyday world: quantum field theory and the Standard Model. It’s a multi-part series, since some things deserve more than a minute, dammit.

Two parts have been posted so far. The first is just an intro, pointing out something we’ve already heard: the Standard Model of particle physics describes all the world we experience in our everyday lives.

The second one, just up, tackles quantum field theory and the Pauli exclusion principle, of which we’ve been recently speaking. (Admittedly it’s two minutes long, but these are big topics!)

The world is made of fields, which appear to us as particles when we look at them. Something everyone should know.

The Inside Story on Climate Scientists Under Siege (Wired/The Guardian)

By Suzanne Goldenberg, The Guardian
February 17, 2012 |

It is almost possible to dismiss Michael Mann’s account of a vast conspiracy by the fossil fuel industry to harass scientists and befuddle the public. His story of that campaign, and his own journey from naive computer geek to battle-hardened climate ninja, seems overwrought, maybe even paranoid.

But now comes the unauthorized release of documents showing how a libertarian thinktank, the Heartland Institute, which has in the past been supported by Exxon, spent millions on lavish conferences attacking scientists and concocting projects to counter science teaching for kindergarteners.

Mann’s story of what he calls the climate wars, the fight by powerful entrenched interests to undermine and twist the science meant to guide government policy, starts to seem pretty much on the money. He’s telling it in a book out on March 6, The Hockey Stick and the Climate Wars: Dispatches From the Front Lines.

“They see scientists like me who are trying to communicate the potential dangers of continued fossil fuel burning to the public as a threat. That means we are subject to attacks, some of them quite personal, some of them dishonest,” Mann said in an interview conducted in and around State College, home of Pennsylvania State University, where he is a professor.

It’s a brilliantly sunny day, and the light snowfall of the evening before is rapidly melting.

Mann, who seems fairly relaxed, has just spoken to a capacity crowd at the university, uniformly respectful and supportive.

It’s hard to square the surroundings with the description in the book of how an entire academic discipline has been made to feel under siege, but Mann insists that it is a given.

“It is now part of the job description if you are going to be a scientist working in a socially relevant area like human-caused climate change,” he said.

He should know. For most of his professional life he has been at the center of those wars, thanks to a paper he published with colleagues in the late 1990s showing a sharp upward movement in global temperatures in the last half of the 20th century. The graph became known as the “hockey stick”.

If the graph was the stick, then its publication made Mann the puck. Though other prominent scientists, such as Nasa’s James Hansen and more recently Texas Tech University’s Katharine Hayhoe, have also been targeted by contrarian bloggers and thinktanks demanding their institutions turn over their email record, it’s Mann who’s been the favorite target.

He has been regularly vilified on Fox News and contrarian blogs, and by Republican members of Congress. The attorney general of Virginia has been fighting in the courts to get access to Mann’s email from his earlier work at the University of Virginia. And then there is the high volume of hate mail, and the threats to him and his family.

“A day doesn’t go by when I don’t have to fend off some attack, some specious criticism or personal attack,” he said. “Literally a day doesn’t go by where I don’t have to deal with some of the nastiness that comes out of a campaign that tries to discredit me, and thereby in the view of our detractors to discredit the entire science of climate change.”

By now he and other climate scientists have been in the trenches longer than the U.S. army has been in Afghanistan.

And Mann has proved a willing combatant. He has not gone so far as Hansen, who has been arrested at the White House protesting against tar sands oil and in West Virginia protesting against coal mining. But he spends a significant part of his working life now blogging and tweeting in his efforts to engage with the public – and fending off attacks.

On the eve of his talk at Penn State, a coal industry lobby group calling itself the Common Sense Movement/Secure Energy for America put up a Facebook page demanding the university disinvite its own professor from speaking, and denouncing Mann as a “disgraced academic” pursuing a radical environmental agenda. The university refused, and Common Sense appears to have since dismantled the Facebook page.

But Mann’s attackers were merely regrouping. A hostile blogger published a link to Mann’s Amazon page, and his opponents swung into action, denouncing the book as a “fairy tale” and climate change as “the greatest scam in human history.”

It was not the life Mann envisaged when he began work on his postgraduate degree at Yale. All Mann knew then was that he wanted to work on big problems that resonated outside academia. At heart, he said, he was like one of the amiable nerds on the television show The Big Bang Theory.

“At that time I wanted nothing more than just to bury my head in my computer and study data and write papers and write programs,” he said. “That is the way I was raised. That is the culture I came from.”

What happened instead was that the “hockey stick” graph, because it so clearly represented what had happened to the climate over the course of hundreds of years, itself became a proxy in the climate wars. (Mann’s reconstruction of temperatures over the last millennium itself used proxy records from tree rings and coral).

“I think because the hockey stick became an icon, it’s been subject to the fiercest of attacks really in the whole science of climate change,” he said.

The U.N.’s Intergovernmental Panel on Climate Change produced a poster-sized graph for the launch of its climate change report in 2001.

Opponents of climate science began accusing Mann of overlooking important data or even manipulating the records. None of the allegations were ever found to have substance. The hockey stick would eventually be confirmed by more than 10 other studies.

Mann, like other scientists, was just not equipped to deal with the media barrage. “It took the scientific community some time I think to realize that the scientific community is in a street fight with climate change deniers and they are not playing by the rules of engagement of science. The scientific community needed some time to wake up to that.”

By 2005, when Hurricane Katrina drew Americans’ attention to the connection between climate change and coastal flooding, scientists were getting better at making their case to the public. George Bush, whose White House in 2003 deleted Mann’s hockey stick graph from an environmental report, began talking about the need for biofuels. Then Barack Obama was elected on a promise to save a planet in peril.

But as Mann lays out in the book, the campaign to discredit climate change continued to operate, largely below the radar until November 2009 when a huge cache of email from the University of East Anglia’s Climatic Research Unit was released online without authorization.

Right-wing media and bloggers used the emails to discredit an entire body of climate science. They got an extra boost when an embarrassing error about melting of Himalayan glaciers appeared in the U.N.’s IPCC report.

Mann now admits the climate community took far too long to realize the extent of the public relations debacle. Aside from the glacier error, the science remained sound. But Mann said now: “There may have been an undue amount of complacency among many in the scientific community.”

Mann, who had been at the center of so many debates in America, was at the heart of the East Anglia emails battle too.

Though he has been cleared of any wrongdoing, Mann does not always come off well in those highly selective exchanges of email released by the hackers. In some of the correspondence with fellow scientists, he is abrupt, dismissive of some critics. In our time at State College, he mentions more than once how climate scientists are a “cantankerous” bunch. He has zero patience, for example, for the polite label “climate skeptic” for the network of bloggers and talking heads who try to discredit climate change.

“When it comes to climate change, true skepticism is two-sided. One-sided skepticism is no skepticism at all,” he said. “I will call people who deny the science deniers … I guess I won’t be deterred by the fact that they don’t like the use of that term and no doubt that just endears me to them further.”

“It’s frustrating of course because a lot of us would like to get past this nonsensical debate and on to the real debate to be had about what to do,” he said.

But he said there are compensations in the support he gets from the public. He moves over to his computer to show off a web page: I ❤ climate scientists. He’s one of three featured scientists. “It only takes one thoughtful email of support to offset a thousand thoughtless attacks,” Mann said.

And although there are bad days, he still seems to believe he is on the winning side.

Across America, this is the third successive year of weird weather. The U.S. Department of Agriculture has just revised its plant hardiness map, reflecting warming trends. That is going to reinforce scientists’ efforts to cut through the disinformation campaign, Mann said.

“I think increasingly the campaign to deny the reality of climate change is going to come up against that brick wall of the evidence being so plain to people whether they are hunters, fishermen, gardeners,” he said.

And if that doesn’t work then Mann is going to fight to convince them.

“Whether I like it or not I am out there on the battlefield,” he said. But he believes the experiences of the last decade have made him, and other scientists, far better fighters.

“Those of us who have had to go through this are battle-hardened and hopefully the better for it,” he said. “I think you are now going to see the scientific community almost uniformly fighting back against this assault on science. I don’t know what’s going to happen in the future, but I do know that my fellow scientists and I are very ready to engage in this battle.”

Video: James West, The Climate Desk

Original story at The Guardian.

The Sick Planet (culturaebarbarie.org)

by Guy Debord

“Pollution” is in fashion today in exactly the same way that revolution is: it takes hold of the whole life of society, and it is represented illusorily in the spectacle. It is tedious chatter in a plethora of erroneous and mystifying writings and speeches, and in actual fact it has everyone by the throat. It puts itself on display everywhere as ideology, and it gains ground everywhere as a real process. These two antagonistic movements, the supreme stage of commodity production and the project of its total negation, each equally rich in contradictions within itself, grow together. They are the two sides through which a single historical moment manifests itself, long awaited and often foreseen under partial, inadequate figures: the impossibility of capitalism continuing to function.

The epoch that possesses all the technical means to alter the conditions of life on Earth is equally the epoch that, through the same separate technical and scientific development, possesses all the means of control and of mathematically indubitable forecasting needed to measure in advance exactly where the automatic growth of the alienated productive forces of class society is leading, and by what date: that is, to measure the rapid degradation of the very conditions of survival, in the most general and most trivial sense of the term.

While backward-looking imbeciles still hold forth on, and against, an aesthetic critique of all this, believing they show themselves lucid and modern by embracing their century, proclaiming that the motorway or Sarcelles has a beauty one ought to prefer to the discomfort of the “picturesque” old neighborhoods, or gravely observing that the population as a whole eats better despite nostalgia for good cooking, the problem of the degradation of the totality of the natural and human environment has already ceased entirely to pose itself on the plane of some supposed old-fashioned quality, aesthetic or otherwise, and has become radically the very problem of the material possibility of existence of a world that pursues such a movement. That impossibility has in fact already been perfectly demonstrated by the whole of separate scientific knowledge, which now debates only its due date and the palliatives that, if firmly applied, might superficially hold it in check. Such a science can only accompany the world that produced it, and that sustains it, toward destruction; but it is obliged to do so with open eyes. It thus shows, at a caricatural level, the uselessness of knowledge without use.

One can measure and extrapolate with excellent precision the rapid increase of the chemical pollution of the breathable atmosphere and of the water of rivers, lakes and even oceans; the irreversible accumulation of radioactivity from the peaceful development of nuclear energy; the effects of noise; the invasion of space by plastic products that may demand an eternity of universal dumping; the runaway birthrate; the senseless adulteration of foodstuffs; the urbanistic leprosy spreading ever further over what were once town and country; as well as mental illness (including the neurotic phobias and hallucinations that will soon be bound to multiply around the theme of pollution itself, whose alarming image is displayed everywhere) and suicide, whose rates of growth already intersect exactly with those of the construction of such an environment (to say nothing of atomic or bacteriological war, whose means stand poised like the sword of Damocles but evidently remain avoidable).

Thus, while the scale and even the reality of the “terrors of the Year 1000” remain a matter of controversy among historians, the terror of the Year 2000 is as patent as it is well founded; it is, as of now, a scientific certainty. Yet what is happening is nothing new in itself: it is only the necessary end of the old process. A society ever more sick, but ever more powerful, has everywhere concretely recreated the world as the environment and décor of its sickness: a sick planet. A society that has not yet become homogeneous, and that is no longer determined by itself but ever more by a part of itself that stands above it, has developed a movement of domination of nature that has not dominated itself. Capitalism has finally furnished proof, by its own movement, that it can no longer develop the productive forces; and this not quantitatively, as many believed they understood, but qualitatively.

For bourgeois thought, however, methodologically only the quantitative is serious, measurable, effective; the qualitative is merely the uncertain subjective or artistic decoration of the truly real, as weighed at its true weight. For dialectical thought, on the contrary, and thus for history and for the proletariat, the qualitative is the most decisive dimension of real development. This is what capitalism, and we, have ended up demonstrating.

The masters of society are now obliged to speak of pollution, both in order to combat it (since they live, after all, on the same planet as we do; this is the only sense in which one can admit that the development of capitalism has effectively realized a certain fusion of the classes) and in order to conceal it: for the simple truth of the present damages and risks is enough to constitute an immense factor of revolt, a materialist demand of the exploited just as vital as the nineteenth-century proletarians’ struggle for the possibility of eating. After the fundamental failure of all the reformisms of the past (each of which aspired to the definitive solution of the problem of classes), a new reformism is taking shape, obeying the same needs as its predecessors: to grease the machine and to open up new profit opportunities for leading-edge firms. The most modern sector of industry throws itself at the various palliatives to pollution as at a new market niche, all the more profitable in that a good share of the capital monopolized by the State is there to be employed and maneuvered. But if this new reformism is guaranteed in advance to fail, for exactly the same reasons as past reformisms, it differs radically from them in that it no longer has time on its side.

The development of production has up to now verified itself entirely as the realization of political economy: the development of misery, which has invaded and spoiled the very milieu of life. The society in which producers kill themselves with work and can only contemplate its result lets them plainly see, and breathe, the general result of alienated labor as a result of death. In the society of the overdeveloped economy everything has entered the sphere of economic goods, even spring water and the air of towns; that is to say, everything has become the economic evil, that “complete negation of man” which now attains its perfect material conclusion. The conflict between the modern productive forces and the relations of production, bourgeois or bureaucratic, of capitalist society has entered its final phase. The production of non-life has pursued its linear and cumulative process ever further; having crossed a final threshold in its progress, it now directly produces death.

The ultimate function of today’s developed economy, avowed and essential, throughout a world in which commodity-labor reigns and assures all power to its masters, is the production of jobs. Nothing could be further from the “progressive” ideas of the previous century [the nineteenth] about the possible reduction of human labor through the scientific and technical multiplication of productivity, which was supposed to assure, ever more easily, the satisfaction of needs previously recognized by all as real, without any fundamental alteration in the quality of the goods available. It is now in order to produce jobs, even in a countryside emptied of peasants, in other words in order to use human labor as alienated labor, as wage labor, that everything else is done; and it is thus that the foundations of the life of the species, at present more fragile even than the thinking of a Kennedy or a Brezhnev, are stupidly put at risk.

The old ocean is in itself indifferent to pollution; but history is not. History can be saved only by the abolition of commodity-labor. And never has historical consciousness so urgently needed to master its world, for the enemy at its door is no longer illusion but its own death.

When the poor masters of society, whose deplorable conclusion we now witness (far worse than all the condemnations that the most radical utopians could once hurl at it), must at present acknowledge that our environment has become social, that the management of everything has become a directly political matter, down to the grass of the fields and the possibility of drinking water, down to the possibility of sleeping without too many sleeping pills or washing without catching allergies, then one must also see that the old specialized politics has to acknowledge that it is completely finished.

It is finished in the supreme form of its voluntarism: the totalitarian bureaucratic power of the so-called socialist regimes, for the bureaucrats in power have not even shown themselves capable of managing the previous stage of the capitalist economy. If they pollute much less (the United States alone produces 50% of the world’s pollution), it is because they are much poorer. They can only, as in the case of China, by concentrating a disproportionate share of their budget of misery, buy the poor powers’ prestige share of pollution: a few discoveries and refinements in the techniques of thermonuclear war, or more exactly of its threatening spectacle. So much poverty, material and mental, sustained by so much terrorism, condemns the bureaucracies in power. And what condemns the most modernized bourgeois power is the unbearable result of so much effectively poisoned wealth. The so-called democratic management of capitalism, in whatever country, offers only its elections-dismissals, which, as has always been seen, never change anything in the whole and very little even in the details of a class society that imagined it could last indefinitely. They change nothing even now, when management itself goes mad and pretends to want, in order to dispose of certain secondary though urgent problems, a few vague directives from an alienated and cretinized electorate (U.S.A., Italy, England, France). Specialized observers have always noted, without troubling to explain it, the fact that the voter never changes his “opinion”: this is precisely because he is a voter, someone who assumes for a brief instant the abstract role designed precisely to prevent him from being himself and from changing (the mechanism has been demonstrated hundreds of times, by demystified political analysis as much as by the explanations of revolutionary psychoanalysis). The voter does not change even as the world changes ever more precipitously around him; as a voter he would not change even on the eve of the end of the world. Every representative system is essentially conservative, even though the conditions of existence of capitalist society have never been able to be conserved: they modify themselves without interruption, and ever faster, but the decision (which in the end is always the decision to let the process of capitalist production run its course) is left entirely to the specialists of publicity, whether they stand alone in the contest or compete with those who will do the same thing and indeed openly announce it. Yet the man who votes “freely” for the Gaullists or the P.C.F., just like the man who votes, under constraint and compulsion, for a Gomulka, is capable of showing what he truly is the following week, by taking part in a wildcat strike or an insurrection.

The self-proclaimed “struggle against pollution,” in its statist and legalistic aspect, will at first create new specializations, ministerial departments, posts, bureaucratic promotion. And its effectiveness will be entirely in keeping with such means. It can become a real will only by transforming the present productive system at its roots. And it can be firmly applied only from the moment when all its decisions, taken democratically and with full knowledge of the facts by the producers, are at every instant controlled and executed by the producers themselves (ships will infallibly go on discharging their oil into the sea, for example, so long as they are not placed under the authority of genuine sailors’ soviets). To decide and to execute all this, the producers must become adults: all of them must seize power.

The scientific optimism of the nineteenth century collapsed on three essential points. First, the claim to guarantee revolution as the happy resolution of existing conflicts (this was the Hegelian-leftist and Marxist illusion, the one least noticed in the bourgeois intelligentsia, but also the richest and, in the end, the least illusory). Second, the coherent vision of the universe, or even simply of matter. Third, the euphoric, linear feeling about the development of the productive forces. If we master the first point, we will have resolved the third; and we will later know how to make the second our occupation and our game. It is not the symptoms that must be treated, but the disease itself. Today fear is everywhere; we will escape it only by trusting in our own strength, in our capacity to destroy every existing alienation and every image of the power that has escaped us. By remitting everything, with the exception of ourselves, to the sole power of the Workers’ Councils, possessing and reconstructing at every moment the totality of the world; that is, to true rationality, to a new legitimacy.

In matters of the “natural” and the built environment, of the birthrate, of biology, of production, of “madness” and so on, the choice will not be between festival and unhappiness but, consciously and at every crossroads, between a thousand happy or disastrous possibilities, relatively correctable, on one side, and nothingness on the other. The terrible choices of the near future leave only this alternative: total democracy or total bureaucracy. Those who doubt total democracy must strive to prove it to themselves, giving it the chance to prove itself in motion; otherwise nothing remains for them but to buy their tomb on the installment plan, for “authority, you have seen it at work, and its works condemn it” (Jacques Déjacque).

“Revolution or death”: this slogan is no longer the lyrical expression of rebellious consciousness; it is the last word of the scientific thought of our century [the twentieth]. It applies to the perils facing the species as much as to the impossibility of individual adherence. In this society, where suicide advances as we know it does, the specialists had to acknowledge, with a certain vexation, that it fell to almost nothing in May 1968. That spring thus won itself, without exactly taking it by storm, a fine sky, because a few cars burned and all the others lacked the fuel with which to pollute. When it rains, when there are clouds over Paris, never forget that it is the government’s fault. Alienated industrial production makes the rain. Revolution makes the fine weather.

Written in 1971 by Guy Debord to appear in issue no. 13 of the journal Internationale Situationniste, this article remained unpublished until recently, when it was published, together with two other texts by the same author, in La Planète malade (Paris, Gallimard, 2004, pp. 77-94). The Portuguese translation of “O planeta doente” (“The Sick Planet”) reproduced here first appeared at http://juralibertaire.over-blog.com/article-13908597.html. Translation by Emiliano Aquino (http://emilianoaquino.blogspot.com/).

Source: http://culturaebarbarie.org/sopro/arquivo/planetadoente.html

You can’t do the math without the words (University of Miami Press Release)

University of Miami anthropological linguist studies the anumeric language of an Amazonian tribe; the findings add new perspective to the way people acquire knowledge, perception and reasoning

Marie Guma Diaz
University of Miami

 VIDEO: Caleb Everett, assistant professor in the department of anthropology at the University of Miami College of Arts and Sciences, talks about the unique insight we gain about people by studying…

CORAL GABLES, FL (February 20, 2012) -- Most people learn to count when they are children. Yet surprisingly, not all languages have words for numbers. A recent study published in the journal Cognitive Science shows that a few tongues lack number words and that, as a result, people in these cultures have a difficult time performing common quantitative tasks. The findings add new insight into the way people acquire knowledge, perception and reasoning.

The Piraha people of the Amazon are a group of about 700 semi-nomads living in small villages of about 10-15 adults along the Maici River, a tributary of the Amazon. According to University of Miami (UM) anthropological linguist Caleb Everett, the Piraha are surprisingly unable to represent exact amounts. Their language contains just three imprecise words for quantities: hòi means “small size or amount,” hoì means “somewhat larger amount,” and baàgiso indicates “to cause to come together, or many.” Linguists refer to languages that have no number-specific words as anumeric.

“The Piraha are a really fascinating group because they are really one of only one or two groups in the world that are totally anumeric,” says Everett, assistant professor in the Department of Anthropology at the UM College of Arts and Sciences. “This is maybe one of the most extreme cases of language actually restricting how people think.”

His study, “Quantity Recognition Among Speakers of an Anumeric Language,” demonstrates that number words are essential tools of thought, required to solve even the simplest quantitative problems, such as establishing one-to-one correspondence.

“I’m interested in how the language you speak affects the way that you think,” says Everett. “The question here is what tools like number words really allow us to do and how they change the way we think about the world.”

The work was motivated by contradictory results on the numerical performance of the Piraha. An earlier article reported the people incapable of performing simple numeric tasks with quantities greater than three, while another showed they were capable of accomplishing such tasks.

Everett repeated all the field experiments of the two previous studies. The results indicated that the Piraha could not consistently perform simple mathematical tasks. For example, one test involved 14 adults in one village who were presented with lines of spools of thread and asked to create a matching line of empty rubber balloons. The people were not able to perform the one-to-one correspondence when the quantities were greater than two or three.
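Conceptually, a one-to-one correspondence task requires no numerals at all: the items need only be paired off exhaustively. The sketch below (my illustration, not part of the study) makes that distinction explicit:

```python
# One-to-one correspondence without counting: pair items off one at a
# time and check that both lines run out together. No number words
# (and no integers) are needed for the matching itself.
def matches_one_to_one(line_a, line_b) -> bool:
    a, b = iter(line_a), iter(line_b)
    sentinel = object()
    while True:
        x = next(a, sentinel)
        y = next(b, sentinel)
        if x is sentinel and y is sentinel:
            return True                  # both exhausted: a perfect match
        if x is sentinel or y is sentinel:
            return False                 # one ran out first: no match

spools = ["spool"] * 7
print(matches_one_to_one(spools, ["balloon"] * 7))   # True
print(matches_one_to_one(spools, ["balloon"] * 5))   # False
```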

The study provides a simple explanation for the controversy. Unbeknownst to the other researchers, the villagers who participated in one of the previous studies had received basic numerical training from Keren Madora, an American missionary who has worked with the indigenous people of the Amazon for 33 years and is a co-author of this study. “Her knowledge of what had happened in that village was crucial. I understood then why they got the results that they did,” Everett says.

Madora used the Piraha language to create number words. For instance, she used the words “all the sons of the hand” to indicate the number four. The introduction of number words into the village provides a reasonable explanation for the disagreement between the previous studies.

The findings support the idea that language is a key component in processes of the mind. “When they’ve been introduced to those words, their performance improved, so it’s clearly a linguistic effect, rather than a general cultural factor,” Everett says. The study highlights the unique insight we gain about people and society by studying mother languages.

“Preservation of mother tongues is important because languages can tell us about aspects of human history, human cognition, and human culture that we would not have access to if the languages are gone,” he says. “From a scientific perspective I think it’s important, but it’s most important from the perspective of the people, because they lose a lot of their cultural heritage when their languages die.”