Tag archive: Semiotics

Visual Perception System Unconsciously Affects Our Preferences (Science Daily)

ScienceDaily (May 23, 2012) — When grabbing a coffee mug out of a cluttered cabinet or choosing a pen to quickly sign a document, what brain processes guide your choices?

New research from Carnegie Mellon University’s Center for the Neural Basis of Cognition (CNBC) shows that the brain’s visual perception system automatically and unconsciously guides decision-making through valence perception. Published in the journal Frontiers in Psychology, the review hypothesizes that valence, the positive or negative value automatically perceived in most visual input, integrates visual features with associations drawn from experience with similar objects or features. In other words, it is the process that allows our brains to make rapid choices between similar objects.

The findings offer important insights into consumer behavior in ways that traditional consumer marketing focus groups cannot address. For example, asking individuals to react to package designs, ads or logos is simply ineffective. Instead, companies can use this type of brain science to more effectively assess how unconscious visual valence perception contributes to consumer behavior.

To transfer the research’s scientific application to the online video market, the CMU research team is in the process of founding the start-up company neonlabs with the support of the National Science Foundation (NSF) Innovation Corps (I-Corps).

“This basic research into how visual object recognition interacts with and is influenced by affect paints a much richer picture of how we see objects,” said Michael J. Tarr, the George A. and Helen Dunham Cowan Professor of Cognitive Neuroscience and co-director of the CNBC. “What we now know is that common, household objects carry subtle positive or negative valences and that these valences have an impact on our day-to-day behavior.”

Tarr added that the NSF I-Corps program has been instrumental in helping the neonlabs team learn how to turn this basic idea into a viable company. “The I-Corps program gave us unprecedented access to highly successful, experienced entrepreneurs and venture capitalists who provided incredibly valuable feedback throughout the development process,” he said.

NSF established I-Corps to assess, through a public-private partnership, the readiness of transitioning new scientific discoveries into valuable products. The CMU team of Tarr; Sophie Lebrecht, a CNBC and Tepper School of Business postdoctoral fellow; Babs Carryer, an embedded entrepreneur at CMU’s Project Olympus; and Thomas Kubilius, president of Pittsburgh-based Bright Innovation and adjunct professor of design at CMU, was awarded a $50,000, six-month grant to investigate how understanding valence perception could be used to make better consumer marketing decisions. They are launching neonlabs to apply their model of visual preference to increasing click rates on online videos, by identifying the most visually appealing thumbnail from a video stream. The web-based software product selects a thumbnail based on neuroimaging data on object perception and valence, crowdsourced behavioral data, and proprietary computational analyses of large numbers of video streams.

“Everything you see, you automatically dislike or like, prefer or don’t prefer, in part, because of valence perception,” said Lebrecht, lead author of the study and the entrepreneurial lead for the I-Corps grant. “Valence links what we see in the world to how we make decisions.”

Lebrecht continued, “Talking with companies such as YouTube and Hulu, we realized that they are looking for ways to keep users on their sites longer by clicking to watch more videos. Thumbnails are a huge problem for any online video publisher, and our research fits perfectly with this problem. Our approach streamlines the process and chooses the screenshot that is the most visually appealing based on science, which will in the end result in more user clicks.”
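
To make the selection step concrete, here is a minimal sketch in Python of the pipeline the article describes: score each candidate frame and keep the one with the highest predicted appeal. Every name in it, from `Frame` to `valence_score` and its feature weights, is a hypothetical stand-in; neonlabs’ actual model is proprietary and, per the article, built on neuroimaging data, crowdsourced behavioral data and large-scale video analysis.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float            # seconds into the video
    features: dict[str, float]  # visual features extracted from the frame

def valence_score(frame: Frame) -> float:
    """Stand-in for a learned model mapping visual features to valence."""
    # Hypothetical weights; a real model would be fit to behavioral ratings.
    weights = {"brightness": 0.4, "face_area": 0.5, "clutter": -0.3}
    return sum(weights.get(name, 0.0) * value
               for name, value in frame.features.items())

def pick_thumbnail(candidates: list[Frame]) -> Frame:
    """Return the candidate frame with the highest predicted valence."""
    return max(candidates, key=valence_score)

frames = [
    Frame(1.0, {"brightness": 0.7, "face_area": 0.1, "clutter": 0.8}),
    Frame(5.5, {"brightness": 0.6, "face_area": 0.6, "clutter": 0.2}),
]
print(pick_thumbnail(frames).timestamp)  # -> 5.5
```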

Wearing Two Different Hats: Moral Decisions May Depend On the Situation (Science Daily)

ScienceDaily (May 23, 2012) — An individual’s sense of right or wrong may change depending on their activities at the time — and they may not be aware of their own shifting moral integrity — according to a new study looking at why people make ethical or unethical decisions.

Focusing on dual-occupation professionals, the researchers found that engineers had one perspective on ethical issues, yet when those same individuals were in management roles, their moral compass shifted. Likewise, medic/soldiers in the U.S. Army had different views of civilian casualties depending on whether they most recently had been acting as soldiers or medics.

In the study, to be published in a future issue of The Academy of Management Journal, lead author Keith Leavitt of Oregon State University found that workers who tend to have dual roles in their jobs would change their moral judgments based on what they thought was expected of them at the moment.

“When people switch hats, they often switch moral compasses,” Leavitt said. “People like to think they are inherently moral creatures — you either have character or you don’t. But our studies show that the same person may make a completely different decision based on what hat they may be wearing at the time, often without even realizing it.”

Leavitt, an assistant professor of management in the College of Business at OSU, is an expert on non-conscious decision making and business ethics. He studies how people make decisions and moral judgments, often based on non-conscious cues.

He said recent high-profile business scandals, from the collapse of Enron to the Ponzi scheme of Bernie Madoff, have called into question the ethics of professionals. Leavitt said professional organizations, employers and academic institutions may want to train and prepare their members for practical moral tensions they may face when asked to serve in multiple roles.

“What we consider to be moral sometimes depends on what constituency we are answering to at that moment,” Leavitt said. “For a physician, a human life is priceless. But if that same physician is a managed-care administrator, some degree of moral flexibility becomes necessary to meet their obligations to stockholders.”

Leavitt said subtle cues — such as signage and motivation materials around the office — should be considered, along with more direct training that helps employees who juggle multiple roles that could conflict with one another.

“Organizations and businesses need to recognize that even very subtle images and icons can give employees non-conscious clues as to what the firm values,” he said. “Whether they know it or not, people are often taking in messages about what their role is and what is expected of them, and this may conflict with what they know to be the moral or correct decision.”

The researchers conducted three different studies with employees who had dual roles. In one case, 128 U.S. Army medics were asked to complete a series of problem-solving tests, which included subliminal cues that hinted they might be acting as either a medic or a soldier. No participant said the cues had any bearing on their behavior — but apparently they did. A much larger percentage of those in the medic category than in the soldier category were unwilling to put a price on human life.

In another test, a group of engineer-managers was asked to write about a time when they behaved as a typical manager, a typical engineer, or both. Then they were asked whether U.S. firms should engage in “gifting” to gain a foothold in a new market. Despite the fact that such a practice would violate federal law, more than 50 percent of those who fell into the “manager” category said it might be acceptable, compared to 13 percent of those in the engineer category.

“We find that people tend to make decisions that may conflict with their morals when they are overwhelmed, or when they are just doing routine tasks without thinking of the consequences,” Leavitt said. “We tend to play out a script as if our role has already been written. So the bottom line is, slow down and think about the consequences when making an ethical decision.”

What-If and What-Is: The Role of Speculation in Science (N.Y.Times)

SIDE EFFECTS

MPI/Getty Images

An illustration from about 1850 of a dog with a small travois in an Assiniboine encampment.

By JAMES GORMAN – Published: May 24, 2012

Woody Allen once said that when you do comedy, you sit at the children’s table. The same might be said of speculation in science.

And yet speculation is an essential part of science. So how does it fit in? Two recent publications, both about the misty depths of canine and human history, suggest some answers. In one, an international team of scientists concludes that we really don’t know when and where dogs were domesticated. Greger Larson of the University of Durham, in England, the first of 20 authors of that report, said of dog DNA, “it’s a mess.”

In the other, Pat Shipman, an independent scientist and writer, suggests that dogs may have helped modern humans push the Neanderthals out of existence and might even have helped shape human evolution.

Is one right and the other wrong? Are both efforts science — one a data-heavy reality check and the other freewheeling speculation? The research reported by Dr. Larson and his colleagues in The Proceedings of the National Academy of Sciences is solid science, easily judged by peers, at any rate. The essay by Dr. Shipman is not meant to come to any conclusion but to prompt thought and more research. It, too, will be judged by other scientists, and read by many nonscientists.

But how is one to judge the value of speculation? There are a few obvious ways. The question readers ought to ask when confronting a “what-if” as opposed to a “what-is” article is: does the writer make it clear what is known, what is probable, and what is merely possible?

Dr. Shipman was careful to make these distinctions in her essay in American Scientist, and in an interview, when I asked her to walk me through her argument.

First, she said, we know that modern humans and Neanderthals occupied Europe at the same time, from about 45,000 to 25,000 years ago, and that the fortunes of the modern humans rose as those of the Neanderthals fell. Somehow the modern humans outcompeted the Neanderthals. And here we are now, with our computers, our research, and our beloved dogs, which, scientists agree, evolved from wolves.

Second, and this point is crucial, Dr. Shipman thinks dogs were very probably around during this time period, although she recognizes that others disagree. She tells us about the research that convinced her, so we can check it ourselves, if we like: a 2009 report of three skulls, the oldest dating to 32,000 years ago, by Mietje Germonpré of The Royal Belgian Institute of Natural Sciences in The Journal of Archaeological Science.

The skulls are clearly of members of the canid family, but that includes wolves, jackals and foxes. Dr. Germonpré and her colleagues concluded that the skulls belonged to dogs. That’s where things get sticky.

The rest of Dr. Shipman’s essay is clear enough. If the humans had dogs, the dogs must have been helping somehow, in hunting or pulling travois. And they may have been so helpful that they gave modern humans an edge over the Neanderthals (unless the Neanderthals had dogs, too). If they helped in hunting, they might have watched human eyes for clues about what was going on, as they do now. Other researchers have suggested that the white of the human eye evolved to foster cooperation because we could more easily see where others were looking than we could with plain brown eyes.

If dogs were watching us too, that would have added survival value to having a partly white eye and thus played a role in our evolution. Fair enough, but the dogs had to be there at that time when humans and Neanderthals overlapped. I asked Dr. Larson about Dr. Shipman’s essay, and I confess I expected he might object to its speculative nature. Not so. “I love speculation,” he wrote back, “I do it all the time.” And, he said of Dr. Shipman’s essay, “it’s a lovely chain of reasoning.”

But, he said, “it begins from the premise that the late Pleistocene canid remains are dogs. And they are not.”

He wrote, “there is not a single piece of (credible) evidence to suggest that the domestication process was under way 30,000 years ago.” He cited an article in press in The Journal of Archaeological Science that is highly critical of the Germonpré paper. The article, written by Susan J. Crockford at the University of Victoria and Yaroslav V. Kuzmin at the Siberian branch of the Russian Academy of Sciences, suggests that the skulls in question came from short-faced wolves and do not indicate that the domestication process had begun. Dr. Crockford, who had read Dr. Shipman’s paper, thought it “too speculative for science.” But she did not view the case of early domestication as completely closed.

She said in an e-mail: “We simply need more work on these ancient wolves before we can determine if these canids are incipient dogs (in the process of becoming dogs, although not there yet) or if they simply reflect the normal variation in ancient wolves. At present, I am leaning strongly towards the latter (normal variation in wolves).”

Perhaps the way to judge the scientific value of speculation would be to see if it prompts more research, more collecting of fossils, more study. Until then, only proximate answers will exist to the question of where dogs came from.

Mine came from a shelter. How about yours?

Soldiers Who Desecrate the Dead See Themselves as Hunters (Science Daily)

ScienceDaily (May 20, 2012) — Modern day soldiers who mutilate enemy corpses or take body-parts as trophies are usually thought to be suffering from the extreme stresses of battle. But, research funded by the Economic and Social Research Council (ESRC) shows that this sort of misconduct has most often been carried out by fighters who viewed the enemy as racially different from themselves and used images of the hunt to describe their actions.

“The roots of this behaviour lie not in individual psychological disorders,” says Professor Simon Harrison who carried out the study, “but in a social history of racism and in military traditions that use hunting metaphors for war. Although this misconduct is very rare, it has persisted in predictable patterns since the European Enlightenment. This was the period when the first ideologies of race began to appear, classifying some human populations as closer to animals than others.”

European and North American soldiers who have mutilated enemy corpses appear to have drawn racial distinctions of this sort between close and distant enemies. They ‘fought’ their close enemies, and bodies remained untouched after death, but they ‘hunted’ their distant enemies and such bodies became the trophies that demonstrate masculine skill.

Almost always, only enemies viewed as belonging to other ‘races’ have been treated in this way. “This is a specifically racialised form of violence,” suggests Professor Harrison, “and could be considered a type of racially-motivated hate crime specific to military personnel in wartime.”

People tend to associate head-hunting and other trophy-taking with ‘primitive’ warfare. They consider wars fought by professional militaries to be rational and humane. However, such contrasts are misleading. The study shows that the symbolic associations between hunting and war that can give rise to abnormal behaviour such as trophy-taking in modern military organisations are remarkably close to those in certain indigenous societies where practices such as head-hunting were a recognised part of the culture.

In both cases, mutilation of the enemy dead occurs when enemies are represented as animals or prey. Parts of the corpse are removed like trophies at ‘the kill’. Metaphors of ‘war-as-hunting’ that lie at the root of such behaviour are still strong in some armed forces in Europe and North America — not only in military training but in the media and in soldiers’ own self-perception.

Professor Harrison gives the example of the Second World War and shows that trophy-taking was rare on the European battlefields but was relatively common in the war in the Pacific, where some Allied soldiers kept skulls of Japanese combatants as mementos or made gifts of their remains to friends back home.

The study also gives a more recent comparison: there have been incidents in Afghanistan in which NATO personnel have desecrated the dead bodies of Taliban combatants but there is no evidence of such misconduct occurring in the conflicts of the former Yugoslavia where NATO forces were much less likely to have considered their opponents racially ‘distant’.

But, it would be wrong to suggest that such behaviour amounts to a tradition. These practices are usually not explicitly taught. Indeed, they seem to be quickly forgotten after the end of wars and veterans often remain unaware of the extent to which they occurred.

Furthermore, attitudes towards the trophies themselves change as the enemy ceases to be the enemy. The study shows how human remains kept by Allied soldiers after the Pacific War became unwanted memory objects over time, which ex-servicemen or their families often donated to museums. In some cases, veterans have made great efforts to seek out the families of Japanese soldiers in order to return their remains and to disconnect themselves from a disturbing past.

Professor Harrison concludes that human trophy-taking is evidence of the power of metaphor in structuring and motivating human behaviour. “It will probably occur, in some form or other, whenever war, hunting and masculinity are conceptually linked,” he says. “Prohibition is clearly not enough to prevent it. We need to recognise the dangers of portraying war in terms of hunting imagery.”

Increased Knowledge About Global Warming Leads To Apathy, Study Shows (Science Daily)

ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.

“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”

The study also showed that high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.

The diminished concern and sense of responsibility flies in the face of campaigns to raise awareness of climate change, such as the movies An Inconvenient Truth and Ice Age: The Meltdown, and of the mainstream media’s escalating emphasis on the trend.

The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.

Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.

“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.

But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.

“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”

Measuring knowledge about global warming is a tricky business, Kellstedt adds.

“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.

“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”
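
To unpack the robustness claim, here is a toy version of that kind of check in Python: fit the same outcome under more than one model specification and see whether the coefficient on informedness keeps its negative sign. The data are simulated, and the variable names, effect sizes and specifications are illustrative assumptions, not the study’s.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1093  # sample size reported for the telephone survey

informed = rng.normal(size=n)    # self-reported informedness (standardized)
confidence = rng.normal(size=n)  # confidence in scientists (standardized)
# Simulate the reported (negative) relationships plus noise:
concern = -0.15 * informed - 0.10 * confidence + rng.normal(size=n)

specifications = {
    "informedness only": [informed],
    "informedness + confidence": [informed, confidence],
}
for label, predictors in specifications.items():
    X = sm.add_constant(np.column_stack(predictors))
    fit = sm.OLS(concern, X).fit()
    # The robustness claim: the coefficient keeps its sign across specs.
    print(f"{label}: coef on informedness = {fit.params[1]:.3f}")
```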

Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.

What the Photos of Lula with Cancer Say (BBC Mundo)

Gerardo Lissardy

BBC Mundo, Rio de Janeiro
Friday, 25 November 2011

Lula being shaved by his wife Marisa Letícia

It cannot be easy for any politician to display a personal fight against cancer in public, but the way former Brazilian president Luiz Inácio Lula da Silva has done so carries specific meanings, according to people close to him and to experts.

Brazilians learned of the laryngeal cancer affecting Lula on 29 October, just hours after the former president himself was diagnosed with the disease.

Since then, the communications team of the institute Lula heads has regularly sent the press updates on the chemotherapy he is receiving and even on intimate moments of his life.

There have been, for example, photos of Lula with his doctors as he began treatment at a São Paulo hospital, photos of him in a hospital bed holding hands with his successor, President Dilma Rousseff, and even photos of his wife Marisa Letícia shaving off his hair and beard.

All of these images have been offered to the media by the Instituto Lula, free for reproduction.

Some, especially those of the moment he lost his distinctive beard, went around the world and ran on the front pages of several local and Latin American newspapers.

Some experts believe all of this follows a deliberate strategy, with political calculations behind it.

José Chrispiniano, press adviser at the Instituto Lula, accepts that the way the former president’s illness is communicated serves certain objectives, but denies that it is about selling anything in particular.

“It is in no way marketing,” he told BBC Mundo.

“A very symbolic matter”

Lula without his beard. The former president’s office has released dozens of photos documenting Lula’s illness.

Chrispiniano explained that it was Lula himself who decided, from the moment he learned of the diagnosis, to speak openly about his cancer and its treatment.

“Although he holds no public office now, he is a person of public interest, so the objective is to state it clearly: this is a treatable disease, and a treatment with quite positive prospects of a cure,” he said.

He added that the aim has been to avoid dramatizing the illness (indeed, Lula appears smiling in many of the released photos) and to avoid any impression “that things are being hidden.”

Releasing the photos of Lula being shaved, and showing his new look with a moustache, was also the former president’s initiative, Chrispiniano said.

“It was a very symbolic matter for his image, and we wanted to show that he got through that moment calmly, because for many people who have this disease it is a moment of great stigma,” he said.

Two days after Lula’s haircut, on Friday the 18th his institute released photos of the former president receiving a visit from the coach of Brazil’s national football team, Mano Menezes.

“Strength, eternal ‘President Lula.’ We are counting on you for 2014,” Menezes wrote on the national team’s number 10 shirt, which he presented to Lula and which also appeared in the photos.

It was a reference to the football World Cup that Brazil will host that year, the institute’s statement noted.

“A strategy”

Lula with the team at the São Paulo hospital treating him. For many, Lula’s battle with cancer could raise his already high popularity.

Rousiley Maia, a researcher at the Federal University of Minas Gerais who specializes in communication and politics, believes the decision to report on Lula’s cancer in this way “was deliberately a strategy.”

“Rather than casting shadows over (the illness) or treating it in half-words, the strategy is to appeal to the human, ordinary and mortal side of the figure,” Maia told BBC Mundo.

She argued, however, that the decision is consistent with “the construction of Lula’s public image over many years” as a man of the people who became a national leader recognized worldwide.

“Beyond empathy, it is a way of sustaining the charisma and respect he has built over these years,” she said. “This moment of personal illness is a way of returning to the centre of the public stage.”

Renzo Taddei, an anthropologist who teaches communication, citizenship and politics at the Federal University of Rio de Janeiro (UFRJ), said the public handling of Lula’s cancer points to probable future political ambitions.

“Cancer is by now a classic theme of overcoming and heroism in Brazil,” he told BBC Mundo.

“It was all Lula had left: beating cancer. If he does it, there is nothing left that he cannot do (even though he has not delivered the agrarian reform Brazil has so long awaited, nor the fiscal and political reforms),” he added.

Cancer and elections

President Rousseff visits Lula after his operation. President Dilma Rousseff is herself a cancer survivor.

Until his cancer was diagnosed, many Brazilians wondered whether Lula would seek a return to the presidency in the 2014 elections, but he said it was up to Rousseff to seek re-election.

When Rousseff was successfully treated for lymphatic cancer in 2009, some members of Lula’s government went so far as to speculate that she could emerge strengthened for a presidential run the following year.

Lula, however, publicly dismissed any link between the two things.

“I cannot imagine how anyone comes out stronger because they had cancer,” he said at the time. “I only hope for Dilma’s recovery.”

Rousseff recovered and was elected president the following year, with Lula’s backing.

Arjun Appadurai: A Nation of Business Junkies (Anthropology News)

Guest Columnist
Arjun Appadurai

By Anthropology News on November 3, 2011

I first came to this country in 1967. I have been either a crypto-anthropologist or professional anthropologist for most of that time. Still, because I came here with an interest in India and took the path of least resistance in choosing to maintain India as my principal ethnographic referent, I have always been reluctant to offer opinions about life in these United States. I have begun to do so recently, but mainly in occasional blogs, twitter posts and the like. Now seems to be a good time to ponder whether I have anything to offer to public debate about the media in this country. Since I have been teaching for a few years in a distinguished department of media studies, I feel emboldened to offer my thoughts in this new AN Forum.

My examination of changes in the media over the last few decades is not based on a scientific study. I read the New York Times every day, the Wall Street Journal occasionally, and I subscribe to The Atlantic, Harper’s, The New York Review of Books, the Economist, and a variety of academic journals in anthropology and area studies. I get a smattering of other useful media pieces from friends on Facebook and other social media sites. I also use the Internet to keep up with as much as I can from the press in and about India. At various times in the past, I have subscribed to The Nation, Money Magazine, Foreign Policy, the Times Literary Supplement and a few other periodicals.

I have long been interested in how culture and economy interact. Today, I want to make an observation about the single biggest change I have seen over my four decades in the United States, which is a growing and now hegemonic domination of the news, and of a great deal of opinion, both in print and on television, by business news. Business news was a specialized affair in the late 1960s, confined to a few magazines such as Money and Fortune, and to newspaper and TV reporters (not channels). Now, it is hard to find anything but business as the topic of news in all media. Consider television: if you spend even three hours surfing between CNN and BBC on any given day (surfing for news about Libya or about soccer, for example) you will find yourself regularly assaulted by business news, not just from London, New York and Washington, but from Singapore, Hong Kong, Mumbai and many other places. Look at the serious talk shows and chances are that you will find a talking CEO, describing what’s good about his company, what’s bad about the government and how to read his company’s stock prices. Channels like MSNBC are a form of endless, mind-numbing Jerry Lewis telethon about the economy, with more than a hint of the desperation of the Depression-era movie “They Shoot Horses, Don’t They?”, as they bid the viewer to make insane bets and to mourn the fallen heroes of failed companies and fired CEOs.

Turn to the newspapers and things get worse. Any reader of the New York Times will find it hard to get away from the business machine. Start with the lead section, and stories about Obama’s economic plans, mad Republican proposals about taxes, the Euro-crisis and the latest bank scandal will assault you. Some relief is provided by more corporate news: the exit of Steve Jobs, the Op-Ed piece about the responsibilities of the super-rich by Warren Buffett, Donald Trump advertising his new line of housewares to go along with his ugly homes and buildings. Turn to the sports section: it is littered with talk of franchises, salaries, trades, owner antics, stadium projects and more. I need hardly say anything about the “Business” section itself, which has now become virtually redundant. And if you are still thirsty for more business news, check out the “Home”, “Lifestyle” and “Real Estate” sections for news on houses you can’t afford and mortgage financing gimmicks you have never heard of. Some measure of relief is to be found in the occasional “Science Times” and in the NYT Book Review, which do carry some pieces that are not primarily about profit, corporate politics or the recession.

The New York Times is not to blame for this. It is the newspaper of “record,” which means that it reflects broader trends and cannot be blamed for following them. Go through the magazines when you take a flight to Detroit or Mumbai and there is again a feast of news geared to the “business traveler”. This is when I catch up on how to negotiate the best deal, why this is the time to buy gold and what software and hardware to use when I make my next presentation to General Electric. These examples could be multiplied in any number of bookstores, newspaper kiosks, airport lounges, park benches and dentists’ offices.

What does all this reflect? Well, we were always told that the business of America is business. But now we are gradually moving into a society in which the business of American life is also business. Who are we now? We have become (in our fantasies) entrepreneurs, start-up heroes, small investors, consumers, home-owners, day-traders, and a gallery of supporting business types, and no longer fathers, mothers, friends or neighbors. Our very citizenship is now defined by business, whether we are winners or losers. Everyone is an expert on pensions, stocks, retirement packages, vacation deals, credit-card scams and more. Meanwhile, as Paul Krugman argued in a brilliant recent speech to some of his fellow economists, the discipline of economics, especially macroeconomics, has lost all its capacity to analyze, define or repair the huge mess we are in.

The gradual transformation of the imagined reader or viewer into a business junkie is a relatively new disease of advanced capitalism in the United States. The avalanche of business knowledge and information dropping on the American middle-classes ought to have helped us predict – or avoid – the recent economic meltdown, based on crazy credit devices, vulgar scams and lousy regulation. Instead it has made us business junkies, ready to be led like sheep to our own slaughter by Wall Street, the big banks and corrupt politicians. The growing hegemony of business news and knowledge in the popular media over the last few decades has produced a collective silence of the lambs. It is time for a bleat or two.

Dr. Arjun Appadurai is a prominent contemporary social-cultural anthropologist, having formerly served as Provost and Senior Vice President for Academic Affairs at The New School in NYC. He has held various professorial chairs and visiting appointments at some of the top institutions in the United States and Europe. In addition, he has served on several scholarly and advisory bodies in the United States, Latin America, Europe and India. Dr. Appadurai is a prolific writer, having authored numerous books and scholarly articles. The nature and significance of his contributions throughout his academic career have earned him a reputation as a leading figure in his field. He is the author of The Future as a Cultural Fact: Essays on the Global Condition (Verso: forthcoming 2012).

Ken Routon is the contributing editor of Media Notes. He is a visiting professor of cultural anthropology at the University of New Orleans and the author of Hidden Powers of the State in the Cuban Imagination (University Press of Florida, 2010).

Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

November 7, 2011

By Fábio de Castro

Agência FAPESP – If a few years ago a shortage of data limited the advance of science, today the problem has been inverted. New data-capture technologies, in the most varied fields and at the most varied scales, have generated such an immense volume of information that the excess has become a bottleneck for scientific progress.

In this context, computer scientists have been joining forces with specialists from different fields to develop new concepts and theories capable of handling the data deluge of contemporary science. The result is called eScience.

That is the subject of O Quarto Paradigma – Descobertas científicas na era da eScience, the Brazilian edition of The Fourth Paradigm, launched on 3 November by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle – all of Microsoft Research – the book was launched at FAPESP’s headquarters, at an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

At the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the University of São Paulo (USP), gave the talk “eScience in Brazil.” “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the topic of the talk by Daniel Fay, director of Earth, Energy and Environment at MSR.

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is closely connected to this idea, because many of our projects and programs present this need for greater capacity to manage large data sets. Our great challenge lies in the science behind this capacity to deal with large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a great need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. On the other hand, scientists do not usually perceive the computer as a great new instrument that revolutionizes science. FAPESP is interested in actions that make the scientific community aware of the great challenges in the field of eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment,” “Health and wellbeing,” “Scientific infrastructure” and “Scholarly communication.”

“The book is about the emergence of a new paradigm for scientific discovery. For thousands of years, the prevailing paradigm was that of experimental science, founded on the description of natural phenomena. A few hundred years ago the paradigm of theoretical science emerged, symbolized by Newton’s laws. A few decades ago computational science appeared, simulating complex phenomena. Now we arrive at the fourth paradigm, that of data-driven science,” said Fay.

With the advent of the new paradigm, he said, the very nature of scientific discovery has changed completely. Complex models have entered the scene, spanning broad spatial and temporal scales and demanding ever more multidisciplinary interaction.

“The data, in incredible quantities, come from different sources, and they too require a multidisciplinary approach and, often, real-time processing. Scientific communities are also more distributed. All of this has transformed the way discoveries are made,” said Fay.

Ecology, one of the fields most affected by large volumes of data, is an example of how the advance of science will increasingly depend on collaboration between academic researchers and computing specialists.

“We live in a storm of remote sensing, cheap ground sensors and data access over the internet. But extracting the variables that science requires from this mass of heterogeneous data remains a problem. It takes specialized knowledge of algorithms, file formats and data cleaning, for example, which is not always accessible to people in ecology,” he explained.

The same is true in fields such as medicine and biology – which benefit from new technologies for recording brain activity or sequencing DNA, for example – and in astronomy and physics, as modern telescopes capture terabytes of information daily and the Large Hadron Collider (LHC) generates petabytes of data each year.
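
As a toy illustration of the “data cleaning” hurdle described above, the short Python sketch below harmonizes a batch of heterogeneous sensor records (mixed units, missing values) before analysis. The column names, units and values are invented for the example; nothing here comes from the book or the article.

```python
import pandas as pd

# Three records from two hypothetical sensors: mixed units and a missing value.
raw = pd.DataFrame({
    "station": ["A", "A", "B"],
    "temp": ["25.3", "n/a", "301.2"],
    "unit": ["C", "C", "K"],
})

clean = raw.copy()
clean["temp"] = pd.to_numeric(clean["temp"], errors="coerce")  # "n/a" -> NaN
kelvin = clean["unit"] == "K"
clean.loc[kelvin, "temp"] -= 273.15    # harmonize everything to Celsius
clean["unit"] = "C"
clean = clean.dropna(subset=["temp"])  # drop records with no usable reading
print(clean)
```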

Virtual Institute

According to Cesar Jr., the eScience community in Brazil is growing. The country has 2,167 undergraduate programs in information systems or computer science and engineering. In 2009, 45,000 students graduated in these fields, and between 2007 and 2009 graduate education in them comprised 32 programs, 1,000 advisers, 2,705 master’s students and 410 doctoral students.

“Science has moved from the paradigm of data acquisition to that of data analysis. We have different technologies producing terabytes in many fields of knowledge, and today we can say that these fields focus on analyzing a deluge of data,” said the member of FAPESP’s Area Coordination for Computer Science and Engineering.

In 2006, the Brazilian Computer Society (SBC) organized a meeting to identify the key problems and main challenges for the field. This led to several proposals for the National Council for Scientific and Technological Development (CNPq) to create a program dedicated to this type of problem.

“In 2009, we held a series of workshops at FAPESP that brought together scientists from fields such as agriculture, climate change, medicine, transcriptomics, games, e-government and social networks to discuss the issue. The initiative resulted in excellent collaborations between groups of scientists with similar problems and gave rise to several initiatives,” said Cesar Jr.

The calls for proposals issued by the Microsoft Research-FAPESP Institute for IT Research, he said, have been an important part of the set of initiatives to promote eScience, as has the organization of the São Paulo School of Advanced Science in Computational Image Processing and Visualization. FAPESP has also supported several research projects related to the theme.

“The eScience community in São Paulo has been working with professionals from many fields and publishing in journals across them. That is an indication of the quality the community has acquired to face the great challenge of the coming years,” said Cesar Jr., who wrote the preface to the Brazilian edition of the book.

  • O Quarto Paradigma
    Editors: Tony Hey, Stewart Tansley and Kristin Tolle
    Published: 2011
    Price: R$ 60
    Pages: 263
    More information: www.ofitexto.com.br

People Rationalize Situations They’re Stuck With, but Rebel When They Think There’s an Out (Science Daily)

ScienceDaily (Nov. 1, 2011) — People who feel like they’re stuck with a rule or restriction are more likely to be content with it than people who think that the rule isn’t definite. The authors of a new study, which will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science, say this conclusion may help explain everything from unrequited love to the uprisings of the Arab Spring.

Psychological studies have found two contradictory results about how people respond to rules. Some research has found that, when there are new restrictions, you rationalize them; your brain comes up with a way to believe the restriction is a good idea. But other research has found that people react negatively against new restrictions, wanting the restricted thing more than ever.

Kristin Laurin of the University of Waterloo thought the difference might be absoluteness — how much the restriction is set in stone. “If it’s a restriction that I can’t really do anything about, then there’s really no point in hitting my head against the wall and trying to fight against it,” she says. “I’m better off if I just give up. But if there’s a chance I can beat it, then it makes sense for my brain to make me want the restricted thing even more, to motivate me to fight.” Laurin wrote the new paper with Aaron Kay and Gavan Fitzsimons of Duke University.

In an experiment in the new study, participants read that lowering speed limits in cities would make people safer. Some read that government leaders had decided to reduce speed limits. Of those people, some were told that this legislation would definitely come into effect, and others read that it would probably happen, but that there was still a small chance government officials could vote it down.

People who thought the speed limit was definitely being lowered supported the change more than control subjects, but people who thought there was still a chance it wouldn’t happen supported it less than these control subjects. Laurin says this confirms what she suspected about absoluteness; if a restriction is definite, people find a way to live with it.
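
For readers who want to see the shape of such an analysis, here is a hypothetical re-analysis in Python of the three-condition design described above, on simulated ratings chosen to mirror the reported pattern; the scale, group sizes and effect sizes are invented, not the paper’s data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated support ratings on a 1-7 scale, mirroring the reported pattern:
control = rng.normal(5.0, 1.0, 60)     # no information about the legislation
definite = rng.normal(5.6, 1.0, 60)    # told the lower limit will take effect
reversible = rng.normal(4.4, 1.0, 60)  # told officials might still vote it down

for label, group in [("definite", definite), ("reversible", reversible)]:
    t, p = stats.ttest_ind(group, control)
    diff = group.mean() - control.mean()
    print(f"{label} vs control: mean difference = {diff:+.2f}, p = {p:.3g}")
```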

This could help explain how uprisings spread across the Arab world earlier this year. When people were living under dictatorships with power that appeared to be absolute, Laurin says, they may have been comfortable with it. But once Tunisia’s president fled, citizens of neighboring countries realized that their governments weren’t as absolute as they seemed — and they could have dropped whatever rationalizations they were using to make it possible to live under an authoritarian regime. Even more, the now non-absolute restriction their governments represented could have exacerbated their reaction, fueling their anger and motivating them to take action.

And how does this relate to unrequited love? It confirms people’s intuitive sense that leading someone on can just make them fall for you more deeply, Laurin says. “If this person is telling me no, but I perceive that as not totally absolute, if I still think I have a shot, that’s just going to strengthen my desire and my feeling, that’s going to make me think I need to fight to win the person over,” she says. “If instead I believe no, I definitely don’t have a shot with this person, then I might rationalize it and decide that I don’t like them that much anyway.”

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT

File photograph of newborn babies in Lucknow, India, in July 2009

As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30C. The humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women patiently sit, each accompanied by an unwieldy looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising – they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo – barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

Thomas Malthus

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matt Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.


The broad alliance to stem birth rates was beginning to dissolve and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of a mother and baby, captioned: “Practicing birth control is beneficial for the protection of the health of mother and child”. China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue: ‘Don’t worry, everything’s fine now we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service

Where do you fit into 7 billion?

The world’s population is expected to hit seven billion in the next few weeks. After growing very slowly for most of human history, the number of people on Earth has more than doubled in the last 50 years.

The world’s population will reach 7 billion at the end of October. Don’t panic (The Economist)

Demography

A tale of three islands

Oct 22nd 2011 | from the print edition


IN 1950 the whole population of the earth—2.5 billion—could have squeezed, shoulder to shoulder, onto the Isle of Wight, a 381-square-kilometre rock off southern England. By 1968 John Brunner, a British novelist, observed that the earth’s people—by then 3.5 billion—would have required the Isle of Man, 572 square kilometres in the Irish Sea, for their standing room. Brunner forecast that by 2010 the world’s population would have reached 7 billion, and would need a bigger island. Hence the title of his 1968 novel about over-population, “Stand on Zanzibar” (1,554 square kilometres off east Africa).

Brunner’s prediction was only a year out. The United Nations’ population division now says the world will reach 7 billion on October 31st 2011 (America’s Census Bureau puts the date later, in March 2012). The UN will even identify someone born that day as the world’s 7 billionth living person. The 6 billionth, Adnan Nevic, was born on October 12th 1999 in Sarajevo, in Bosnia. He will be just past his 12th birthday when the next billion clicks over.

That makes the world’s population look as if it is rising as fast as ever. It took 250,000 years to reach 1 billion, around 1800; over a century more to reach 2 billion (in 1927); and 32 years more to reach 3 billion. But to rise from 5 billion (in 1987) to 6 billion took only 12 years; and now, another 12 years later, it is at 7 billion (see chart 1). By 2050, the UN thinks, there will be 9.3 billion people, requiring an island the size of Tenerife or Maui to stand on.
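
The standing-room arithmetic is easy to check. A minimal sketch in Python, using only the populations and island areas quoted above:

```python
# People per square metre if the whole population stood shoulder to shoulder
# on each island. Populations and areas are the figures quoted above.
islands = {
    "Isle of Wight, 1950 (2.5bn)": (2.5e9, 381),
    "Isle of Man, 1968 (3.5bn)":   (3.5e9, 572),
    "Zanzibar, 2011 (7bn)":        (7.0e9, 1554),
}

for name, (people, area_km2) in islands.items():
    density = people / (area_km2 * 1e6)  # 1 km^2 = 1e6 m^2
    print(f"{name}: {density:.1f} people per square metre")
# Every case lands between 4.5 and 6.6 people per square metre: a dense
# crowd, but plausibly standing room.
```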

Odd though it seems, however, the growth in the world’s population is actually slowing. The peak of population growth was in the late 1960s, when the total was rising by almost 2% a year. Now the rate is half that. The last time it was so low was in 1950, when the death rate was much higher. The result is that the next billion people, according to the UN, will take 14 years to arrive, the first time that a billion milestone has taken longer to reach than the one before. The billion after that will take 18 years.
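
The deceleration is simple compound-growth arithmetic. A small sketch: the 1% figure is quoted above, while the 0.65% in the second call is an illustrative assumption chosen to match the UN’s 18-year estimate:

```python
from math import log

def years_to_next_billion(population_bn: float, annual_growth: float) -> float:
    """Years for a population (in billions) to gain one billion under
    constant exponential growth at `annual_growth` per year."""
    return log((population_bn + 1) / population_bn) / log(1 + annual_growth)

print(years_to_next_billion(7.0, 0.0100))  # ~13.4 years to 8 billion
print(years_to_next_billion(8.0, 0.0065))  # ~18 years to 9 billion (assumed slower rate)
```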

Once upon a time, the passing of population milestones might have been cause for celebration. Now it gives rise to jeremiads. As Hillary Clinton’s science adviser, Nina Fedoroff, told the BBC in 2009, “There are probably already too many people on the planet.” But the notion of “too many” is more flexible than it seems. The earth could certainly not support 10 billion hunter-gatherers, who used much more land per head than modern farm-fed people do. But it does not have to. The earth might well not be able to support 10 billion people if they had exactly the same impact per person as 7 billion do today. But that does not necessarily spell Malthusian doom, because the impact humans have on the earth and on each other can change.

For most people, the big questions about population are: can the world feed 9 billion mouths by 2050? Are so many people ruining the environment? And will those billions, living cheek-by-jowl, go to war more often? On all three counts, surprising as it seems, reducing population growth any more quickly than it is falling anyway may not make much difference.

Start with the link between population and violence. It seems plausible that the more young men there are, the more likely they will be to fight. This is especially true when groups are competing for scarce resources. Some argue that the genocidal conflict in Darfur, western Sudan, was caused partly by high population growth, which led to unsustainable farming and conflicts over land and water. Land pressure also influenced the Rwandan genocide of 1994, as migrants in search of a livelihood in one of the world’s most densely populated countries moved into already settled areas, with catastrophic results.

But there is a difference between local conflicts and what is happening on a global scale. Although the number of sovereign states has increased almost as dramatically as the world’s population over the past half-century, the number of wars between states fell fairly continuously during the period. The number of civil wars rose, then fell. The number of deaths in battle fell by roughly three-quarters. These patterns do not seem to be influenced either by the relentless upward pressure of population, or by the slackening of that pressure as growth decelerates. The difference seems to have been caused by fewer post-colonial wars, the ending of cold-war alliances (and proxy wars) and, possibly, the increase in international peacekeepers.

More people, more damage?

Human activity has caused profound changes to the climate, biodiversity, oceanic acidity and greenhouse-gas levels in the atmosphere. But it does not automatically follow that the more people there are, the worse the damage. In 2007 Americans and Australians emitted almost 20 tonnes of carbon dioxide each. In contrast, more than 60 countries—including the vast majority of African ones—emitted less than 1 tonne per person.

This implies that population growth in poorer countries (where it is concentrated) has had a smaller impact on the climate in recent years than the rise in the population of the United States (up by over 50% in 1970-2010). Most of the world’s population growth in the next 20 years will occur in countries that make the smallest contribution to greenhouse gases. Global pollution will be more affected by the pattern of economic growth—and especially whether emerging nations become as energy-intensive as America, Australia and China.
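
A rough comparison makes the point concrete. The US census totals below (about 203m in 1970 and 309m in 2010) are outside figures, not from the article, and the extra billion in poor countries is a notional number for illustration:

```python
# Extra annual CO2 implied by each kind of population growth, using the
# per-head emission rates quoted above.
us_added_people = 309e6 - 203e6                  # US growth 1970-2010 (census figures)
us_added_co2_gt = us_added_people * 20 / 1e9     # ~20 t/person/yr -> gigatonnes

poor_added_people = 1e9                          # notional extra billion (assumption)
poor_added_co2_gt = poor_added_people * 1 / 1e9  # <1 t/person/yr -> gigatonnes

print(f"US growth: ~{us_added_co2_gt:.1f} Gt/yr")             # ~2.1 Gt/yr
print(f"Extra billion, poor countries: <{poor_added_co2_gt:.1f} Gt/yr")  # under 1 Gt/yr
```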

Population growth does make a bigger difference to food. All things being equal, it is harder to feed 7 billion people than 6 billion. According to the World Bank, between 2005 and 2055 agricultural productivity will have to increase by two-thirds to keep pace with rising population and changing diets. Moreover, according to the bank, if the population stayed at 2005 levels, farm productivity would have to rise by only a quarter, so more future demand comes from a growing population than from consumption per person.

Increasing farm productivity by a quarter would obviously be easier than boosting it by two-thirds. But even a rise of two-thirds is not as much as it sounds. From 1970-2010 farm productivity rose far more than this, by over three-and-a-half times. The big problem for agriculture is not the number of people, but signs that farm productivity may be levelling out. The growth in agricultural yields seems to be slowing down. There is little new farmland available. Water shortages are chronic and fertilisers are over-used. All these—plus the yield-reductions that may come from climate change, and wastefulness in getting food to markets—mean that the big problems are to do with supply, not demand.
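
Annualised, the gap between those targets looks even smaller; a quick compound-growth check on the figures above:

```python
from math import exp, log

def annual_rate(total_multiple: float, years: int) -> float:
    """Constant yearly growth rate that compounds to `total_multiple` over `years`."""
    return exp(log(total_multiple) / years) - 1

print(f"{annual_rate(5 / 3, 50):.2%}")  # ~1.03%/yr: the two-thirds rise needed by 2055
print(f"{annual_rate(1.25, 50):.2%}")   # ~0.45%/yr: if population had stayed at 2005 levels
print(f"{annual_rate(3.5, 40):.2%}")    # ~3.18%/yr: what farming actually managed, 1970-2010
```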

None of this means that population does not matter. But the main impact comes from relative changes—the growth of one part of the population compared with another, for example, or shifts in the average age of the population—rather than the absolute number of people. Of these relative changes, falling fertility is most important. The fertility rate is the number of children a woman can expect to have. At the moment, almost half the world’s population—3.2 billion—lives in countries with a fertility rate of 2.1 or less. That number, the so-called replacement rate, is usually taken to be the level at which the population eventually stops growing.

The world’s decline in fertility has been staggering (see chart 2). In 1970 the total fertility rate was 4.45 and the typical family in the world had four or five children. It is now 2.45 worldwide, and lower in some surprising places. Bangladesh’s rate is 2.16, having halved in 20 years. Iran’s fertility fell from 7 in 1984 to just 1.9 in 2006. Countries with below-replacement fertility include supposedly teeming Brazil, Tunisia and Thailand. Much of Europe and East Asia have fertility rates far below replacement levels.

The fertility fall is releasing wave upon wave of demographic change. It is the main influence behind the decline of population growth and, perhaps even more important, is shifting the balance of age groups within a population.

When gold turns to silver

A fall in fertility sends a sort of generational bulge surging through a society. The generation in question is the one before the fertility fall really begins to bite, which in Europe and America was the baby-boom generation that is just retiring, and in China and East Asia the generation now reaching adulthood. To begin with, the favoured generation is in its childhood; countries have lots of children and fewer surviving grandparents (who were born at a time when life expectancy was lower). That was the situation in Europe in the 1950s and in East Asia in the 1970s.

But as the select generation enters the labour force, a country starts to benefit from a so-called “demographic dividend”. This happens when there are relatively few children (because of the fall in fertility), relatively few older people (because of higher mortality previously), and lots of economically active adults, including, often, many women, who enter the labour force in large numbers for the first time. It is a period of smaller families, rising income, rising life expectancy and big social change, including divorce, postponed marriage and single-person households. This was the situation in Europe between 1945 and 1975 (“les trente glorieuses”) and in much of East Asia in 1980-2010.

But there is a third stage. At some point, the gilded generation turns silver and retires. Now the dividend becomes a liability. There are disproportionately more old people depending upon a smaller generation behind them. Population growth stops or goes into reverse, parts of a country are abandoned by the young and the social concerns of the aged grow in significance. This situation already exists in Japan. It is arriving fast in Europe and America, and soon after that will reach East Asia.

A demographic dividend tends to boost economic growth because a large number of working-age adults increases the labour force, keeps wages relatively low, boosts savings and increases demand for goods and services. Part of China’s phenomenal growth has come from its unprecedentedly low dependency ratio—just 38 (this is the number of dependents, children and people over 65, per 100 working adults; it implies the working-age group is almost twice as large as the rest of the population put together). One study by Australia’s central bank calculated that a third of East Asia’s GDP growth in 1965-90 came from its favourable demography. About a third of America’s GDP growth in 2000-10 also came from its increasing population.

The world as a whole reaped a demographic dividend in the 40 years to 2010. In 1970 there were 75 dependents for every 100 adults of working age. In 2010 the number of dependents dropped to just 52. Huge improvements were registered not only in China but also in South-East Asia and north Africa, where dependency ratios fell by 40 points. Even “ageing” Europe and America ended the period with fewer dependents than at the beginning.
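
The dependency figures translate directly into population shares. A small illustration using the ratios quoted in the last two paragraphs:

```python
def working_age_share(dependency_ratio: float) -> float:
    """Share of the population that is working-age, given dependents
    per 100 working-age adults."""
    return 100 / (100 + dependency_ratio)

print(f"{working_age_share(38):.1%}")  # ~72.5%: China's ratio of 38, i.e. workers
                                       # almost twice the rest put together
print(f"{working_age_share(75):.1%}")  # ~57.1%: the world in 1970
print(f"{working_age_share(52):.1%}")  # ~65.8%: the world in 2010
```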

A demographic dividend does not automatically generate growth. It depends on whether the country can put its growing labour force to productive use. In the 1980s Latin America and East Asia had similar demographic patterns. But while East Asia experienced a long boom, Latin America endured its “lost decade”. One of the biggest questions for Arab countries, which are beginning to reap their own demographic dividends, is whether they will follow East Asia or Latin America.

But even if demography guarantees nothing, it can make growth harder or easier. National demographic inheritances therefore matter. And they differ a lot.

Where China loses

Hania Zlotnik, the head of the UN’s Population Division, divides the world into three categories, according to levels of fertility (see map). About a fifth of the world lives in countries with high fertility—3 or more. Most are Africans. Sub-Saharan Africa, for example, is one of the fastest-growing parts of the world. In 1975 it had half the population of Europe. It overtook Europe in 2004, and by 2050 there will be just under 2 billion people there compared with 720m Europeans. About half of the 2.3 billion increase in the world’s population over the next 40 years will be in Africa.

The rest of the world is more or less equally divided between countries with below-replacement fertility (less than 2.1) and those with intermediate fertility (between 2.1 and 3). The first group consists of Europe, China and the rest of East Asia. The second comprises South and South-East Asia, the Middle East and the Americas (including the United States).

The low-fertility countries face the biggest demographic problems. The elderly share of Japan’s population is already the highest in the world. By 2050 the country will have almost as many dependents as working-age adults, and half the population will be over 52. This will make Japan the oldest society the world has ever known. Europe faces similar trends, less acutely. It has roughly half as many dependent children and retired people as working-age adults now. By 2050 it will have three dependents for every four adults, so will shoulder a large burden of ageing, which even sustained increases in fertility would fail to reverse for decades. This has disturbing policy implications for the provision of pensions and health care, which rely on continuing healthy tax revenues from the working population.

At least these countries are rich enough to make such provision. Not so China. With its fertility artificially suppressed by the one-child policy, it is ageing at an unprecedented rate. In 1980 China’s median age (the point where half the population is older and half younger) was 22 years, a developing-country figure. China will be older than America as early as 2020 and older than Europe by 2030. This will bring an abrupt end to its cheap-labour manufacturing. Its dependency ratio will rise from 38 to 64 by 2050, the sharpest rise in the world. Add in the country’s sexual imbalances—after a decade of sex-selective abortions, China will have 96.5m men in their 20s in 2025 but only 80.3m young women—and demography may become the gravest problem the Communist Party has to face.

Many countries with intermediate fertility—South-East Asia, Latin America, the United States—are better off. Their dependency ratios are not deteriorating so fast and their societies are ageing more slowly. America’s demographic profile is slowly tugging it away from Europe. Though its fertility rate may have fallen recently, it is still slightly higher than Europe’s. In 2010 the two sides of the Atlantic had similar dependency rates. By 2050 America’s could be nearly ten points lower.

But the biggest potential beneficiaries are the two other areas with intermediate fertility—India and the Middle East—and the high-fertility continent of Africa. These places have long been regarded as demographic time-bombs, with youth bulges, poverty and low levels of education and health. But that is because they are moving only slowly out of the early stage of high fertility into the one in which lower fertility begins to make an impact.

At the moment, Africa has larger families and more dependent children than India or Arab countries and is a few years younger (its median age is 20 compared with their 25). But all three areas will see their dependency ratios fall in the next 40 years, the only parts of the world to do so. And they will keep their median ages low—below 38 in 2050. If they can make their public institutions less corrupt, keep their economic policies outward-looking and invest more in education, as East Asia did, then Africa, the Middle East and India could become the fastest-growing parts of the world economy within a decade or so.

Here’s looking at you

Demography, though, is not only about economics. Most emerging countries have benefited from the sort of dividend that changed Europe and America in the 1960s. They are catching up with the West in terms of income, family size and middle-class formation. Most say they want to keep their cultures unsullied by the social trends—divorce, illegitimacy and so on—that also affected the West. But the growing number of never-married women in urban Asia suggests that this will be hard.

If you look at the overall size of the world’s population, then, the picture is one of falling fertility, decelerating growth and a gradual return to the flat population level of the 18th century. But below the surface societies are being churned up in ways not seen in the much more static pre-industrial world. The earth’s population may never need a larger island than Maui to stand on. But the way it arranges itself will go on shifting for centuries to come.

Those fast-talking Japanese! And Spanish! (The Christian Science Monitor)

By Ruth Walker / October 13, 2011

It is the universal experience of anyone having a first serious encounter in a language he or she is learning: “Those people talk so fast I will never be able to understand them, let alone hold my own in a conversation.”

The learner timidly poses a carefully rehearsed question about the availability of tickets for tonight’s performance or directions to the museum or whatever, and the response all but gallops out of the mouth of the native speaker like a runaway horse.

Now researchers at the University of Lyon in France have presented findings that provide language learners some validation for their feelings – but only some. The team found that, objectively, some languages are spoken faster than others, in terms of syllables per minute. But there’s a trade-off: Some languages pack more meaning into their syllables.

The key element turns out to be what the researchers call “density.”

Time magazine published a widely reproduced article on the Lyon research, which originally came out in Language, the journal of the Linguistic Society of America. The team in Lyon recruited several dozen volunteers, each a native speaker of one of several common languages: English, French, German, Italian, Japanese, Mandarin Chinese, or Spanish. Vietnamese was used as a sort of control language.

The volunteers read a series of 20 different texts in their respective native tongues into a recorder. The researchers then counted all the syllables in each of the recordings to determine how many syllables per second were spoken in each language. That’s a lot of counting.

Then they analyzed all these syllables for their information density. To mention Time’s examples: “A single-syllable word like bliss, for example, is rich with meaning – signifying not ordinary happiness but a particularly serene and rapturous kind. The single-syllable word to is less information-dense. And a single syllable like the short ‘i’ sound, as in the word jubilee, has no independent meaning at all.”

Here’s where Vietnamese comes in: It turns out to be the gold standard for information density. Who knew? The researchers assigned an arbitrary value of 1 to Vietnamese syllables, and compared other syllables against that standard.

English turns out to have a density of .91 (91 percent as dense as Vietnamese, in other words) and an average speed of 6.19 syllables per second. Mandarin is slightly denser (.94) but has an average speed of 5.18, which made it the slowest of the group studied.

At the other end of the scale were Spanish, with a density of .63 and a speed of 7.82, and Japanese, with a density of only .49 but a speed of 7.84.
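
Multiplying density by speed gives each language’s information rate relative to Vietnamese. A quick illustration using only the figures quoted above:

```python
# Information rate = density (relative to Vietnamese) x syllables per second.
languages = {
    "English":  (0.91, 6.19),
    "Mandarin": (0.94, 5.18),
    "Spanish":  (0.63, 7.82),
    "Japanese": (0.49, 7.84),
}

for name, (density, speed) in languages.items():
    print(f"{name}: {density * speed:.2f}")
# English 5.63, Mandarin 4.87, Spanish 4.93, Japanese 3.84
```

The products cluster far more tightly than either the speeds or the densities alone, which is the trade-off the researchers describe; Japanese sits somewhat below the rest.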

So what makes a language more or less dense? The number of sounds, for one thing. Some languages make do with relatively few consonants and vowels, and so end up with a lot of long words: Hawaiian, for example, with 13 letters.

English, on the other hand, has a relatively large number of vowels – a dozen, although that varies according to dialect. Chinese uses tones, which help make it a “denser” language. And some languages use more inflections – special endings to indicate gender, number, or status – which English, for instance, largely dispenses with.

The researchers concluded that across the board, speakers of the languages they studied conveyed about the same amount of meaning in the same amount of time, whether by speaking faster or packing more meaning into their syllables.

MP3 Players ‘Shrink’ Our Personal Space (Science Daily)

Science Daily (Oct. 12, 2011) — How close could a stranger come to you before you start feeling uncomfortable? Usually, people start feeling uneasy when unfamiliar people come within an arm’s reach. But take the subway (underground rail) during rush hour and you have no choice but to get up close and personal with complete strangers.

Researchers at Royal Holloway, University of London wanted to find out whether there is a way to make this intrusion more tolerable. Their results, published in the journal PLoS One, reveal that listening to music through headphones can change people’s margins of personal space.

Dr Manos Tsakiris, from the Department of Psychology at Royal Holloway, said: “This distance we try to maintain between ourselves and others is a comfort zone surrounding our bodies. Everyone knows where the boundaries of their personal space are even though they may not consciously dictate them. Of course personal space can be modified for example in a number of relationships including family members and romantic partners, but on a busy tube or bus you can find complete strangers encroaching in this space.”

The study, led by Dr Tsakiris and Dr Ana Tajadura-Jiménez from Royal Holloway, involved asking volunteers to listen to positive or negative emotion-inducing music through headphones or through speakers. At the same time, a stranger started walking towards them and the participants were asked to say “stop” when they started feeling uncomfortable.

The results showed that when participants were listening to music that evoked positive emotions through headphones, they let the stranger come closer to them, indicating a change in their own personal space. Dr Tajadura-Jiménez explains: “Listening to music that induces positive emotions delivered through headphones shifts the margins of personal space. Our personal space “shrinks,” allowing others to get closer to us.”

Dr Tsakiris added: “So next time you are ready to board a packed train, turn on your mp3 player and let others come close to you without fear of feeling invaded.”

Some People’s Climate Beliefs Shift With Weather (Columbia University)

Study Shows Daily Malleability on a Long-Term Question

2011-04-06
Thermometer. Photo by domediart, Flickr

Social scientists are struggling with a perplexing earth-science question: as the power of evidence showing manmade global warming is rising, why do opinion polls suggest public belief in the findings is wavering? Part of the answer may be that some people are too easily swayed by the easiest, most irrational piece of evidence at hand: their own estimation of the day’s temperature.

In three separate studies, researchers affiliated with Columbia University’s Center for Research on Environmental Decisions (CRED) surveyed about 1,200 people in the United States and Australia, and found that those who thought the current day was warmer than usual were more likely to believe in and feel concern about global warming than those who thought the day was unusually cold. A new paper describing the studies appears in the current issue of the journal Psychological Science.

“Global warming is so complex, it appears some people are ready to be persuaded by whether their own day is warmer or cooler than usual, rather than think about whether the entire world is becoming warmer or cooler,” said lead author Ye Li, a postdoctoral researcher at the Columbia Business School’s Center for Decision Sciences, which is aligned with CRED. “It is striking that society has spent so much money, time and effort educating people about this issue, yet people are still so easily influenced.” The study says that “these results join a growing body of work showing that irrelevant environmental information, such as the current weather, can affect judgments. … By way of analogy, when asked about the state of the national economy, someone might look at the amount of money in his or her wallet, a factor with only trivial relevance.”

Ongoing studies by other researchers have already provided strong evidence that opinions on climate and other issues can hinge on factors unrelated to scientific observations. Most pointedly, repeated polls have shown that voters identifying themselves as political liberals or Democrats are far more likely to believe in human-influenced climate change than those who identify themselves as conservatives or Republicans. Women believe more than men, and younger people more than older ones. Other, yet-to-be published studies at four other universities have looked at the effects of actual temperature—either the natural one outside, or within a room manipulated by researchers—and show that real-time thermometer readings can affect people’s beliefs as well. These other studies involve researchers at New York University, Temple University, the University of Chicago and the University of California, Berkeley.

In the current paper, respondents were fairly good at knowing whether the day was unusually hot or cold (perceptions correlated with reality three quarters of the time), and that perception exerted a powerful pull on their attitudes. As expected, politics, gender and age all had the predicted influences: for instance, on the researchers’ 1-to-4 scale of belief in global warming, Democrats were 1.5 points higher than Republicans. On the whole, though, after controlling for the other factors, the researchers found that perceived temperatures still had nearly two-thirds the power of political belief, and six times the power of gender, to push someone one way or the other a notch along the scale. (The coming NYU/Temple study suggests that those with no strong political beliefs and lower education are the most easily swayed.)

In one of the studies described in the paper, the researchers tried to test the earnestness of the responses by seeing how many of those getting paid $8 for the survey were willing to donate to a real-life charity, Clean Air-Cool Planet. The correlation was strong; those who said it was warmer donated an average of about $2; those who felt it was cooler gave an average of 48 cents.

The researchers say the study does more than show that individuals’ beliefs can change, literally, with the wind. Li says it is possible that weather may have influenced recent large-scale public opinion polls showing declining faith in climate science. Administered at different times, future ones might turn out differently, he said. These polls, he pointed out, include the national elections, which always take place in November, when things are getting chilly and thus may be empowering conservative forces at a time when climate has become a far more contentious issue than in the past. (Some politicians subsequently played up the heavy snows and cold of winter 2009-2010 as showing global warming was a hoax—even though scientists pointed out that such weather was probably controlled by short-term atmospheric mechanisms, and consistent with long-term warming.) “I’m not sure I’d say that people are manipulated by the weather. But for some percentage of people, it’s certainly pushing them around,” said Li.

The other authors are Eric J. Johnson, co-director of the Center for Decision Sciences; and Lisa Zaval, a Columbia graduate student in psychology.

Original link: http://www.earth.columbia.edu/articles/view/2794

The great difficulty with good hypotheses

“There is one great difficulty with a good hypothesis. When it is completed and rounded, the corners smooth and the content cohesive and coherent, it is likely to become a thing in itself, a work of art. It is then like a finished sonnet or a painting completed. One hates to disturb it. Even if subsequent information should shoot a hole in it, one hates to tear it down because it once was beautiful and whole.”

From The Log from the Sea of Cortez, by John Steinbeck.

I am, therefore I’m right (Christian Science Monitor)

By Jim Sollisch / July 29, 2011

If you’ve ever been on a jury, you might have noticed that a funny thing happens the minute you get behind closed doors. Everybody starts talking about themselves. They say what they would have done if they had been the plaintiff or the defendant. They bring up anecdote after anecdote. It can take hours to get back to the points of law that the judge has instructed you to consider.

Being on a jury (I recently served on my fourth) reminds me why I can’t stomach talk radio. We Americans seem to have lost the ability to talk about anything but our own experiences. We can’t seem to generalize without stereotyping or to consider evidence that goes against our own experience.

I heard a doctor on a radio show the other day talking about a study that found that exercise reduces the incidence of Alzheimer’s. And caller after caller couldn’t wait to make essentially the opposite point: “Well, my grandmother never exercised and she lived to 95, sharp as a tack.” We are in an age summed up by the aphorism: “I experience, therefore I’m right.”

This isn’t a new phenomenon, except by degree. Historically, the hallmarks of an uneducated person were the inability to think critically, to use deductive reasoning, to distinguish the personal from the universal. Now that seems an apt description of many Americans. The culture of “I” is everywhere you look, from the iPod/iPhone/iPad to the fact that memoir is the fastest-growing literary genre.

How’d we get here? The same way we seem to get everywhere today: the Internet. The Internet has allowed us to segregate ourselves based on our interests. All cat lovers over here. All people who believe President Obama wasn’t born in the United States over there. For many of us, what we believe has become the most important organizing element in our lives. Once we all had common media experiences: Walter Cronkite, Ed Sullivan, a large daily newspaper. Now each of us can create a personal media network – call it the iNetwork – fed by the RSS feeds of our choosing.

But the Internet doesn’t just cordon us off in our own little pods. It also makes us dumber, as Nicholas Carr points out in his excellent book, “The Shallows: What the Internet is Doing to our Brains.” He argues that the way we consume media changes our brains, not just our behaviors. The Internet rewards shallow thinking: One search leads to thousands of results that skim over the surface of a subject.

Of course, we could dive deeply into any one of the listings, but we don’t. Studies show that people skim online; they don’t read. The experience has been designed to reward speed and variety, not depth. And there is tangible evidence, based on studies of brain scans, that the medium is changing our physical brains, strengthening the synapses and areas used for referential thinking while weakening the areas used for critical thinking.

And when we diminish our ability to think critically, we, in essence, become less educated. Less capable of reflection and meaningful conversation. Our experience, reinforced by a web of other gut instincts and experiences that match our own, becomes evidence. Case in point: the polarization of our politics. Exhibit A: the debt ceiling impasse.

Ironically, the same medium that helped mobilize people in the Arab world this spring is helping create a more rigid, dysfunctional democracy here: one that’s increasingly polarized, where each side is isolated and capable only of sound bites that skim the surface, a culture where deep reasoning and critical thinking aren’t rewarded.

The challenge for most of us isn’t to go backwards: We can’t disconnect from the Internet. Nor would we want to. But we can work harder to make “search” the metaphor it once was: to discover, not just to skim. The Internet lets us find facts in an instant. But it doesn’t stop us from finding insight, if we’re willing to really search.

Jim Sollisch is creative director at Marcus Thomas Advertising.

Dilma Rousseff – a favela with a presidential name (The Guardian)

Renaming of Brazilian shantytown puts spotlight on problems facing country’s 16 million citizens living in extreme poverty

Tom Phillips in Rio de Janeiro; guardian.co.uk, Monday 27 June 2011 16.58 BST

Three-month old Karen da Silva – the youngest resident of Dilma Rousseff – with her mother, 23-year-old Maria da Paixao Sequeira da Silva. Photograph: Tom Phillips

They call her Dilma Rousseff’s daughter: a dribbling three-month-old girl, coated in puppy fat and smothered by cooing relatives.

But Karen da Silva is no relation of Brazil’s first-ever female president. She is the first child to be born into one of the country’s newest favelas – the Comunidade Dilma Rousseff, a roadside shantytown on the western outskirts of Rio de Janeiro that was recently re-baptised with the name of the most powerful woman in the country.

“She’s Dilma’s baby,” said Vagner Gonzaga dos Santos, a 33-year-old bricklayer-cum-evangelical preacher and the brains behind the decision to change the name of this hitherto unknown favela.

Last month, just as Rousseff was about to complete six months in power, Santos says he received a heaven-sent message suggesting the renaming.

“God lit up my heart,” he said. “The idea was to pay homage to the president and also to get the attention of the government, of our leaders, so they look to us and help the families here. The poor are God’s children too.”

Until recently, the 30-odd shacks that flank the Rio-Sao Paulo highway were known simply as “kilometre 31”. But its transition to Dilma Rousseff has not been entirely smooth.

At first, locals plastered A4 posters on the area’s walls and front doors, announcing the new name. But the posters referred to the Comunidade “Roussef” – one “f” short of the president’s Bulgarian surname. In May a sign was erected welcoming visitors to their shantytown, but again spelling proved an issue. This time the name given was “Dilma Rusself.”

That mistake has now been corrected, after an intervention from the preacher’s wife, who took a pot of red nail varnish to the sign. Locals say the name-change is starting to pay off.

“It’s been good having the president’s name,” said Marlene Silva de Souza, a 57-year-old mother of five and one of the area’s oldest residents. “Now we can say our community’s name with pride. Before we didn’t have a name at all.”

Dozens of Brazilian newspapers have flocked to the community – poking fun at its misspelt sign but also drawing attention to the poor living conditions inside the favela.

“It has brought us a lot of attention … The repercussion has been marvellous. Today things are starting to take shape, things are improving,” said Santos, who hopes local authorities will now formally recognise the favela, bringing public services such as electricity and rubbish collection.

Still, problems abound. Raw sewage trickles out from the houses, through a patchwork of wooden shacks, banana and mango trees and an allotment where onions sprout amid piles of rubbish. Rats and cockroaches proliferate in the wasteland that encircles the area.

Ownership is also an issue. Dilma Rousseff is built on private land – “The owners are Spanish, I think,” says Santos – and on paper the community does not officially exist. Without a fixed abode Karen “Rousseff” da Silva – the favela’s firstborn child – has yet to be legally registered.

Last month the Brazilian government launched a drive to eradicate extreme poverty, unveiling programmes that will target 16 million of Brazil’s poorest citizens.

“My government’s most determined fight will be to eradicate extreme poverty and create opportunities for all,” Rousseff said in her inaugural address in January. “I will not rest while there are Brazilians who have no food on their tables, while there are desperate families on the streets [and] while there are poor children abandoned to their own fate.”

Residents of Rousseff’s namesake, who scratch a living selling biscuits and drinks to passing truck drivers, hope such benefits will soon reach them.

A visit from the president herself may also be on the cards, after Santos launched an appeal in the Brazilian media.

“We dream of her coming one day,” said the preacher, perched on a wooden bench outside his redbrick church, the House of Prayers. “It might be impossible for man to achieve, but for God everything is possible.”

Naming a community

Tear-jerking soap operas, political icons, stars of stage and screen – when it comes to baptising a Brazilian favela, all are fair game. The north-eastern city of Recife is home to favelas called Ayrton Senna, Planet of the Apes and Dancing Days, the title of a popular 1970s telenovela.

In the 1980s residents of a shantytown in Belo Horizonte named their community Rock in Rio – a tribute to the Brazilian rock festival that has played host to acts such as Neil Young, David Bowie and Queen.

Rio de Janeiro is home to the Boogie Woogie favela, the Kinder Egg favela and one community called Disneylandia. Vila Kennedy – a slum in west Rio – was named after the American president John F Kennedy and features a three-metre-tall replica of the Statue of Liberty. Nearby, locals christened another hilltop slum Jorge Turco or Turkish George. Jorge was reputedly a benevolent gangster who ruled the community decades ago.

Paul Virilio: “My foreign language is speed, the acceleration of the real” (L.M. DIPLOMATIQUE Brasil)

3 June 2011

INTERVIEW

“My foreign language is speed, the acceleration of the real”

by Guilherme Soares dos Santos

One of the leading figures in France today, the philosopher and urbanist Paul Virilio occupies a prominent place on the intellectual scene. A prolific writer, he has produced a steady run of books, exhibitions and articles, among them Speed and Politics, War and Cinema, The Critical Space, The Vision Machine and, recently, The Great Accelerator [not yet translated into Portuguese], in which he develops a critical, some say “catastrophist”, reading of modern technologies and of their accelerating effects on our behaviour and our perception of the world, at a moment when the world economy depends ever more on investment in technology.

Paul Virilio received the Brazilian philosopher Guilherme Soares dos Santos in Paris and spoke exclusively to Le Monde Diplomatique Brasil about his theses on the race, on the logic of speed. Seen by many as a reactionary or a visionary, he also talks about his own life.

VIRILIO: There is one fundamental thing that may explain the “catastrophist” streak I am accused of. I am a child of the war, a war baby, and that element has not been sufficiently understood, because there was an attempt to make people forget the war. There are two important moments in the Second World War (I lived through it; I was 10 years old; I was born in 1932). First came the lightning war, the Blitzkrieg. That aspect was censored, and some historians deny the blitz, that is, the fact that speed was at the root of the great ruin, first of Poland and then of France. That Blitzkrieg then exhausted itself in the countries of the east and in the Soviet Union, because there the depth of field allowed the blow to be absorbed. I am therefore a child of the Blitzkrieg; I would even say that I am perhaps the only one, the only one who has never since ceased to be marked by the power of speed. It is not only the power of transport, the tanks against the Polish cavalry drawing their sabres against the panzers… There is also the war of the airwaves, of which I am a son: “Pom, pom, pom, pom”… Radio London, which I listened to in the dark with my father. There are two capital moments: the Blitzkrieg and the deportation, which belongs, moreover, to the same movement of invasion. If the war of 1914 was a war of position in which armies exterminated one another in the same place for years, deportation led to the Shoah in the course of the Second World War.

DIPLOMATIQUE: You are deeply affected by the tragedies of our time. You even wanted a “Museum of Accidents” to be created, after the exhibition on that theme at the Cartier Foundation for Contemporary Art. You have pressed the idea, so far without success among the institutions, and now you propose the creation of a “University of Disaster”. What exactly is that? Could one not think that the war was, for you, a university of disaster?

VIRILIO: Indeed. When I speak of a “University of Disaster” it is not at all the disaster of the university; it is the opposite: I mean that “the worst provokes the best”. The European university appeared in Bologna and elsewhere around 1100, 1200, after the “great fear” of the year 1000, in opposition to the great barbarism. And that university was a Judeo-Christian, Greco-Latin and Arab collective. Some deny the great fear of the first millennium, just as some today deny the blitz. There is something there which, to my mind, is part of the secret of speed. If “time is money”, speed is power. Remember that for bankers, for there to be surplus value, there must be speed of exchange. The question of speed is a masked question; not masked by a conspiracy, but masked by its simplicity. Wealth and speed are linked. The link between wealth and power is well known, as is the law of the strongest. But the law of the strongest is the law of the fastest. The question of “dromology” is the question of speed, which today has changed in nature. In the beginning, speed is the treasure of the pharaohs; it is hoarding, that is, accumulation, and then, very quickly, it becomes speculation. And there the movement of accumulation passes into acceleration. The two are linked. Accumulation of treasure that becomes public treasury, then speculation, and today financialisation, with the automated high-frequency trading systems that make the stock exchange blow up. You see, we face something extraordinary: we do not know what speed is nowadays. People tell me a political economy of speed is needed, and indeed one is, but first we need a dromology, that is, to reveal in political life, in the broad sense of power, the nature of speed in our time. That speed has changed in nature. Essentially it was the revolution of transport. Until the twentieth century, until the blitz, the revolution in wealth was a revolution in transport: the horse, the ship, the train, the aeroplane, mechanical signals.

DIPLOMATIQUE: At the end of the twentieth century, we pass no longer through the revolution of transport but through that of instantaneous transmissions.

VIRILIO: During the war, still a small boy, I took part in the Resistance with my parents thanks to the war of the airwaves, which was already an electromagnetic war. A war of the speed of waves. Marconi and his invention were also a revolution of speed. The speed of electromagnetic waves, that is, of waves travelling at the speed of light, was beginning to be put to work. And of course, with television, computers and the Internet, we have entered a phase that today reaches its limit: the speed of light, at which human time, the time of negotiation, of speculation, at which the intelligence of the man, of the speculator, of the traders, is overtaken by automatisms. Indeed, when the market crashed on Wall Street on 6 May last year, there were 23,000 operations in a few thousandths of a second; the system broke down and billions were lost in ten minutes.

DIPLOMATIQUE: What worries you are the limits of human time?

VIRILIO: Yes, we must work on the nature of the power of speed today, because the speed of light is an absolute and it is the limit of human time. We are in “machine time”; human time is sacrificed as slaves were once sacrificed in the solar cults of old. I say it: we are in a new Enlightenment in which the speed of light is a cult. It is an absolute power hiding behind progress, and that is why I maintain that speed is the propaganda of progress. I have nothing against progress. When I say that we must “go more slowly”, some mock me. They think I condemn the revolution of transport, of trains, cars and aeroplanes, that I am against computers and against the Internet. That is not the level at which things are at stake…

DIPLOMATIQUE: What you fight is the acceleration of the real, which calls into question the perception of sensible appearances and of what phenomenology calls “being-in-the-world”?

VIRILIO: Yes. What the revolution of transport was to the acceleration of history and to migratory movements, the revolution of instantaneous transmissions is to the acceleration of perceived reality. It is a hallucinating, stupefying event. Speed is an intoxication, a drunkenness that can be “scopic” or sonic; hence, indeed, the breaking of the sound barrier. With telecommunications, the impact force of acceleration is used to push through things that are not part of public reality, that is, of real public space, but of private reality, or rather are transmitted in real time by private companies. To such a point that the question of imagination, and the philosophical question of “being-in-the-world”, of the here and now, become central. We are thus in a full-blown crisis of science, of what I call the “accident of knowledge”, the “accident of substances” and the “accident of distances”. This question of speed has, since Einstein, been at the heart of relativity, once special and now general, which is about to crash into a wall, what I call the “wall of time”. What happened on Wall Street interests me greatly, because the people of Wall Street crashed into the wall of time and the wall of money. It is a major political phenomenon at a moment when algorithms and computer programs dominate economic life, and I maintain that “special relativity” ought to be a problem addressed by the state. If the twentieth century was the century of the conquest of air and space, I think the twenty-first should question itself not only about nanotechnologies but also about nanochronologies, that is, about infinitesimal time, about the conquest of the “infinitely small of time”.

DIPLOMATIQUE: It seems to me that in your texts the style always seeks to echo the subject under study, and when you think about speed, it is the writing itself that must go fast.

VIRILIO: Absolutely. And in that sense, as Proust shows, every true writer writes “in a kind of foreign language”. My foreign language is speed, the acceleration of the real. As for the speed of my writing, it is the inheritance of the Futurists, and I feel dromology as a kind of musicology. The problem is neither to accelerate nor to decelerate, but to follow a melodic line.

DIPLOMATIQUE: Some say that you practise a “rhapsodic writing”.

VIRILIO: It is a matter of rhythm. Ancient societies were rhythmological societies. There were the calendar, the liturgy, the feasts that structured the melodic line of this or that society. Rhythms are very important, you know, for they are a matter of breath. When Bergson and Einstein met, they did not understand each other on this point. The first spoke of “duration”, of the living; the second, of the void and the swift. Know, however, that it will be necessary to reconcile them, otherwise the future of the twenty-first century will be a global chaos worse than Nazism or communism, one that has nothing to do with anarchy. No, a global chaos worse than everything!

DIPLOMATIQUE: Is that why your thought has become more and more dramatic and religious of late?

VIRILIO: I am a Catholic who converted as an adult; that is important. My father was a communist and my mother a Catholic. As it happens, I knew Abbé Pierre and worker-priests. But I remain alone on my path.

DIPLOMATIQUE: Some criticise you for describing situations that they regard as fanciful exaggerations, as when you describe the fear of solitude generated by telecommunications, notably by the Internet or the mobile phone. Might you not be carrying out a speculation on “possible worlds”, following a kind of “transcendental” method of investigation?

VIRILIO: I want to reunite what has been separated, I mean philosophy and physics. It is fundamentally a philosophical reinvention, to confront this mathematisation of the world, this rapidity that outstrips consciousness. I feel myself on the threshold of a philosophy without equal. Like Heraclitus or Parmenides, we are here at the origin; hence the University of Disaster. All its work would be a questioning of the “disaster of success”. What has just been described is the success of technoscience. Now, it is imperative to reconcile the two and to launch “philoscience”. What is at stake is the life or death of humanity. If man can no longer speak, and if he transfers the power of enunciation to devices, then we find ourselves facing a tyranny without equal. Physicist friends of mine are aware of this, of what is taking shape: an “accident of knowledge”. That leads us to the tree of life, whose only reference is at the origin of Genesis, that is, the myth of life… And there I say it as a Christian and as a writer: the accident of knowledge is original sin.

DIPLOMATIQUE: What does it mean in our time “to be wise”, when we are forced into ever greater and more incessant specialisation, and into the search for an “automated answer” in search engines and databases that far exceed what individual memory can encompass in a lifetime? How can we cultivate our lucidity amid this maddening tide of information?

VIRILIO: As far as I am most directly concerned, I am an urbanist, which means that I work on habitat. And what is proper to an urbanist is to work on inhabiting, on “being-here”. On “being-here-together”. That is what habitat is: the place of our habits. The two maintain a very close link, that is, the possibility of lasting; habit is what reproduces itself. It is “being-together”. Not simply the “being-together” of the socius, but the “being-together” of nature in the common habitat, with our sister the rain and our brother the sun, as the Franciscans would say… That is architecture! It is sheltering the living.

DIPLOMATIQUE: Faced with the contemporary excess of information, and with the ever-accelerating speed at which events unfold worldwide in the images on our screens, are we not witnessing a veritable deconstruction of general culture?

VIRILIO: You used the word “deconstruction” in your question. I believe Derrida was right for the end of the twentieth century. The beginning of the twentieth century is destruction pure and simple, through the ruin of cities, through the ruin of bodies. It is destruction; one cannot say that Auschwitz or Hiroshima are “deconstructions”… They are pure destructions. Now, I believe, and I say it and write it, that the twenty-first century will be disorientation, that is, the loss of all references, if humanity carries on like this, and it will not carry on. So I do not believe at all in the end of the world. What I mean is disorientation: we no longer know where we are, either in space or in time. And there, the geometer that I am, the architect that I am, knows what orientation is. Architecture comes first; it is composition; it is the common habitat between beings and things. For to be is to “be-in-the-world”, and that is what I say: the problem is not to be, but to be-in-the-world, in other words, to be-in-the-territorial-body. That has nothing to do with nationalism. One simply cannot be without “being-in-the-world”. In our time, however, the essential takes place in the void. Look around today: power is no longer geopolitical, bound to the soil; it is aeropolitical: waves, aeroplanes and rockets trace what is to come. History has transferred itself from the earth to the sky, with all the mystical dimension this supposes of adoration of the cosmos, of the great sidereal void, of propagating waves, and so on. Historical societies were geopolitical societies, that is, inscribed in places. The event, as I say, “takes place”; so there is a nature of the place that bears on the event. And that relation to “taking place” has been hidden. It is such a banal notion… It means, then, that I cannot be without having a place. It is not a problem of identity; no: situated, oriented, in situ, hic et nunc, here and now.

DIPLOMATIQUE: One last question. Do you think the economic system is already being upended by ecology?

VIRILIO: From now on, economy and ecology must merge, because the world is finite, because the world is too small for progress. We have exhausted the matter of the world, we have polluted its substance, and we have polluted its distances. And we stand before the imminent fusion of ecology and economy. One sees this clearly in the difficulties of the climate meeting in Copenhagen. One sees clearly how difficult it is, in the middle of an economic crisis, in the United States and in the world, to take ecological measures. So, inevitably, the fact that the Earth is too small for speed, for the speed of progress, demands the fusion of the two. Hence the importance of a University of Disaster and of the reinvention of a kind of thinking, of a collective intellectual, as the university of the origins once was. It is indispensable. So far it does not exist; we are at the origin of a new world. And I would like to be younger, so as to live in this New World that will be born in the pain of confrontation. But it is indispensable. No man, whatever his political color, is equal to this event, which resembles the Italian Renaissance… And at the same time it is so exciting! It is marvelous! As Karl Kraus said: “what a great era!”

Guilherme Soares dos Santos is a philosopher who holds a master’s degree in political philosophy and ethics from the Université Paris-Sorbonne and is currently a doctoral candidate in contemporary philosophy at the Université Paris 8, where he studies the thought of Gilles Deleuze.

Photo and translation: Guilherme Soares dos Santos

Keywords: Paul Virilio, speed, transformation, race, culture, theory

Why Are Spy Researchers Building a ‘Metaphor Program’? (The Atlantic)

MAY 25 2011, 4:19 PM ET

ALEXIS MADRIGAL – Alexis Madrigal is a senior editor at The Atlantic. He’s the author of Powering the Dream: The History and Promise of Green Technology.
A small research arm of the U.S. government’s intelligence establishment wants to understand how speakers of Farsi, Russian, English, and Spanish see the world by building software that automatically evaluates their use of metaphors. That’s right, metaphors, like Shakespeare’s famous line, “All the world’s a stage,” or more subtly, “The darkness pressed in on all sides.” Every speaker in every language in the world uses them effortlessly, and the Intelligence Advanced Research Projects Activity wants to know how what we say reflects our worldviews. They call it The Metaphor Program, and it is a unique effort within the government to probe how a people’s language reveals their mindset.

“The Metaphor Program will exploit the fact that metaphors are pervasive in everyday talk and reveal the underlying beliefs and worldviews of members of a culture,” declared an open solicitation for researchers released last week. A spokesperson for IARPA declined to comment at the time.

IARPA wants some computer scientists with experience in processing language in big chunks to come up with methods of pulling out a culture’s relationship with particular concepts. “They really are trying to get at what people think using how they talk,” Benjamin Bergen, a cognitive scientist at the University of California, San Diego, told me. Bergen is one of a dozen or so lead researchers who are expected to vie for a research grant that could be worth tens of millions of dollars over five years, if the teams can show progress toward automatically tagging and processing metaphors across languages.

“IARPA grants are big,” said Jennifer Carter of Applied Research Associates, a 1,600-strong research company that may throw its hat in the Metaphor ring after winning a lead research spot in a separate IARPA solicitation. While no one knows the precise value of IARPA’s awards, and the contracts are believed to vary widely, they tend to support several large teams of multidisciplinary researchers, Carter said. The awards, which would initially go to several teams, could range into the five digits annually. “Generally what happens… there will be a ‘downselect’ each year, so maybe only one team will get money for the whole program,” she said.*

All this to say: The Metaphor Program may represent a nine-figure investment by the government in understanding how people use language. But that’s because metaphor studies aren’t light or frilly, and IARPA isn’t afraid of taking on unusual-sounding projects if they think they might help intelligence analysts sort through and decode the tremendous amounts of data pouring into their minds.

In a presentation to prospective research “performers,” as they’re known, The Metaphor Program’s manager, Heather McCallum-Bayliss, gave the following example of the power of metaphors in political discussions. Her slide reads:

Metaphors shape how people think about complex topics and can influence beliefs. A study presented participants with a report on crime in a city; they were asked how crime should be addressed in the city. The report contained statistics, including crime and murder rates, as well as one of two metaphors, CRIME AS A WILD BEAST or CRIME AS A VIRUS. The participants were influenced by the embedded metaphor…

McCallum-Bayliss appears to be referring to a 2011 paper published in PLoS ONE, “Metaphors We Think With: The Role of Metaphor in Reasoning,” lead-authored by Stanford’s Paul Thibodeau. In that case, if people were given the crime-as-a-virus framing, they were more likely to suggest social reform and less likely to suggest more law enforcement or harsher punishments for criminals. The differences generated by the metaphor alternatives “were larger than those that exist between Democrats and Republicans, or between men and women,” the study authors noted.

Every writer (and reader) knows that there are clues to how people think and ways to influence each other through our use of words. Metaphor researchers, of whom there are a surprising number and variety, have formalized many of these intuitions into whole branches of cognitive linguistics using studies like the one outlined above (more on that later). But what IARPA’s project calls for is the deployment of spy resources against an entire language. Where you or I might parse a sentence, this project wants to parse, say, all the pages in Farsi on the Internet looking for hidden levers into the consciousness of a people.

“The study of language offers a strategic opportunity for improved counterterrorist intelligence, in that it enables the possibility of understanding of the Other’s perceptions and motivations, be he friend or foe,” the two authors of Computational Methods for Counterterrorism wrote. “As we have seen, linguistic expressions have levels of meaning beyond the literal, which it is critical to address. This is true especially when dealing with texts from a high-context traditionalist culture such as those of Islamic terrorists and insurgents.”

In the first phase of the IARPA program, the researchers would simply try to map from the metaphors a language used to the general affect associated with a concept like “journey” or “struggle.” These metaphors would then be stored in the metaphor repository. In a later stage, the Metaphor Program scientists will be expected to help answer questions like, “What are the perspectives of Pakistan and India with respect to Kashmir?” by using their metaphorical probes into the cultures. Perhaps, a slide from IARPA suggests, metaphors can tell us something about the way Indians and Pakistanis view the role of Britain or the concept of the “nation” or “government.”

The assumption is that common turns of phrase, dissected and reassembled through cognitive linguistics, could say something about the views of those citizens that they might not be able to say themselves. The language of a culture as reflected in a bunch of text on the Internet might hide secrets about the way people think that are so valuable that spies are willing to pay for them.

MORE THAN WORDS

IARPA is modeled on the famed DARPA — progenitor of the Internet among other wonders — and tasked with doing high-risk, high-reward research for the many agencies, the NSA and CIA among them, that make up the American intelligence-gathering force. IARPA is, as you might expect, a low-profile organization. Little information is available about it aside from a couple of interviews that its director, Lisa Porter, a former NASA official, gave back in 2008 to Wired and IEEE Spectrum. Neither publication can avoid joking that the agency is like James Bond’s famous research crew, but it turns out that the place is more likely to use “cloak-and-dagger” in a sentence than in actual combat with supervillainy.

A major component of the agency’s work is data mining and analysis. IARPA is split into three program offices with distinct goals: Smart Collection “to dramatically improve the value of collected data from all sources”; Incisive Analysis “to maximize insight from the information we collect, in a timely fashion”; and Safe & Secure Operations “to counter new capabilities implemented by our adversaries that would threaten our ability to operate freely and effectively in a networked world.” The Metaphor Program falls under the office of Incisive Analysis and is headed by the aforementioned McCallum-Bayliss, a former technologist at Lockheed Martin and IBM, who co-filed several patents relating to the processing of names in databases.

Incisive Analysis has put out several calls for other projects. They range widely in scope and domain. The Babel Program seeks to “demonstrate the ability to generate a speech transcription system for any new language within one week to support keyword search performance for effective triage of massive amounts of speech recorded in challenging real-world situations.” ALADDIN aims to create software to automatically monitor massive amounts of video. The FUSE Program is trying to “develop automated methods that aid in the systematic, continuous, and comprehensive assessment of technical emergence” using the scientific and patent literature.

All three projects are technologically exciting, but none of them has the poetic ring or the smell of humanity of The Metaphor Program. The Metaphor Program wants to understand what human beings mean through the unvoiced emotional inflection of our words. That’s normally the work of an examined life, not a piece of spy software.

There is some precedent for the work. It comes from two directions: cognitive linguistics and natural language processing. On the cognitive linguistic side, George Lakoff and Mark Johnson of the University of California, Berkeley did the foundational work, notably in their 1980 book, Metaphors We Live By. As summarized recently by Zoltán Kövecses in his book, Metaphor: A Practical Introduction, Lakoff and Johnson showed that metaphors weren’t just the devices of writers but rather “a valuable cognitive tool without which neither poets nor you and I as ordinary people could live.”

In this school of cognitive linguistics, we need to use more embodied, concrete domains in order to describe more abstract ones. Researchers assembled the linguistic expressions we use like “That class gave me food for thought” and “His idea was half-baked” into a construct called a “conceptual category.” These come in the form of awesomely simple sentences like “Ideas Are Food.” And there are whole great lists of them. (My favorites: Darkness Is a Solid; Time Is Something Moving Toward You; Happiness Is Fluid In a Container; Control Is Up.) The conceptual categories show that humans use one domain (“the source”) to describe another (“the target”). So, take Ideas Are Food: thinking is preparing food and understanding is digestion and believing is swallowing and learning is eating and communicating is feeding. Put simply: We import the logic of the source domain into the target domain.
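
In data terms, a conceptual category is just a named mapping from source-domain notions to target-domain ones. A toy Python representation (the structure is invented here for illustration; it is not Lakoff and Johnson’s notation) might look like this:

```python
# "Ideas Are Food": the source domain (food) lends its logic to the
# target domain (ideas). The pairings follow the ones listed above.
IDEAS_ARE_FOOD = {
    "preparing food": "thinking",
    "digestion": "understanding",
    "swallowing": "believing",
    "eating": "learning",
    "feeding": "communicating",
}

# Importing the logic of the source domain into the target domain:
for source, target in IDEAS_ARE_FOOD.items():
    print(f"In this category, {source} stands for {target}.")
```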


The main point here is that metaphors, in this sense, aren’t soft or literary in any narrow sense. Rather, they are a deep and fundamental way that humans make sense of the world. And unfortunately for spies who want to filter the Internet to look for dangerous people, computers can’t make much sense out of sentences like, “We can make beautiful music together,” which Google translates as something about actually playing music when, of course, it really means, “We can be good together.” (Or as the conceptual category would phrase it: “Interpersonal Harmony Is Musical Harmony.”)

While some of the underlying structures of the metaphors — the conceptual categories — are near universal (e.g. Happy Is Up), there are many variations in their range, elaboration, and emphasis. And, of course, not every category is universal. For example, Kövecses points to a special conceptual category in Japanese centered around the hara, or belly, “Anger Is (In The) Hara.” In Zulu, one finds an important category, “Anger Is (Understood As Being) In the Heart,” which would be rare in English. Alternatively, while many cultures conceive of anger as a hot fluid in a container, it’s in English that we “blow off steam,” a turn of phrase that wouldn’t make sense in Zulu.

These relationships have been painstakingly mapped by human analysts over the last 30 years and they represent a deep culturolinguistic knowledge base. For the cognitive linguistic school, all of these uses of language reveal something about the way the people of a culture understand each other and the world. And that’s really the target of the metaphor program, and what makes it unprecedented. They’re after a deeper understanding of the way people use words because the deep patterns encoded in language may help intelligence analysts understand the people, not just the texts.

For Lakoff, it’s about time that the government started taking metaphor seriously. “There have been 30 years of neglect of current linguistics in all government-sponsored research,” he told me. “And finally there is somebody in the government who has managed to do something after many years of trying.”

UC San Diego’s Bergen agreed. “It’s a totally unique project,” he said. “I’ve never seen anything like it.”

But that doesn’t mean it’s going to be easy to create a system that can automatically deduce Americans’ biases about education from a statement like “The teacher spoon-fed the students.”

Lakoff contends that it will take a long, sustained effort by IARPA (or anyone else) to complete the task. “The quick-and-dirty way” won’t work, he said. “Are they going to do a serious scientific account?”

BUILDING A METAPHOR MACHINE

The metaphor problem is particularly difficult because we don’t even know what the right answers to our queries are, Bergen said.

“If you think about other sorts of automation of language processing, there are right answers,” he said. “In speech recognition, you know what the word should be. So you can do statistical learning. You use humans, tag up a corpus and then run some machine learning algorithms on that. Unfortunately, here, we don’t know what the right answers are.”

For one, we don’t really have a stable way of telling what is and what is not metaphorical language. And metaphorical language is changing all the time. Parsing text for metaphors is tough work for humans and we’re made for it. The kind of intensive linguistic analysis that’s made Lakoff and his students (of whom Bergen was one) famous can take a human two hours for every 500 words on the page.

But it’s that very difficulty that makes people want to deploy computing resources instead of human beings. And they do have some directions that they could take. James Martin of the University of Colorado played a key role in the late 1980s and early 1990s in defining the problem and suggesting a solution. In a 1988 paper, Martin contended that “the interpretation of novel metaphors can be accomplished through the systematic extension, elaboration, and combination of knowledge about already well-understood metaphors.”

What that means is that within a given domain — say, “the family” in Arabic — you can start to process text around that. First you’ll have humans go in and tag up the data, finding the metaphors. Then, you’d use what they learned about the target domain “family” to look for metaphorical words that are often associated with it. Then, you run permutations on those words from the source domain to find other metaphors you might not have before. Eventually you build up a repository of metaphors in Arabic around the domain of family.
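
A minimal sketch of that bootstrapping loop might look like the following; the seed phrases, target terms, co-occurrence window, and function names are all invented for illustration, not taken from IARPA’s solicitation or Martin’s paper:

```python
from collections import Counter

# Hypothetical hand-tagged seeds: metaphorical phrases for the target
# domain "family" (illustrative only).
SEEDS = {"the family tree", "a broken home", "deep roots"}
TARGET_TERMS = {"family", "home", "kinship"}

def source_word_counts(sentences, window=4):
    """Count words that co-occur with target-domain terms; frequent
    neighbors are candidate source-domain vocabulary ('tree', 'roots')."""
    counts = Counter()
    for s in sentences:
        tokens = s.lower().split()
        for i, tok in enumerate(tokens):
            if tok in TARGET_TERMS:
                nearby = tokens[max(0, i - window):i + window + 1]
                counts.update(t for t in nearby if t not in TARGET_TERMS)
    return counts

def expand_repository(sentences, top_n=50):
    """Promote sentences that pair a target term with a frequent candidate
    source word: a crude version of the permutation step described above."""
    repository = set(SEEDS)
    candidates = {w for w, _ in source_word_counts(sentences).most_common(top_n)}
    for s in sentences:
        tokens = set(s.lower().split())
        if tokens & TARGET_TERMS and tokens & candidates:
            repository.add(s)
    return repository
```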

Of course, that’s not exactly what IARPA’s looking for, but it’s where the research teams will be starting. To get better results, they will have to learn a lot more about the relationships between the words in the metaphors. For Lakoff, that means understanding the frames and logics that inform metaphors and structure our thinking as we use them. For Bergen, it means refining the rules by which software can process language. There are three levels of analysis that would then be combined. First, you could know something about the metaphorical bias of an individual word. “Crossroads,” for example, is generally used in metaphorical terms. Second, words in close proximity might generate a bias, too. “Knockout in the same clause as ‘she’ has a much higher probability of being metaphorical if it’s in close proximity to ‘he,’” Bergen offered as an example. Third, for certain topics, certain words become more active for metaphorical usage. The economy’s movement, for example, probably maps to a source domain of motion through space, so “accelerate” used to describe the economy is probably metaphorical. Create a statistical model to combine the outputs of those three processes and you’ve got a brute-force method for identifying metaphors in a text.
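
Assembled naively, Bergen’s three signals suggest a scoring function along these lines; the word lists, weights, and probabilities below are placeholders rather than values from any real system:

```python
# Illustrative only: three per-word signals combined into a single
# metaphoricity score, following the brute-force recipe sketched above.

# 1. Prior metaphorical bias of individual words (hypothetical values).
WORD_BIAS = {"crossroads": 0.9, "accelerate": 0.5, "knockout": 0.4}

# 2. Nearby words that raise a word's metaphorical probability.
PROXIMITY_CUES = {"knockout": {"she", "he"}}

# 3. Topic-conditioned bias: motion verbs applied to the economy, etc.
TOPIC_BIAS = {("economy", "accelerate"): 0.9}

def metaphoricity(word, clause, topic, w1=0.4, w2=0.3, w3=0.3):
    """Weighted combination of the three signals for one word in context."""
    s1 = WORD_BIAS.get(word, 0.1)
    s2 = 1.0 if PROXIMITY_CUES.get(word, set()) & set(clause) else 0.0
    s3 = TOPIC_BIAS.get((topic, word), 0.1)
    return w1 * s1 + w2 * s2 + w3 * s3

# "The economy could accelerate" scores well above a literal baseline.
print(metaphoricity("accelerate", ["the", "economy", "could", "accelerate"], "economy"))
```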

In this particular competition, there will be more nuanced approaches based on parsing the more general relationships between words in text: sorting out which are nouns and how they connect to verbs, etc. “If you have that information, then you can find parts of sentences that don’t look like they should be there,” Bergen explained. A classic kind of identifier would be a type mismatch. “If I am the verb ‘smile,’ I like to have a subject that has a face,” he said. If something without a face is smiling, it might be an indication that some kind of figurative language is being employed.
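
The type-mismatch test is, in effect, a selectional-restriction check and can be sketched in a few lines; the tiny verb and noun lexicons here are invented for the example:

```python
# Verbs paired with the semantic type their literal subject should have.
# A mismatch flags possible figurative language (purely illustrative data).
VERB_SUBJECT_TYPE = {"smile": "has_face", "devour": "animate"}
NOUN_TYPES = {"girl": {"has_face", "animate"}, "sun": {"inanimate"}}

def possible_metaphor(subject, verb):
    """Return True when the subject lacks the type the verb selects for."""
    required = VERB_SUBJECT_TYPE.get(verb)
    if required is None:
        return False  # no restriction recorded for this verb
    return required not in NOUN_TYPES.get(subject, set())

print(possible_metaphor("girl", "smile"))  # False: a literal reading fits
print(possible_metaphor("sun", "smile"))   # True: something without a face is smiling
```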

From these constituent parts — and whatever other wild stuff people cook up — the teams will try to build a metaphor machine that can convert a language into underlying truths about a culture. Feed text in one end and wait on the other end of the Rube Goldberg software for a series of beliefs about family or America or power.

We might never be able to build such a thing. Indeed, I get the feeling that we can’t, at least not yet. But what if we can?

“Are they going to use it wisely?” Lakoff posed. “Because using it to detect terrorists is not a bad idea, but then the question is: Are they going to use it to spy on us?”

I don’t know, but I know that as an American I think through these metaphors: Problem Is a Target; Society Is a Body; Control Is Up.

* This section of the story was updated to more accurately reflect the intent of Carter’s statement.

Kari Norgaard on climate change denial

Understanding the climate ostrich

BBC News, 15 November 07
By Kari Marie Norgaard
Whitman College, US

Why do people find it hard to accept the increasingly firm messages that climate change is a real and significant threat to livelihoods? Here, a sociologist unravels some of the issues that may lie behind climate scepticism.

“I spent a year doing interviews and ethnographic fieldwork in a rural Norwegian community recently.

In winter, the signs of climate change were everywhere – glaringly apparent in an unfrozen lake, the first ever use of artificial snow at the ski area, and thousands of dollars in lost tourist revenues.

Yet as a political issue, global warming was invisible.

The people I spoke with expressed feelings of deep concern and caring, and a significant degree of ambivalence about the issue of global warming.

This was a paradox. How could the possibility of climate change be both deeply disturbing and almost completely invisible – simultaneously unimaginable and common knowledge?

Self-protection
People told me many reasons why it was difficult to think about this issue. In the words of one man, who held his hands in front of his eyes as he spoke, “people want to protect themselves a bit.”

Community members described fears about the severity of the situation, of not knowing what to do, fears that their way of life was in question, and concern that the government would not adequately handle the problem.

They described feelings of guilt for their own actions, and the difficulty of discussing the issue of climate change with their children.

In some sense, not wanting to know was connected to not knowing how to know. Talking about global warming went against conversation norms.

It wasn’t a topic that people were able to speak about with ease – rather, overall it was an area of confusion and uncertainty. Yet feeling this confusion and uncertainty went against emotional norms of toughness and maintaining control.

Other community members described this sense of knowing and not knowing, of having information but not thinking about it in their everyday lives.

As one young woman told me: “In the everyday I don’t think so much about it, but I know that environmental protection is very important.”

Security risk
The majority of us are now familiar with the basics of climate change.

Worst case scenarios threaten the very basics of our social, political and economic infrastructure.

Yet there has been less response to this environmental problem than to any other. Here in the US it seems that only now are we beginning to take it seriously.

How can this be? Why have so few of us engaged in any of the range of possible actions: reducing our airline travel, pressurising our governments and industries to cut emissions, or even talking about it with our family and friends in more than a passing manner?

Indeed, why would we want to know this information?

Why would we want to believe that scenarios of melting Arctic ice and spreading diseases that appear to spell ecological and social demise are in store for us; or even worse, that we see such effects already?

Information about climate change is deeply disturbing. It threatens our sense of individual identity and our trust in our government’s ability to respond.

At the deepest level, large scale environmental problems such as global warming threaten people’s sense of the continuity of life – what sociologist Anthony Giddens calls ontological security.

Thinking about global warming is also difficult for those of us in the developed world because it raises feelings of guilt. We are now aware of how driving automobiles and flying to exotic warm vacations contributes to the problem, and we feel guilty about it.

Tactful denial
If being aware of climate change is an uncomfortable condition which people are motivated to avoid, what happens next?

After all, ignoring the obvious can take a lot of work.

In the Norwegian community where I worked, collectively holding information about global warming at arm’s length took place by participating in cultural norms of attention, emotion, and conversation, and by using a series of cultural narratives to deflect disturbing information and normalise a particular version of reality in which “everything is fine.”

When what a person feels is different from what they want to feel, or are supposed to feel, they usually engage in what sociologists call emotional management.

We have a whole repertoire of techniques or “tools” for ignoring this and other disturbing problems.

As sociologist Eviatar Zerubavel makes clear in his work on the social organisation of denial and secrecy, the means by which we manage to ignore the disturbing realities in front of us are also collectively shaped.

How we cope, how we respond, or how we fail to respond are social as well.

Social rules of focusing our attention include rules of etiquette that involve tact-related ethical obligations to “look the other way” and ignore things we most likely would have noticed about others around us.

Indeed, in many cases, merely following our cultural norms of acceptable conversation and emotional expression serves to keep our attention safely away from that pesky topic of climate change.

Emotions of fear and helplessness can be managed through the use of selective attention: controlling one’s exposure to information, not thinking too far into the future, and focusing on something that could be done.

Selective attention can be used to decide what to think about or not to think about, for example screening out painful information about problems for which one does not have solutions: “I don’t really know what to do, so I just don’t think about that”.

The most effective way of managing unpleasant emotions, such as fear for one’s children, seems to be turning our attention to something else, or focusing it on something positive.

Hoodwinking ourselves?
Until recently, the dominant explanation within my field of environmental sociology for why people failed to confront climate change was that they were too poorly informed.

Others posit that Americans are simply too greedy or too individualistic, or suffer from incorrect mental models.

Psychologists have described “faulty” decision-making powers such as “confirmation bias”, and argue that with more appropriate analogies we will be able to manage the information and respond.

Political economists, on the other hand, tell us that we’ve been hoodwinked by increased corporate control of media that limits and moulds available information about global warming.

These are clearly important answers.

Yet the fact that nobody wants information about climate change to be true is a critical piece of the puzzle that also happens to fit perfectly with the agenda of those who have tried to generate climate scepticism.”

Dr Kari Marie Norgaard is a sociologist at Whitman College in Walla Walla, Washington state, US.

See also A Dialog Between Renee Lertzman and Kari Norgaard.

It’s Even Less in Your Genes (The New York Review of Books)

MAY 26, 2011
Richard C. Lewontin

The Mirage of a Space Between Nature and Nurture
by Evelyn Fox Keller
Duke University Press, 107 pp., $64.95; $18.95 (paper)

In trying to analyze the natural world, scientists are seldom aware of the degree to which their ideas are influenced both by their way of perceiving the everyday world and by the constraints that our cognitive development puts on our formulations. At every moment of perception of the world around us, we isolate objects as discrete entities with clear boundaries while we relegate the rest to a background in which the objects exist.

That tendency, as Evelyn Fox Keller’s new book suggests, is one of the most powerful influences on our scientific understanding. As our intent changes, we identify anew what is object and what is background. When I glance out the window as I write these lines I notice my neighbor’s car, its size, its shape, its color, and I note that it is parked in a snow bank. My interest then changes to the results of the recent storm and it is the snow that becomes my object of attention, with the car relegated to the background of shapes embedded in the snow. What is an object as opposed to background is a mental construct and requires the identification of clear boundaries. As one of my children’s favorite songs reminded them:

You gotta have skin.
All you really need is skin.
Skin’s the thing that if you’ve got it outside,
It helps keep your insides in.
Organisms have skin, but their total environments do not. It is by no means clear how to delineate the effective environment of an organism.

One of the complications is that the effective environment is defined by the life activities of the organism itself. “Fish gotta swim and birds gotta fly,” as we are reminded by yet another popular lyric. Thus, as organisms evolve, their environments necessarily evolve with them. Although classic Darwinism is framed by referring to organisms adapting to environments, the actual process of evolution involves the creation of new “ecological niches” as new life forms come into existence. Part of the ecological niche of an earthworm is the tunnel excavated by the worm and part of the ecological niche of a tree is the assemblage of fungi associated with the tree’s root system that provide it with nutrients.

The vulgarization of Darwinism that sees the “struggle for existence” as nothing but the competition for some environmental resource in short supply ignores the large body of evidence about the actual complexity of the relationship between organisms and their resources. First, despite the standard models created by ecologists in which survivorship decreases with increasing population density, the survival of individuals in a population is often greatest not when their “competitors” are at their lowest density but at an intermediate one. That is because organisms are involved not only in the consumption of resources, but in their creation as well. For example, in fruit flies, which live on yeast, the worm-like immature stages of the fly tunnel into rotting fruit, creating more surface on which the yeast can grow, so that, up to a point, the more larvae, the greater the amount of food available. Fruit flies are not only consumers but also farmers.

Second, the presence in close proximity of individual organisms that are genetically different can increase the growth rate of a given type, presumably since they exude growth-promoting substances into the soil. If a rice plant of a particular type is planted so that it is surrounded by rice plants of a different type, it will give a higher yield than if surrounded by its own type. This phenomenon, known for more than a half-century, is the basis of a common practice of mixed-variety rice cultivation in China, and mixed-crop planting has become a method used by practitioners of organic agriculture.

Despite the evidence that organisms do not simply use resources present in the environment but, through their life activities, produce such resources and manufacture their environments, the distinction between organisms and their environments remains deeply embedded in our consciousness. Partly this is due to the inertia of educational institutions and materials. As a coauthor of a widely used college textbook of genetics,(1) I have had to engage in a constant struggle with my coauthors over the course of thirty years in order to introduce students to the notion that the relative reproductive fitness of organisms with different genetic makeups may be sensitive to their frequency in the population.

But the problem is deeper than simply intellectual inertia. It goes back, ultimately, to the unconsidered differentiations we make—at every moment when we distinguish among objects—between those in the foreground of our consciousness and the background places in which the objects happen to be situated. Moreover, this distinction creates a hierarchy of objects. We are conscious not only of the skin that encloses and defines the object, but of bits and pieces of that object, each of which must have its own “skin.” That is the problem of anatomization. A car has a motor and brakes and a transmission and an outer body that, at appropriate moments, become separate objects of our consciousness, objects that at least some knowledgeable person recognizes as coherent entities.

It has been an agony of biology to find boundaries between parts of organisms that are appropriate for an understanding of particular questions. We murder to dissect. The realization of the complex functional interactions and feedbacks that occur between different metabolic pathways has been a slow and difficult process. We do not have simply an “endocrine system” and a “nervous system” and a “circulatory system,” but “neurosecretory” and “neurocirculatory” systems that become the objects of inquiry because of strong forces connecting them. We may indeed stir a flower without troubling a star, but we cannot stir up a hornet’s nest without troubling our hormones. One of the ironies of language is that we use the term “organic” to imply a complex functional feedback and interaction of parts characteristic of living “organisms.” But musical organs, from which the word was adopted, have none of the complex feedback interactions that organisms possess. Indeed the most complex musical organ has multiple keyboards, pedal arrays, and a huge array of stops precisely so that different notes with different timbres can be played simultaneously and independently.

Evelyn Fox Keller sees “The Mirage of a Space Between Nature and Nurture” as a consequence of our false division of the world into living objects without sufficient consideration of the external milieu in which they are embedded, since organisms help create effective environments through their own life activities. Fox Keller is one of the most sophisticated and intelligent analysts of the social and psychological forces that operate in intellectual life and, in particular, of the relation of gender in our society both to the creation and acceptance of scientific ideas. The central point of her analysis has been that gender itself (as opposed to sex) is socially constructed, and that construction has influenced the development of science:

If there is a single point on which all feminist scholarship…has converged, it is the importance of recognizing the social construction of gender…. All of my work on gender and science proceeds from this basic recognition. My endeavor has been to call attention to the ways in which the social construction of a binary opposition between “masculine” and “feminine” has influenced the social construction of science.(2)

Beginning with her consciousness of the role of gender in influencing the construction of scientific ideas, she has, over the last twenty-five years, considered how language, models, and metaphors have had a determinative role in the construction of scientific explanation in biology.

A major critical concern of Fox Keller’s present book is the widespread attempt to partition in some quantitative way the contribution made to human variation by differences in biological inheritance, that is, differences in genes, as opposed to differences in life experience. She wants to make clear a distinction between analyzing the relative strength of the causes of variation among individuals and groups, an analysis that is coherent in principle, and simply assigning the relative contributions of biological and environmental causes to the value of some character in an individual.

It is, for example, all very well to say that genetic variation is responsible for 76 percent of the observed variation in adult height among American women while the remaining 24 percent is a consequence of differences in nutrition. The implication is that if all variation in nutrition were abolished then 24 percent of the observed height variation among individuals in the population in the next generation would disappear. To say, however, that 76 percent of Evelyn Fox Keller’s height was caused by her genes and 24 percent by her nutrition does not make sense. The nonsensical implication of trying to partition the causes of her individual height would be that if she never ate anything she would still be three quarters as tall as she is.
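
In the standard notation of quantitative genetics, the coherent population-level statement is simply a variance partition (using the same 76/24 figures as above):

```latex
V_P = V_G + V_E, \qquad h^2 = \frac{V_G}{V_P} = 0.76, \qquad \frac{V_E}{V_P} = 0.24
```

Abolishing all variation in nutrition sets V_E = 0, so the variance of height in the next generation shrinks to 0.76 V_P. Nothing in the partition assigns 76 percent of any one woman’s height to her genes: a single measurement has no variance to partition.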

In fact, Keller is too optimistic about the assignment of causes of variation even when considering variation in a population. As she herself notes parenthetically, the assignment of relative proportions of population variation to different causes in a population depends on there being no specific interaction between the causes. She gives as a simple example the sound of two different drummers playing at a distance from us. If each drummer plays each drum for us, we should be able to tell the effect of different drummers as opposed to differences between drums. But she admits that is only true if the drummers themselves do not change their ways of playing when they change drums.

Keller’s rather casual treatment of the interaction between causal factors in the case of the drummers, despite her very great sophistication in analyzing the meaning of variation, is a symptom of a fault that is deeply embedded in the analytic training and thinking of both natural and social scientists. If there are several variable factors influencing some phenomenon, how are we to assign the relative importance to each in determining total variation? Let us take an extreme example. Suppose that we plant seeds of each of two different varieties of corn in two different locations with the following results measured in bushels of corn produced (see Table 1).

There are differences between the varieties in their yield from location to location, and there are differences between locations from variety to variety. So both variety and location matter. But there is no average variation between locations when averaged over varieties, or between varieties when averaged over locations. Knowing the variation in yield associated with location and variety separately does not tell us which factor is the more important source of variation; nor do the facts of location and variety exhaust the description of that variation.

There is a third source of variation, called the “interaction,” the variation that cannot be accounted for simply by the separate average effects of location and variety. In our example, no difference appears between the averages of the different varieties or the averages of the different locations, suggesting that neither location nor variety matters to yield. Yet the yields of corn are different when particular combinations of variety and location are observed. These effects of particular combinations of factors, not accounted for by the average effects of each factor separately, are thrown into an unanalyzed category called “interaction,” with no concrete physical model made explicit.
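
Since Table 1 is not reproduced here, a hypothetical 2×2 table built to match the description makes the arithmetic concrete (the bushel figures are invented; any table with this crossing pattern behaves the same way):

```python
# Hypothetical yields in bushels: each factor matters within any single
# row or column, yet both average ("main") effects are exactly zero.
yields = {("A", 1): 10, ("A", 2): 20,   # variety A in locations 1 and 2
          ("B", 1): 20, ("B", 2): 10}   # variety B in locations 1 and 2

grand_mean = sum(yields.values()) / 4                                    # 15.0
variety_means = {v: (yields[(v, 1)] + yields[(v, 2)]) / 2 for v in "AB"}
location_means = {l: (yields[("A", l)] + yields[("B", l)]) / 2 for l in (1, 2)}

print(variety_means)   # {'A': 15.0, 'B': 15.0} -> no average variety effect
print(location_means)  # {1: 15.0, 2: 15.0}     -> no average location effect

# With both main effects zero, every deviation from the grand mean is
# "interaction": variation tied to particular variety-location pairings.
interaction = {cell: y - grand_mean for cell, y in yields.items()}
print(interaction)     # {('A', 1): -5.0, ('A', 2): 5.0, ('B', 1): 5.0, ('B', 2): -5.0}
```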

In real life there will be some difference between the varieties when averaged over locations and some variation between locations when averaged over varieties; but there will also be some interaction variation accounting for the failure of the separately identified main effects to add up to the total variation. In an extreme case, as for example our jungle drummers with a common consciousness of what drums should sound like, it may turn out to be all interaction.

The Mirage of a Space Between Nature and Nurture appears in an era when biological—and specifically, genetic—causation is taken as the preferred explanation for all human physical differences. Although the early and mid-twentieth century was a period of immense popularity of genetic explanations for class and race differences in mental ability and temperament, especially among social scientists, such theories have now virtually disappeared from public view, largely as a result of a considerable effort of biologists to explain the errors of those claims.

The genes for IQ have never been found. Ironically, at the same time that genetics has ceased to be a popular explanation for human intellectual and temperamental differences, genetic theories for the causation of virtually every physical disorder have become the mode. “DNA” has replaced “IQ” as the abbreviation of social import. The announcement in February 2001 that two groups of investigators had sequenced the entire human genome was taken as the beginning of a new era in medicine, an era in which all diseases would be treated and cured by the replacement of faulty DNA. William Haseltine, the chairman of the board of the private company Human Genome Sciences, which participated in the genome project, assured us that “death is a series of preventable diseases.” Immortality, it appeared, was around the corner. For nearly ten years announcements of yet more genetic differences between diseased and healthy individuals were a regular occurrence in the pages of The New York Times and in leading general scientific publications like Science and Nature.

Then, on April 15, 2009, there appeared in The New York Times an article by the influential science reporter and fan of DNA research Nicholas Wade, under the headline “Study of Genes and Diseases at an Impasse.” In the same week the journal Science reported that DNA studies of disease causation had a “relatively low impact.” Both of these articles were instigated by several articles in The New England Journal of Medicine, which had come to the conclusion that the search for genes underlying common causes of mortality had so far yielded virtually nothing useful. The failure to find such genes continues and it seems likely that the search for the genes causing most common diseases will go the way of the search for the genes for IQ.

A major problem in understanding what geneticists have found out about the relation between genes and manifest characteristics of organisms is an overly flexible use of language that creates ambiguities of meaning. In particular, their use of the terms “heritable” and “heritability” is so confusing that an attempt at its clarification occupies the last two chapters of The Mirage of a Space Between Nature and Nurture. When a biological characteristic is said to be “heritable,” it means that it is capable of being transmitted from parents to offspring, just as money may be inherited, although neither is inevitable. In contrast, “heritability” is a statistical concept, the proportion of variation of a characteristic in a population that is attributable to genetic variation among individuals. The implication of “heritability” is that some proportion of the next generation will possess it.

The move from “heritable” to “heritability” is a switch from a qualitative property at the level of an individual to a statistical characterization of a population. Of course, to have a nonzero heritability in a population, a trait must be heritable at the individual level. But it is important to note that even a trait that is perfectly heritable at the individual level might have essentially zero heritability at the population level. If I possess a unique genetic variant that enables me with no effort at all to perform a task that many other people have learned to do only after great effort, then that ability is heritable in me and may possibly be passed on to my children, but it may also be of zero heritability in the population.
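
The textbook single-locus model formalizes this point (a standard quantitative-genetics identity, applied here purely for illustration): for a biallelic gene with allele frequency p and additive effect α, the additive genetic variance is

```latex
V_A = 2p(1-p)\alpha^2
```

As p approaches zero, the case of a variant carried by essentially one individual, V_A and with it the heritability V_A / V_P also approach zero, no matter how large α is or how faithfully the variant is transmitted from parent to child.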

One of the problems of exploring an intellectual discipline from the outside is that the importance of certain basic methodological considerations is not always apparent to the observer, considerations that mold the entire intellectual structure that characterizes the field. So, in her first chapter, “Nature and Nurture as Alternatives,” Fox Keller writes that “my concern is with the tendency to think of nature and nurture as separable and hence as comparable, as forces to which relative strength can be assigned.” That concern is entirely appropriate for an external critic, and especially one who, like Fox Keller, comes from theoretical physics rather than experimental biology. Experimental geneticists, however, find environmental effects a serious distraction from the study of genetic and molecular mechanisms that are at the center of their interest, so they do their best to work with cases in which environmental effects are at a minimum or in which those effects can be manipulated at will. If the machine model of organisms that underlies our entire approach to the study of biology is to work for us, we must restrict our objects of study to those in which we can observe and manipulate all the gears and levers.

For much of the history of experimental genetics the chief organism of study was the fruit fly, Drosophila melanogaster, in which very large numbers of different gene mutations with visible effects on the form and behavior of the flies had been discovered. The catalog of these mutations described, in addition to genetic information, a description of the way in which mutant flies differed from normal (“wild type”) and assigned each mutation a “Rank” between 1 and 4. Rank 1 mutations were the most reliable for genetic study because every individual with the mutant genetic type could be easily and reliably recognized by the observer, whereas some proportion of individuals carrying mutations of other ranks could be indistinguishable from normal, depending on the environmental conditions in which they developed. Geneticists, if they could, avoided depending on poorer-rank mutations for their experiments. Only about 20 percent of known mutations were of Rank 1.

With the recent shift from the study of classical genes in controlled breeding experiments to the sequencing of DNA as the standard method of genetic study, the situation has gotten much worse. On the one hand, about 99 percent of the DNA in a cell is of completely unknown functional significance and any two unrelated individuals will differ from each other at large numbers of DNA positions. On the other hand, the attempt to assign the causes of particular diseases and metabolic malfunctions in humans to specific mutations has been a failure, with the exception of a few classical cases like sickle-cell anemia. The study of genes for specific diseases has indeed been of limited value. The reason for that limited value is in the very nature of genetics as a way of studying organisms.

Genetics, from its very beginning, has been a “subtractive” science. That is, it is based on the analysis of the difference between natural or “wild-type” organisms and those with some genetic defect that may interfere in some observable way with regular function. But to carry out such comparison it is necessary that the organisms being studied are, to the extent possible, identical in all other respects, and that the comparison is carried out in an environment that does not, itself, generate atypical responses yet allows the possible effect of the genetic perturbation to be observed. We must face the possibility that such a subtractive approach will never be able to reveal the way in which nature and nurture interact in normal circumstances.

An alternative to the standard subtractive method of genetic perturbations would be a synthetic approach in which living systems would be constructed ab initio from their molecular elements. It is now clear that most of the DNA in an organism is not contained in genes in the usual sense. That is, 98–99 percent of the DNA is not a code for a sequence of amino acids that will be assembled into long chains that will fold up to become the proteins that are essential to the formation of organisms; yet that nongenic DNA is transmitted faithfully from generation to generation just like the genic DNA.

It appears that the sequence of this nongenic DNA, which used to be called “junk-DNA,” is concerned with regulating how often, when, and in which cells the DNA of genes is read in order to produce the long strings of amino acids that will be folded into proteins and which of the many alternative possible foldings will occur. As the understanding and possibility of control of the synthesis of the bits and pieces of living cells become more complete, the temptation to create living systems from elementary bits and pieces will become greater and greater. Molecular biologists, already intoxicated with their ability to manipulate life at its molecular roots, are driven by the ambition to create it. The enterprise of “Synthetic Biology” is already in existence.

In May 2010 the consortium originally created by J. Craig Venter to sequence the human genome gave birth to a new organization, Synthetic Genomics, which announced that it had created an organism by implanting a synthetic genome in a bacterial cell whose own original genome had been removed. The cell then proceeded to carry out the functions of a living organism, including reproduction. One may argue that the hardest work, putting together all the rest of the cell from bits and pieces, is still to be done before it can be said that life has been manufactured, but even Victor Frankenstein started with a dead body. We all know what the consequences of that may be.

1. Anthony J.F. Griffiths, Susan R. Wessler, Sean B. Carroll, and Richard C. Lewontin, Introduction to Genetic Analysis, ninth edition (W.H. Freeman, 2008).

2. The Scientist, Vol. 5, No. 1 (January 7, 1991).