Tag archive: Semiotics

Interactive festival lets visitors experience environmental disaster scenarios (Agência Brasil)

June 1, 2012 – 10:42 a.m.

By Thais Leitão, Agência Brasil

Rio de Janeiro – A forest that bursts into flames, endangering its animals and vegetation; a pristine glacier that suddenly begins to melt; a house hit by flooding. All of these situations, caused by environmental imbalance, can be experienced by the public during the Green Nation Fest, an interactive and sensory festival that opened today (May 31) at Quinta da Boa Vista, in Rio de Janeiro's north zone, and runs through June 7.

According to Marcos Didonet, director of the nongovernmental organization Centro de Cultura, Informação e Meio Ambiente (Cima), which organizes the event, the goal is to give visitors hands-on experiences and encourage the public to act more sustainably. Cima has been developing initiatives in partnership with private, governmental, and multilateral institutions for more than 20 years.

“The aim is to reach the general public, who are not used to engaging with environmental issues, by presenting the subject in a more interesting, enjoyable, and practical way. To that end, our artists and scientists devised these installations, which produce the kinds of sensations that will become ever more frequent if we do not change our consumption patterns and everyday behavior,” he said.

The site also has tents hosting playful, educational workshops. In one of them, set up by the State Environmental Institute (Inea), a group of 30 students from Rio's municipal school system learned today how to make wallets out of milk cartons and fabric scraps.

For student Ana Beatriz Leão, 14, the idea is creative and the wallets could make gifts for friends. “It's cool because we usually just throw the cartons in the trash, and now we know you can make other things with them. The one I made I'm going to give to a friend who I'm sure will like it,” she said.

In the same tent, visitors can see other products made from reused materials, such as a small drum kit made from soda cans, children's books made with fabric scraps, and dolls made from shoeboxes.

Among the boys, one of the favorite activities is Gol de Bicicleta (Bicycle Goal), in which participants pedal to generate energy for their team. For every watt generated, a goal is scored for the team of their choice. The pedaling also charges a battery that powers another installation at the festival.

Friends Gustavo Fonseca and Roberto Damião, both 11 and also students in Rio's municipal schools, said the experience was “very intense.”

“It was really cool because we learned another way to generate energy and even scored a goal for Mengão,” said Roberto, a Flamengo fan.

The event, which is free to attend, also features an International Film Showcase, with 12 feature films, and seminars with Brazilian and international guests on the green and creative economy, which will be open to debate. The full program is available at www.greennationfest.com.br.

* Originally published on the Agência Brasil website.

 

How Bad Is It? (The New Inquiry)

By GEORGE SCIALABBA

Jasper Johns, Green Flag, 1956 (Graphite pencil, crayon and collage on paper)

Pretty bad. Here is a sample of factlets from surveys and studies conducted in the past twenty years. Seventy percent of Americans believe in the existence of angels. Fifty percent believe that the earth has been visited by UFOs; in another poll, 70 percent believed that the U.S. government is covering up the presence of space aliens on earth. Forty percent did not know whom the U.S. fought in World War II. Forty percent could not locate Japan on a world map. Fifteen percent could not locate the United States on a world map. Sixty percent of Americans have not read a book since leaving school. Only 6 percent now read even one book a year. According to a very familiar statistic that nonetheless cannot be repeated too often, the average American’s day includes six minutes playing sports, five minutes reading books, one minute making music, 30 seconds attending a play or concert, 25 seconds making or viewing art, and four hours watching television.

Among high-school seniors surveyed in the late 1990s, 50 percent had not heard of the Cold War. Sixty percent could not say how the United States came into existence. Fifty percent did not know in which century the Civil War occurred. Sixty percent could name each of the Three Stooges but not the three branches of the U.S. government. Sixty percent could not comprehend an editorial in a national or local newspaper.

Intellectual distinction isn’t everything, it’s true. But things are amiss in other areas as well: sociability and trust, for example. “During the last third of the twentieth century,” according to Robert Putnam in Bowling Alone, “all forms of social capital fell off precipitously.” Tens of thousands of community groups — church social and charitable groups, union halls, civic clubs, bridge clubs, and yes, bowling leagues — disappeared; by Putnam’s estimate, one-third of our social infrastructure vanished in these years. Frequency of having friends to dinner dropped by 45 percent; card parties declined 50 percent; Americans’ declared readiness to make new friends declined by 30 percent. Belief that most other people could be trusted dropped from 77 percent to 37 percent. Over a five-year period in the 1990s, reported incidents of aggressive driving rose by 50 percent — admittedly an odd, but probably not an insignificant, indicator of declining social capital.

Still, even if American education is spotty and the social fabric is fraying, the fact that the U.S. is the world’s richest nation must surely make a great difference to our quality of life? Alas, no. As every literate person knows, economic inequality in the United States is off the charts – at third-world levels. The results were recently summarized by James Speth in Orion magazine. Of the 20 advanced democracies in the Organization for Economic Cooperation and Development (OECD), the U.S. has the highest poverty rate, for both adults and children; the lowest rate of social mobility; the lowest score on UN indexes of child welfare and gender inequality; the highest ratio of health care expenditure to GDP, combined with the lowest life expectancy and the highest rates of infant mortality, mental illness, obesity, inability to afford health care, and personal bankruptcy resulting from medical expenses; the highest homicide rate; and the highest incarceration rate. Nor are the baneful effects of America’s social and economic order confined within our borders; among OECD nations the U.S. also has the highest carbon dioxide emissions, the highest per capita water consumption, the next-to-largest ecological footprint, the next-to-lowest score on the Yale Environmental Performance Index, the highest (by a colossal margin) per capita rate of military spending and arms sales, and the next-to-lowest rate of per capita spending on international development and humanitarian assistance.

Contemplating these dreary statistics, one might well conclude that the United States is — to a distressing extent — a nation of violent, intolerant, ignorant, superstitious, passive, shallow, boorish, selfish, unhealthy, unhappy people, addicted to flickering screens, incurious about other societies and cultures, unwilling or unable to assert or even comprehend their nominal political sovereignty. Or, more simply, that America is a failure.

That is indeed what Morris Berman concludes in his three-volume survey of America’s decline: The Twilight of American Culture (2000), Dark Ages America (2006), and Why America Failed (2011), from which much of the preceding information is taken. Berman is a cultural and intellectual historian, not a social scientist, so his portrait of American civilization, or barbarism, is anecdotal and atmospheric as well as statistical. He is eloquent about harder-to-quantify trends: the transformation of higher (even primary/secondary) education into marketing arenas for predatory corporations; the new form of educational merchandising known as “distance learning”; the colonization of civic and cultural spaces by corporate logos; the centrality of malls and shopping to our social life; the “systematic suppression of silence” and the fact that “there is barely an empty space in our culture not already carrying commercial messages.” Idiot deans, rancid rappers, endlessly chattering sports commentators, an avalanche of half-inch-deep self-help manuals; a plague of gadgets, a deluge of stimuli, an epidemic of rudeness, a desert of mutual indifference: the upshot is our daily immersion in a suffocating stream of kitsch, blather, stress, and sentimental banality. Berman colorfully and convincingly renders the relentless coarsening and dumbing down of everyday life in late (dare we hope?) American capitalism.

In Spenglerian fashion, Berman seeks the source of our civilization’s decline in its innermost principle, its animating Geist. What he finds at the bottom of our culture’s soul is … hustling; or, to use its respectable academic sobriquet, possessive individualism. Expansion, accumulation, economic growth: this is the ground bass of American history, like the hum of a dynamo in the basement beneath the polite twitterings on the upper stories about “liberty” and “a light unto the nations.” Berman scarcely mentions Marx or historical materialism; instead he offers a nonspecialist and accessible but deeply informed and amply documented review of American history, period by period, war by war, arguing persuasively that whatever the ideological superstructure, the driving energy behind policy and popular aspiration has been a ceaseless, soulless acquisitiveness.

The colonial period, the seedbed of American democracy, certainly featured a good deal of God-talk and virtue-talk, but Mammon more than held its own. Berman sides emphatically with Louis Hartz, who famously argued in The Liberal Tradition in America that American society was essentially Lockean from the beginning: individualistic, ambitious, protocapitalist, with a weak and subordinate communitarian ethic. He finds plenty of support elsewhere as well; for example in Perry Miller, the foremost historian of Puritanism, according to whom the American mind has always “positively lusted for the chance to yield itself to the gratification of technology.” Even Tocqueville, who made many similar observations, “could not comprehend,” wrote Miller, “the passion with which [early Americans] flung themselves into the technological torrent, how they … cried to each other as they went headlong down the chute that here was their destiny, here was the tide that would sweep them toward the unending vistas of prosperity.” Even Emerson and Whitman went through a phase of infatuation with industrial progress, though Hawthorne and Thoreau apparently always looked on the juggernaut with clearer (or more jaundiced) eyes.

Berman also sides, for the most part, with Charles Beard, who drew attention to the economic conflicts underlying the American Revolution and the Civil War. Beard may have undervalued the genuine intellectual ferment that accompanied the Revolution, but he was not wrong in perceiving the motivating force of the pervasive commercial ethic of the age. Joyce Appleby, another eminent historian, poses this question to those who idealize America’s founding: “If the Revolution was fought in a frenzy over corruption, out of fear of tyranny, and with hopes for redemption through civic virtue, where and when are scholars to find the sources for the aggressive individualism, the optimistic materialism, and the pragmatic interest-group politics that became so salient so early in the life of the nation?”

By the mid-nineteenth century, the predominance of commercial interests in American politics was unmistakable. Berman’s lengthy discussion of the Civil War as the pivot of American history takes for granted the inadequacy of triumphalist views of the Civil War. It was not a “battle cry of freedom.” Slavery was central, but for economic rather than moral reasons. The North represented economic modernity and the ethos of material progress; the economy and ethos of the South, based on slavery, was premodern and static. The West — and with it the shape of America’s economic future — was up for grabs, and the North grabbed it away from an equally determined South. Except for the abolitionists, no whites, North or South, gave a damn about blacks. How the West (like the North and South before it) was grabbed, in an orgy of greed, violence, and deceit against the original inhabitants, is a familiar story.

Even more than in Beard, Berman finds his inspiration in William Appleman Williams. When McKinley’s secretary of state John Hay advocated “an open door through which America’s preponderant economic strength would enter and dominate all underdeveloped areas of the world” and his successor William Jennings Bryan (the celebrated populist and anti-imperialist!) told a gathering of businessmen in 1915 that “my Department is your department; the ambassadors, the ministers, the consuls are all yours; it is their business to look after your interests and to guard your rights,” they were enunciating the soul of American foreign policy, as was the much-lauded Wise Man George Kennan when he wrote in a post-World War II State Department policy planning document: “We have about 50 percent of the world’s wealth, but only 6.3 percent of its population … In this situation, we cannot fail to be the object of envy and resentment. Our real task in the coming period is to devise a pattern of relationships which will permit us to maintain this position of disparity … To do so, we will have to dispense with all sentimentality and day-dreaming; and our attention will have to be concentrated everywhere on our immediate national objectives … We should cease to talk about vague and … unreal objectives such as human rights, the raising of the living standards, and democratization. The day is not far off when we are going to have to deal in straight power concepts. The less we are then hampered by idealistic slogans, the better.”

As a former medievalist, Berman finds contemporary parallels to the fall of Rome compelling. By the end of the empire, he points out, economic inequality was drastic and increasing, the legitimacy and efficacy of the state were waning, popular culture was debased, civic virtue among elites was practically nonexistent, and imperial military commitments were hopelessly unsustainable. As these volumes abundantly illustrate, this is 21st-century America in a nutshell. The capstone of Berman’s demonstration is a sequence of three long, brilliant chapters in Dark Ages America on the Cold War, the Pax Americana, CIA and military interventions in the Third World, and in particular U.S. policy in the Middle East, where racism and rapacity have combined to produce a stunning debacle. Our hysterical national response to 9/11 — our inability even to make an effort to comprehend the long-festering consequences of our imperial predations — portended, as clearly as anything could, the demise of American global supremacy.

What will become of us? After Rome’s fall, wolves wandered through the cities and Europe largely went to sleep for six centuries. That will not happen again; too many transitions — demographic, ecological, technological, cybernetic — have intervened. The planet’s metabolism has altered. The new Dark Ages will be socially, politically, and spiritually dark, but the economic Moloch — mass production and consumption, destructive growth, instrumental rationality — will not disappear. Few Americans want it to. We are hollow, Berman concludes. It is a devastatingly plausible conclusion.

An interval — long or short, only the gods can say — of oligarchic, intensely surveilled, bread-and-circuses authoritarianism, Blade Runner- or Fahrenheit 451-style, seems the most likely outlook for the 21st and 22nd centuries. Still, if most humans are shallow and conformist, some are not. There is reason to hope that the ever fragile but somehow perennial traditions and virtues of solidarity, curiosity, self-reliance, courtesy, voluntary simplicity, and an instinct for beauty will survive, even if underground for long periods. And cultural rebirths do occur, or at any rate have occurred.

Berman offers little comfort, but he does note a possible role for those who perceive the inevitability of our civilization’s decline. He calls it the “monastic option.” Our eclipse may, after all, not be permanent; and meanwhile individuals and small groups may preserve the best of our culture by living against the grain, within the interstices, by “creating ‘zones of intelligence’ in a private, local way, and then deliberately keeping them out of the public eye.” Even if one’s ideals ultimately perish, this may be the best way to live while they are dying.

There is something immensely refreshing, even cathartic, about Berman’s refusal to hold out any hope of avoiding our civilization’s demise. And our reaction goes some way toward proving his point: We are so sick of hucksters, of authors trying — like everyone else on all sides at all times in this pervasively hustling culture — to sell us something, that it is a relief to encounter someone who isn’t, who has no designs on our money or votes or hopes, who simply has looked into the depths, into our catastrophic future, and is compelled to describe it, as Cassandra was. No doubt his efforts will meet with equal success.

Resilient People More Satisfied With Life (Science Daily)

ScienceDaily (May 23, 2012) — When confronted with adverse situations such as the loss of a loved one, some people never fully recover from the pain. Others, the majority, pull through and experience how the intensity of negative emotions (e.g. anxiety, depression) grows dimmer with time until they adapt to the new situation. A third group is made up of individuals whose adversities have made them grow personally and whose life takes on new meaning, making them feel stronger than before.

Researchers at the Basic Psychology Unit of the Universitat Autònoma de Barcelona (UAB) analyzed the responses of 254 students from the Faculty of Psychology to a series of questionnaires. The purpose was to evaluate their level of satisfaction with life and to find connections between their resilience and their capacity for emotional recovery, a component of emotional intelligence that consists of the ability to regulate one's own emotions and those of others.

The data show that the most resilient students, 20% of those surveyed, are more satisfied with their lives and are also those who believe they have control over their emotions and their state of mind. Resilience is therefore a positive predictor of one's level of satisfaction with life.

“Some of the characteristics of resilience, such as self-esteem and the ability to regulate one's emotions, can be worked on and improved. Learning these techniques can give people the resources they need to adapt and improve their quality of life,” explains Dr Joaquín T. Limonero, professor in the UAB Research Group on Stress and Health and coordinator of the research.

Published recently in Behavioral Psychology, the study included the participation of UAB researcher Jordi Fernández Castro; professors of the Gimbernat School of Nursing (a UAB-affiliated centre) Joaquín Tomás-Sábado and Amor Aradilla Herrera; and psychologist and researcher of Egarsat, M. José Gómez-Romero.

Visual Perception System Unconsciously Affects Our Preferences (Science Daily)

ScienceDaily (May 23, 2012) — When grabbing a coffee mug out of a cluttered cabinet or choosing a pen to quickly sign a document, what brain processes guide your choices?

New research from Carnegie Mellon University’s Center for the Neural Basis of Cognition (CNBC) shows that the brain’s visual perception system automatically and unconsciously guides decision-making through valence perception. Published in the journal Frontiers in Psychology, the review hypothesizes that valence, which can be defined as the positive or negative information automatically perceived in the majority of visual information, integrates visual features and associations from experience with similar objects or features. In other words, it is the process that allows our brains to rapidly make choices between similar objects.

The findings offer important insights into consumer behavior in ways that traditional consumer marketing focus groups cannot address. For example, asking individuals to react to package designs, ads or logos is simply ineffective. Instead, companies can use this type of brain science to more effectively assess how unconscious visual valence perception contributes to consumer behavior.

To transfer the research’s scientific application to the online video market, the CMU research team is in the process of founding the start-up company neonlabs through the support of the National Science Foundation (NSF) Innovation Corps (I-Corps).

“This basic research into how visual object recognition interacts with and is influenced by affect paints a much richer picture of how we see objects,” said Michael J. Tarr, the George A. and Helen Dunham Cowan Professor of Cognitive Neuroscience and co-director of the CNBC. “What we now know is that common, household objects carry subtle positive or negative valences and that these valences have an impact on our day-to-day behavior.”

Tarr added that the NSF I-Corps program has been instrumental in helping the neonlabs team take this basic idea and learn how to turn it into a viable company. “The I-Corps program gave us unprecedented access to highly successful, experienced entrepreneurs and venture capitalists who provided incredibly valuable feedback throughout the development process,” he said.

NSF established I-Corps for the sole purpose of assessing the readiness of transitioning new scientific opportunities into valuable products through a public-private partnership. The CMU team of Tarr, Sophie Lebrecht, a CNBC and Tepper School of Business postdoctoral fellow, Babs Carryer, an embedded entrepreneur at CMU’s Project Olympus, and Thomas Kubilius, president of Pittsburgh-based Bright Innovation and adjunct professor of design at CMU, were awarded a $50,000, six-month grant to investigate how understanding valence perception could be used to make better consumer marketing decisions. They are launching neonlabs to apply their model of visual preference to increase click rates on online videos, by identifying the most visually appealing thumbnail from a stream of video. The web-based software product selects a thumbnail based on neuroimaging data on object perception and valence, crowd sourced behavioral data and proprietary computational analyses of large amounts of video streams.

“Everything you see, you automatically dislike or like, prefer or don’t prefer, in part, because of valence perception,” said Lebrecht, lead author of the study and the entrepreneurial lead for the I-Corps grant. “Valence links what we see in the world to how we make decisions.”

Lebrecht continued, “Talking with companies such as YouTube and Hulu, we realized that they are looking for ways to keep users on their sites longer by clicking to watch more videos. Thumbnails are a huge problem for any online video publisher, and our research fits perfectly with this problem. Our approach streamlines the process and chooses the screenshot that is the most visually appealing based on science, which will in the end result in more user clicks.”

Wearing Two Different Hats: Moral Decisions May Depend On the Situation (Science Daily)

ScienceDaily (May 23, 2012) — An individual’s sense of right or wrong may change depending on their activities at the time — and they may not be aware of their own shifting moral integrity — according to a new study looking at why people make ethical or unethical decisions.

Focusing on dual-occupation professionals, the researchers found that engineers had one perspective on ethical issues, yet when those same individuals were in management roles, their moral compass shifted. Likewise, medic/soldiers in the U.S. Army had different views of civilian casualties depending on whether they most recently had been acting as soldiers or medics.

In the study, to be published in a future issue of The Academy of Management Journal, lead author Keith Leavitt of Oregon State University found that workers who tend to have dual roles in their jobs would change their moral judgments based on what they thought was expected of them at the moment.

“When people switch hats, they often switch moral compasses,” Leavitt said. “People like to think they are inherently moral creatures — you either have character or you don’t. But our studies show that the same person may make a completely different decision based on what hat they may be wearing at the time, often without even realizing it.”

Leavitt, an assistant professor of management in the College of Business at OSU, is an expert on non-conscious decision making and business ethics. He studies how people make decisions and moral judgments, often based on non-conscious cues.

He said recent high-profile business scandals, from the collapse of Enron to the Ponzi scheme of Bernie Madoff, have called into question the ethics of professionals. Leavitt said professional organizations, employers and academic institutions may want to train and prepare their members for practical moral tensions they may face when asked to serve in multiple roles.

“What we consider to be moral sometimes depends on what constituency we are answering to at that moment,” Leavitt said. “For a physician, a human life is priceless. But if that same physician is a managed-care administrator, some degree of moral flexibility becomes necessary to meet their obligations to stockholders.”

Leavitt said subtle cues — such as signage and motivation materials around the office — should be considered, along with more direct training that helps employees who juggle multiple roles that could conflict with one another.

“Organizations and businesses need to recognize that even very subtle images and icons can give employees non-conscious clues as to what the firm values,” he said. “Whether they know it or not, people are often taking in messages about what their role is and what is expected of them, and this may conflict with what they know to be the moral or correct decision.”

The researchers conducted three different studies with employees who had dual roles. In one case, 128 U.S. Army medics were asked to complete a series of problem-solving tests, which included subliminal cues that hinted they might be acting as either a medic or a soldier. No participant said the cues had any bearing on their behavior — but apparently they did. A much larger percentage of those in the medic category than in the soldier category were unwilling to put a price on human life.

In another test, a group of engineer-managers were asked to write about a time they either behaved as a typical manager, engineer, or both. Then they were asked whether U.S. firms should engage in “gifting” to gain a foothold in a new market. Despite the fact such a practice would violate federal laws, more than 50 percent of those who fell into the “manager” category said such a practice might be acceptable, compared to 13 percent of those in the engineer category.

“We find that people tend to make decisions that may conflict with their morals when they are overwhelmed, or when they are just doing routine tasks without thinking of the consequences,” Leavitt said. “We tend to play out a script as if our role has already been written. So the bottom line is, slow down and think about the consequences when making an ethical decision.”

What-If and What-Is: The Role of Speculation in Science (N.Y.Times)

SIDE EFFECTS

MPI/Getty Images

An illustration from about 1850 of a dog with a small travois in an Assiniboine encampment.

By JAMES GORMAN – Published: May 24, 2012

Woody Allen once said that when you do comedy, you sit at the children’s table. The same might be said of speculation in science.

And yet speculation is an essential part of science. So how does it fit in? Two recent publications, both about the misty depths of canine and human history, suggest some answers. In one, an international team of scientists concludes that we really don’t know when and where dogs were domesticated. Greger Larson of the University of Durham, in England, the first of 20 authors of that report, said of dog DNA, “it’s a mess.”

In the other, Pat Shipman, an independent scientist and writer, suggests that dogs may have helped modern humans push the Neanderthals out of existence and might even have helped shape human evolution.

Is one right and the other wrong? Are both efforts science — one a data-heavy reality check and the other freewheeling speculation? The research reported by Dr. Larson and his colleagues in The Proceedings of the National Academy of Sciences is solid science, easily judged by peers, at any rate. The essay by Dr. Shipman is not meant to come to any conclusion but to prompt thought and more research. It, too, will be judged by other scientists, and read by many nonscientists.

But how is one to judge the value of speculation? There are a few obvious ways. The questions readers ought to ask when confronting a “what-if” as opposed to a “what-is” article are: Does the writer make it clear what is known, what is probable, and what is merely possible?

Dr. Shipman was careful to make these distinctions in her essay in American Scientist, and in an interview, when I asked her to walk me through her argument.

First, she said, we know that modern humans and Neanderthals occupied Europe at the same time, from about 45,000 to 25,000 years ago, and that the fortunes of the modern humans rose as those of the Neanderthals fell. Somehow the modern humans outcompeted the Neanderthals. And here we are now, with our computers, our research, and our beloved dogs, which, scientists agree, evolved from wolves.

Second, and this point is crucial, Dr. Shipman thinks dogs were very probably around during this time period, although she recognizes that others disagree. She tells us about the research that convinced her, so we can check it ourselves, if we like: a 2009 report of three skulls, the oldest dating to 32,000 years ago, by Mietje Germonpré of The Royal Belgian Institute of Natural Sciences in The Journal of Archaeological Science.

The skulls are clearly of members of the canid family, but that includes wolves, jackals and foxes. Dr. Germonpré and her colleagues concluded that the skulls belonged to dogs. That’s where things get sticky.

The rest of Dr. Shipman’s essay is clear enough. If the humans had dogs, the dogs must have been helping somehow, in hunting or pulling travois. And they may have been so helpful that they gave modern humans an edge over the Neanderthals (unless the Neanderthals had dogs, too). If they helped in hunting, they might have watched human eyes for clues about what was going on, as they do now. Other researchers have suggested that the white of the human eye evolved to foster cooperation because we could more easily see where others were looking than with plain brown eyes.

If dogs were watching us too, that would have added survival value to having a partly white eye and thus played a role in our evolution. Fair enough, but the dogs had to be there at that time when humans and Neanderthals overlapped. I asked Dr. Larson about Dr. Shipman’s essay, and I confess I expected he might object to its speculative nature. Not so. “I love speculation,” he wrote back. “I do it all the time.” And, he said of Dr. Shipman’s essay, “it’s a lovely chain of reasoning.”

But, he said, “it begins from the premise that the late Pleistocene canid remains are dogs. And they are not.”

He wrote, “there is not a single piece of (credible) evidence to suggest that the domestication process was under way 30,000 years ago.” He cited an article in press in The Journal of Archaeological Science that is highly critical of the Germonpré paper. The article, written by Susan J. Crockford at the University of Victoria and Yaroslav V. Kuzmin at the Siberian branch of the Russian Academy of Sciences, suggests that the skulls in question came from short-faced wolves and do not indicate that the domestication process had begun. Dr. Crockford, who had read Dr. Shipman’s paper, thought it “too speculative for science.” But she did not view the case of early domestication as completely closed.

She said in an e-mail: “We simply need more work on these ancient wolves before we can determine if these canids are incipient dogs (in the process of becoming dogs, although not there yet) or if they simply reflect the normal variation in ancient wolves. At present, I am leaning strongly towards the latter (normal variation in wolves).”

Perhaps the way to judge the scientific value of speculation would be to see if it prompts more research, more collecting of fossils, more study. Until then, only proximate answers will exist to the question of where dogs came from.

Mine came from a shelter. How about yours?

Soldiers Who Desecrate the Dead See Themselves as Hunters (Science Daily)

ScienceDaily (May 20, 2012) — Modern day soldiers who mutilate enemy corpses or take body-parts as trophies are usually thought to be suffering from the extreme stresses of battle. But, research funded by the Economic and Social Research Council (ESRC) shows that this sort of misconduct has most often been carried out by fighters who viewed the enemy as racially different from themselves and used images of the hunt to describe their actions.

“The roots of this behaviour lie not in individual psychological disorders,” says Professor Simon Harrison who carried out the study, “but in a social history of racism and in military traditions that use hunting metaphors for war. Although this misconduct is very rare, it has persisted in predictable patterns since the European Enlightenment. This was the period when the first ideologies of race began to appear, classifying some human populations as closer to animals than others.”

European and North American soldiers who have mutilated enemy corpses appear to have drawn racial distinctions of this sort between close and distant enemies. They ‘fought’ their close enemies, and bodies remained untouched after death, but they ‘hunted’ their distant enemies and such bodies became the trophies that demonstrate masculine skill.

Almost always, only enemies viewed as belonging to other ‘races’ have been treated in this way. “This is a specifically racialised form of violence,” suggests Professor Harrison, “and could be considered a type of racially-motivated hate crime specific to military personnel in wartime.”

People tend to associate head-hunting and other trophy-taking with ‘primitive’ warfare. They consider wars fought by professional militaries as rational and humane. However, such contrasts are misleading. The study shows that the symbolic associations between hunting and war that can give rise to abnormal behaviour such as trophy-taking in modern military organisations are remarkably close to those in certain indigenous societies where practices such as head-hunting were a recognised part of the culture.

In both cases, mutilation of the enemy dead occurs when enemies are represented as animals or prey. Parts of the corpse are removed like trophies at ‘the kill’. Metaphors of ‘war-as-hunting’ that lie at the root of such behaviour are still strong in some armed forces in Europe and North America — not only in military training but in the media and in soldiers’ own self-perception.

Professor Harrison gives the example of the Second World War and shows that trophy-taking was rare on the European battlefields but was relatively common in the war in the Pacific, where some Allied soldiers kept skulls of Japanese combatants as mementos or made gifts of their remains to friends back home.

The study also gives a more recent comparison: there have been incidents in Afghanistan in which NATO personnel have desecrated the dead bodies of Taliban combatants but there is no evidence of such misconduct occurring in the conflicts of the former Yugoslavia where NATO forces were much less likely to have considered their opponents racially ‘distant’.

But, it would be wrong to suggest that such behaviour amounts to a tradition. These practices are usually not explicitly taught. Indeed, they seem to be quickly forgotten after the end of wars and veterans often remain unaware of the extent to which they occurred.

Furthermore, attitudes towards the trophies themselves change as the enemy ceases to be the enemy. The study shows how human remains kept by Allied soldiers after the Pacific War became unwanted memory objects over time, which ex-servicemen or their families often donated to museums. In some cases, veterans have made great efforts to seek out the families of Japanese soldiers in order to return their remains and to disconnect themselves from a disturbing past.

Professor Harrison concludes that human trophy-taking is evidence of the power of metaphor in structuring and motivating human behaviour. “It will probably occur, in some form or other, whenever war, hunting and masculinity are conceptually linked,” he says. “Prohibition is clearly not enough to prevent it. We need to recognise the dangers of portraying war in terms of hunting imagery.”

Increased Knowledge About Global Warming Leads To Apathy, Study Shows (Science Daily)

ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.

“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”

The study showed high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.

The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as in the movies An Inconvenient Truth and Ice Age: The Meltdown and in the mainstream media’s escalating emphasis on the trend.

The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.

Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.

“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.

But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.

“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”

Measuring knowledge about global warming is a tricky business, Kellstedt adds.

“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.

“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”

Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.

What Lula’s cancer photos say (BBC Mundo)

Gerardo Lissardy

BBC Mundo, Rio de Janeiro
Friday, 25 November 2011

Lula being shaved by his wife Leticia

It cannot be easy for any politician to publicly display a personal battle against cancer, but the way former Brazilian president Luiz Inácio Lula da Silva has done so carries specific meanings, according to his associates and to experts.

Brazilians learned of the laryngeal cancer affecting Lula on 29 October, just hours after the former president himself was diagnosed with the disease.

Since then, the communications team of the institute Lula heads has regularly sent the press updates on his chemotherapy treatment and even on intimate moments of his life.

There were, for example, photos of Lula with doctors when he began treatment at a São Paulo hospital, photos of him in a hospital bed holding hands with his successor, President Dilma Rousseff, and even photos of his wife Marisa Letícia shaving off his hair and beard.

All of these images have been offered to the media, free for reproduction, by the Instituto Lula.

Some, especially those of the moment he lost his distinctive beard, traveled the world and were published on the front pages of several local and Latin American newspapers.

Some experts believe all this follows a deliberate strategy with political calculations.

José Chrispiniano, press adviser at the Instituto Lula, accepts that the way the former president’s illness has been communicated has certain objectives, but denies that it is about selling anything in particular.

“It is in no way marketing,” he told BBC Mundo.

“A highly symbolic matter”

Lula without his beard. The former president’s office has released dozens of photos documenting Lula’s illness.

Chrispiniano explained that it was Lula himself who decided to speak openly about his cancer and treatment, from the moment he learned of the diagnosis.

“Although he holds no public office now, he is a person of public interest, so the objective is to communicate clearly: it is a treatable disease and a treatment with quite positive prospects of a cure,” he said.

He added that the aim has also been to avoid dramatizing the illness (indeed, in many of the released photos Lula appears smiling) and to avoid any impression “that things are being hidden.”

The release of the photos of Lula being shaved and showing his new look with a mustache was also the former president’s initiative, Chrispiniano said.

“It was a highly symbolic matter for his image, and we wanted to show that he went through that moment calmly, because for many people who have this disease it is a moment of great stigma,” he said.

Two days after Lula’s haircut, on Friday the 18th his institute released photos of the former president receiving a visit from the coach of Brazil’s national football team, Mano Menezes.

“Strength, eternal ‘President Lula.’ We are counting on you for 2014,” Menezes wrote on the national team’s number 10 jersey that he gave Lula, which also appeared in the photos.

It was a reference to the football World Cup that Brazil will host that year, the statement noted.

“A strategy”

Lula with the team at the São Paulo hospital treating him. For many, Lula’s battle with cancer could increase his already high popularity.

Rousiley Maia, a researcher at the Universidade Federal de Minas Gerais and an expert in communication and politics, believes that the decision to report on Lula’s cancer in this way “was deliberately a strategy.”

“Instead of casting shadows (or) treating (the illness) with half words, the strategy is to appeal to the human, ordinary and mortal side of the figure,” Maia told BBC Mundo.

She maintained, however, that this decision is consistent with “the construction of Lula’s public image over many years” as a man of the people who became a national leader recognized worldwide.

“Beyond empathy, it is a way of sustaining the charisma and respect he built over these years,” she said. “This moment of personal illness is a way of returning to the center of the public stage.”

Renzo Taddei, an anthropologist and professor of communication, citizenship and politics at the Universidade Federal do Rio de Janeiro (UFRJ), said the public handling of Lula’s cancer points to likely future political ambitions.

“Cancer is already a classic theme of overcoming and heroism in Brazil,” he told BBC Mundo.

“It was all Lula had left: beating cancer. If he does, there is nothing left that he cannot do (even though he carried out neither the agrarian reform Brazil has awaited for so long nor fiscal and political reforms),” he added.

Cancer and elections

President Rousseff visits Lula after his operation. President Dilma Rousseff is also a cancer survivor.

Until his cancer was diagnosed, many Brazilians wondered whether Lula would seek a return to the presidency in the 2014 elections, but he said it was up to Rousseff to seek re-election.

When Rousseff was successfully treated for lymphatic cancer in 2009, some members of Lula’s government went so far as to speculate that she could emerge strengthened in a bid for the presidency the following year.

Lula, however, publicly dismissed any link between the two.

“I cannot imagine how anyone comes out stronger for having had cancer,” he said at the time. “I only wish for Dilma’s recovery.”

Rousseff recovered and was elected president the following year, with Lula’s backing.

Arjun Appadurai: A Nation of Business Junkies (Anthropology News)

Guest Columnist
Arjun Appadurai

By Anthropology News on November 3, 2011

I first came to this country in 1967. I have been either a crypto-anthropologist or professional anthropologist for most of that time. Still, because I came here with an interest in India and took the path of least resistance in choosing to maintain India as my principal ethnographic referent, I have always been reluctant to offer opinions about life in these United States. I have begun to do so recently, but mainly in occasional blogs, twitter posts and the like. Now seems to be a good time to ponder whether I have anything to offer to public debate about the media in this country. Since I have been teaching for a few years in a distinguished department of media studies, I feel emboldened to offer my thoughts in this new AN Forum.

My examination of changes in the media over the last few decades is not based on a scientific study. I read the New York Times every day, the Wall Street Journal occasionally, and I subscribe to The Atlantic, Harper’s, The New York Review of Books, the Economist, and a variety of academic journals in anthropology and area studies. I get a smattering of other useful media pieces from friends on Facebook and other social media sites. I also use the Internet to keep up with as much as I can from the press in and about India. At various times in the past, I have subscribed to The Nation, Money Magazine, Foreign Policy, the Times Literary supplement and a few other periodicals.

I have long been interested in how culture and economy interact. Today, I want to make an observation about the single biggest change I have seen over my four decades in the United States, which is a growing and now hegemonic domination of the news and of a great deal of opinion, both in print and on television, by business news. Business news was a specialized affair in the late 1960s, confined to a few magazines such as Money and Fortune, and to newspapers and TV reporters (not channels). Now it is hard to find anything but business as the topic of news in all media. Consider television: if you spend even three hours surfing between CNN and BBC on any given day (surfing for news about Libya or about soccer, for example), you will find yourself regularly assaulted by business news, not just from London, New York and Washington, but from Singapore, Hong Kong, Mumbai and many other places. Look at the serious talk shows and chances are that you will find a talking CEO, describing what’s good about his company, what’s bad about the government and how to read his company’s stock prices. Channels like MSNBC are a form of endless, mind-numbing Jerry Lewis telethon about the economy, with more than a hint of the desperation of the Depression-era movie “They Shoot Horses, Don’t They?”, as they bid the viewer to make insane bets and to mourn the fallen heroes of failed companies and fired CEOs.

Turn to the newspapers and things get worse. Any reader of the New York Times will find it hard to get away from the business machine. Start with the lead section, and stories about Obama’s economic plans, mad Republican proposals about taxes, the Euro-crisis and the latest bank scandal will assault you. Some relief is provided by more corporate news: the exit of Steve Jobs, the Op-Ed piece about the responsibilities of the super-rich by Warren Buffett, Donald Trump advertising his new line of housewares to go along with his ugly homes and buildings. Turn to the sports section: it is littered with talk of franchises, salaries, trades, owner antics, stadium projects and more. I need hardly say anything about the “Business” section itself, which has now become virtually redundant. And if you are still thirsty for more business news, check out the “Home,” “Lifestyle” and “Real Estate” sections for news on houses you can’t afford and mortgage financing gimmicks you have never heard of. Some measure of relief is to be found in the occasional “Science Times” and in the NYT Book Review, which do have some pieces that are not primarily about profit, corporate politics or the recession.

The New York Times is not to blame for this. It is the newspaper of “record,” which means that it reflects broader trends and cannot be faulted for following them. Go through the magazines when you take a flight to Detroit or Mumbai and there is again a feast of news geared to the “business traveler.” This is when I catch up on how to negotiate the best deal, why this is the time to buy gold, and what software and hardware to use when I make my next presentation to General Electric. These examples could be multiplied in any number of bookstores, newspaper kiosks, airport lounges, park benches and dentists’ offices.

What does all this reflect? Well, we were always told that the business of America is business. But now we are gradually moving into a society in which the business of American life is also business. Who are we now? We have become (in our fantasies) entrepreneurs, start-up heroes, small investors, consumers, home-owners, day-traders, and a gallery of supporting business types, and no longer fathers, mothers, friends or neighbors. Our very citizenship is now defined by business, whether we are winners or losers. Everyone is an expert on pensions, stocks, retirement packages, vacation deals, credit-card scams and more. Meanwhile, as Paul Krugman has argued in a brilliant recent speech to some of his fellow economists, this discipline, especially macro-economics, has lost all its capacities to analyze, define or repair the huge mess we are in.

The gradual transformation of the imagined reader or viewer into a business junkie is a relatively new disease of advanced capitalism in the United States. The avalanche of business knowledge and information dropping on the American middle-classes ought to have helped us predict – or avoid – the recent economic meltdown, based on crazy credit devices, vulgar scams and lousy regulation. Instead it has made us business junkies, ready to be led like sheep to our own slaughter by Wall Street, the big banks and corrupt politicians. The growing hegemony of business news and knowledge in the popular media over the last few decades has produced a collective silence of the lambs. It is time for a bleat or two.

Dr. Arjun Appadurai is a prominent contemporary social-cultural anthropologist, having formerly served as Provost and Senior Vice President for Academic Affairs at The New School in NYC. He has held various professorial chairs and visiting appointments at some of the top institutions in the United States and Europe. In addition, he has served on several scholarly and advisory bodies in the United States, Latin America, Europe and India. Dr. Appadurai is a prolific writer, having authored numerous books and scholarly articles. The nature and significance of his contributions throughout his academic career have earned him a reputation as a leading figure in his field. He is the author of The Future as a Cultural Fact: Essays on the Global Condition (Verso: forthcoming 2012).

Ken Routon is the contributing editor of Media Notes. He is a visiting professor of cultural anthropology at the University of New Orleans and the author of Hidden Powers of the State in the Cuban Imagination (University Press of Florida, 2010).

Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

07/11/2011

By Fábio de Castro

Agência FAPESP – If a few years ago the lack of data limited scientific progress, today the problem has been inverted. The development of new data-capture technologies, across the most varied fields and scales, has generated such an immense volume of information that the excess has become a bottleneck for scientific advancement.

In this context, computer scientists have joined specialists from different fields to develop new concepts and theories capable of handling the flood of data in contemporary science. The result is called eScience.

That is the subject of the book O Quarto Paradigma – Descobertas científicas na era da eScience (The Fourth Paradigm: scientific discoveries in the age of eScience), launched on 3 November by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle – all of Microsoft Research – the publication was launched at FAPESP headquarters, at an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

During the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the Universidade de São Paulo (USP), gave the talk “eScience in Brazil.” “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the subject of the talk by Daniel Fay, director of Earth, Energy and Environment at MSR.

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is closely connected to this idea, because many of our projects and programs face this need for greater capacity to manage large data sets. Our great challenge lies in the science behind this capacity to handle large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a great need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. On the other hand, scientists do not usually perceive the computer as a great new instrument that is revolutionizing science. FAPESP is interested in actions to make the scientific community aware that there are great challenges in the field of eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment,” “Health and wellbeing,” “Scientific infrastructure” and “Scholarly communication.”

“The book discusses the emergence of a new paradigm for scientific discovery. Thousands of years ago, the prevailing paradigm was experimental science, grounded in the description of natural phenomena. A few hundred years ago came the paradigm of theoretical science, symbolized by Newton’s laws. A few decades ago, computational science emerged, simulating complex phenomena. Now we have arrived at the fourth paradigm: data-driven science,” said Fay.

With the advent of the new paradigm, he said, the nature of scientific discovery has changed completely. Complex models have come into play, with broad spatial and temporal scales, demanding ever more multidisciplinary interaction.

“The data, in incredible quantities, come from different sources and also require a multidisciplinary approach and, often, real-time processing. Scientific communities are also more distributed. All of this has transformed the way discoveries are made,” said Fay.

Ecology, one of the fields most affected by large volumes of data, is an example of how scientific progress will increasingly depend on collaboration between academic researchers and computing specialists.

“We are living in a storm of remote sensing, cheap ground sensors and data access on the internet. But extracting the variables that science requires from this mass of heterogeneous data remains a problem. It takes specialized knowledge of algorithms, file formats and data cleaning, for example, which is not always accessible to people in the field of ecology,” he explained.
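The kind of file parsing, unit normalization and data cleaning described here can be illustrated with a minimal Python sketch. Everything in it (the CSV layout, the station and unit field names) is an invented example, not something taken from the article or from any real sensor network:

```python
import csv
import io

# Hypothetical raw readings from two sensor networks that report
# temperature in different units and with occasional gaps.
RAW = """station,unit,value
A1,C,21.5
A2,F,70.7
A3,C,
A4,F,68.0
"""

def clean_readings(text):
    """Return temperatures in degrees Celsius, dropping unusable rows."""
    out = {}
    for row in csv.DictReader(io.StringIO(text)):
        raw = row["value"].strip()
        if not raw:             # missing reading: skip rather than guess
            continue
        value = float(raw)
        if row["unit"] == "F":  # normalize Fahrenheit to Celsius
            value = (value - 32) * 5 / 9
        out[row["station"]] = round(value, 2)
    return out

print(clean_readings(RAW))  # {'A1': 21.5, 'A2': 21.5, 'A4': 20.0}
```

Even this toy version shows why such work needs specialist attention: decisions about missing values and unit conventions are encoded in the cleaning step, and they silently shape every analysis downstream.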

The same happens in fields such as medicine and biology – which benefit from new technologies for, say, recording brain activity or sequencing DNA – and in astronomy and physics, as modern telescopes capture terabytes of information daily and the Large Hadron Collider (LHC) generates petabytes of data each year.

Virtual Institute

According to Cesar Jr., the eScience community in Brazil is growing. The country has 2,167 courses in information systems or computer science and engineering. In 2009, 45,000 students graduated in these fields, and between 2007 and 2009 graduate education comprised 32 programs, a thousand advisers, 2,705 master’s students and 410 doctoral students.

“Science has shifted from the paradigm of data acquisition to that of data analysis. We have different technologies producing terabytes in various fields of knowledge, and today we can say that these fields focus on analyzing a deluge of data,” said the member of FAPESP’s Computer Science and Engineering Area Coordination.

In 2006, the Brazilian Computer Society (SBC) organized a meeting to identify the key problems and main challenges for the field. This led to various proposals for the National Council for Scientific and Technological Development (CNPq) to create a specific program for this type of problem.

“In 2009 we held a series of workshops at FAPESP, bringing together scientists from fields such as agriculture, climate change, medicine, transcriptomics, games, electronic government and social networks to discuss this issue. The initiative resulted in excellent collaborations between groups of scientists with similar problems and gave rise to several initiatives,” said Cesar Jr.

The calls for proposals of the Microsoft Research-FAPESP Institute for IT Research, he said, have been an important part of the set of initiatives to promote eScience, as has the organization of the São Paulo School of Advanced Science in Computational Image Processing and Visualization. FAPESP has also supported several research projects on the topic.

“The eScience community in São Paulo has been working with professionals from many fields and publishing in journals across them. That is an indication of the quality the community has acquired to face the great challenge of the coming years,” said Cesar Jr., who wrote the preface to the Brazilian edition of the book.

  • O Quarto Paradigma
    Editors: Tony Hey, Stewart Tansley and Kristin Tolle
    Published: 2011
    Price: R$ 60
    Pages: 263
    More information: www.ofitexto.com.br

People Rationalize Situations They’re Stuck With, but Rebel When They Think There’s an out (Science Daily)

ScienceDaily (Nov. 1, 2011) — People who feel like they’re stuck with a rule or restriction are more likely to be content with it than people who think that the rule isn’t definite. The authors of a new study, which will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science, say this conclusion may help explain everything from unrequited love to the uprisings of the Arab Spring.

Psychological studies have found two contradictory results about how people respond to rules. Some research has found that, when there are new restrictions, you rationalize them; your brain comes up with a way to believe the restriction is a good idea. But other research has found that people react negatively against new restrictions, wanting the restricted thing more than ever.

Kristin Laurin of the University of Waterloo thought the difference might be absoluteness — how much the restriction is set in stone. “If it’s a restriction that I can’t really do anything about, then there’s really no point in hitting my head against the wall and trying to fight against it,” she says. “I’m better off if I just give up. But if there’s a chance I can beat it, then it makes sense for my brain to make me want the restricted thing even more, to motivate me to fight.” Laurin wrote the new paper with Aaron Kay and Gavan Fitzsimons of Duke University.

In an experiment in the new study, participants read that lowering speed limits in cities would make people safer. Some read that government leaders had decided to reduce speed limits. Of those people, some were told that this legislation would definitely come into effect, and others read that it would probably happen, but that there was still a small chance government officials could vote it down.

People who thought the speed limit was definitely being lowered supported the change more than control subjects, but people who thought there was still a chance it wouldn’t happen supported it less than these control subjects. Laurin says this confirms what she suspected about absoluteness; if a restriction is definite, people find a way to live with it.

This could help explain how uprisings spread across the Arab world earlier this year. When people were living under dictatorships with power that appeared to be absolute, Laurin says, they may have been comfortable with it. But once Tunisia’s president fled, citizens of neighboring countries realized that their governments weren’t as absolute as they seemed — and they could have dropped whatever rationalizations they were using to make it possible to live under an authoritarian regime. Even more, the now non-absolute restriction their governments represented could have exacerbated their reaction, fueling their anger and motivating them to take action.

And how does this relate to unrequited love? It confirms people’s intuitive sense that leading someone on can just make them fall for you more deeply, Laurin says. “If this person is telling me no, but I perceive that as not totally absolute, if I still think I have a shot, that’s just going to strengthen my desire and my feeling, that’s going to make me think I need to fight to win the person over,” she says. “If instead I believe no, I definitely don’t have a shot with this person, then I might rationalize it and decide that I don’t like them that much anyway.”

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT

File photograph of newborn babies in Lucknow, India, in July 2009

As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30C, the humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women patiently sit, each accompanied by an unwieldy-looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising – they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo – barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

Thomas Malthus

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matt Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.

The broad alliance to stem birth rates was beginning to dissolve, and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of a mother and baby, captioned: “Practicing birth control is beneficial for the protection of the health of mother and child”. China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue ‘Don’t worry, everything’s fine now we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service


The world’s population will reach 7 billion at the end of October. Don’t panic (The Economist)

Demography

A tale of three islands

Oct 22nd 2011 | from the print edition

 

IN 1950 the whole population of the earth—2.5 billion—could have squeezed, shoulder to shoulder, onto the Isle of Wight, a 381-square-kilometre rock off southern England. By 1968 John Brunner, a British novelist, observed that the earth’s people—by then 3.5 billion—would have required the Isle of Man, 572 square kilometres in the Irish Sea, for its standing room. Brunner forecast that by 2010 the world’s population would have reached 7 billion, and would need a bigger island. Hence the title of his 1968 novel about over-population, “Stand on Zanzibar” (1,554 square kilometres off east Africa).

Brunner’s prediction was only a year out. The United Nations’ population division now says the world will reach 7 billion on October 31st 2011 (America’s Census Bureau delays the date until March 2012). The UN will even identify someone born that day as the world’s 7 billionth living person. The 6 billionth, Adnan Nevic, was born on October 12th 1999 in Sarajevo, in Bosnia. He will be just past his 12th birthday when the next billion clicks over.

That makes the world’s population look as if it is rising as fast as ever. It took 250,000 years to reach 1 billion, around 1800; over a century more to reach 2 billion (in 1927); and 32 years more to reach 3 billion. But to rise from 5 billion (in 1987) to 6 billion took only 12 years; and now, another 12 years later, it is at 7 billion (see chart 1). By 2050, the UN thinks, there will be 9.3 billion people, requiring an island the size of Tenerife or Maui to stand on.

Odd though it seems, however, the growth in the world’s population is actually slowing. The peak of population growth was in the late 1960s, when the total was rising by almost 2% a year. Now the rate is half that. The last time it was so low was in 1950, when the death rate was much higher. The result is that the next billion people, according to the UN, will take 14 years to arrive, the first time that a billion milestone has taken longer to reach than the one before. The billion after that will take 18 years.
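The deceleration described above is easy to check from the milestone years the article quotes. A minimal sketch (using only the dates given here; the 4-billion milestone is skipped because the article does not supply it, and the variable names are my own):

```python
# Years in which world population reached each billion, as quoted in the
# article. The 4 billion milestone is omitted because it is not given there.
milestones = [(1, 1800), (2, 1927), (3, 1959), (5, 1987), (6, 1999), (7, 2011)]

# Years taken per additional billion: large at first, then shrinking,
# with the last two billions arriving in just 12 years each.
for (b0, y0), (b1, y1) in zip(milestones, milestones[1:]):
    span = y1 - y0
    print(f"{b0} -> {b1} billion: {span} years "
          f"({span / (b1 - b0):.0f} years per billion)")
```

Run against the UN's projection that the eighth billion will take 14 years, this confirms the article's point: it would be the first billion to arrive more slowly than the one before.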

Once upon a time, the passing of population milestones might have been cause for celebration. Now it gives rise to jeremiads. As Hillary Clinton’s science adviser, Nina Fedoroff, told the BBC in 2009, “There are probably already too many people on the planet.” But the notion of “too many” is more flexible than it seems. The earth could certainly not support 10 billion hunter-gatherers, who used much more land per head than modern farm-fed people do. But it does not have to. The earth might well not be able to support 10 billion people if they had exactly the same impact per person as 7 billion do today. But that does not necessarily spell Malthusian doom, because the impact humans have on the earth and on each other can change.

For most people, the big questions about population are: can the world feed 9 billion mouths by 2050? Are so many people ruining the environment? And will those billions, living cheek-by-jowl, go to war more often? On all three counts, surprising as it seems, reducing population growth any more quickly than it is falling anyway may not make much difference.

Start with the link between population and violence. It seems plausible that the more young men there are, the more likely they will be to fight. This is especially true when groups are competing for scarce resources. Some argue that the genocidal conflict in Darfur, western Sudan, was caused partly by high population growth, which led to unsustainable farming and conflicts over land and water. Land pressure also influenced the Rwandan genocide of 1994, as migrants in search of a livelihood in one of the world’s most densely populated countries moved into already settled areas, with catastrophic results.

But there is a difference between local conflicts and what is happening on a global scale. Although the number of sovereign states has increased almost as dramatically as the world’s population over the past half-century, the number of wars between states fell fairly continuously during the period. The number of civil wars rose, then fell. The number of deaths in battle fell by roughly three-quarters. These patterns do not seem to be influenced either by the relentless upward pressure of population, or by the slackening of that pressure as growth decelerates. The difference seems to have been caused by fewer post-colonial wars, the ending of cold-war alliances (and proxy wars) and, possibly, the increase in international peacekeepers.

More people, more damage?

Human activity has caused profound changes to the climate, biodiversity, oceanic acidity and greenhouse-gas levels in the atmosphere. But it does not automatically follow that the more people there are, the worse the damage. In 2007 Americans and Australians emitted almost 20 tonnes of carbon dioxide each. In contrast, more than 60 countries—including the vast majority of African ones—emitted less than 1 tonne per person.

This implies that population growth in poorer countries (where it is concentrated) has had a smaller impact on the climate in recent years than the rise in the population of the United States (up by over 50% in 1970-2010). Most of the world’s population growth in the next 20 years will occur in countries that make the smallest contribution to greenhouse gases. Global pollution will be more affected by the pattern of economic growth—and especially whether emerging nations become as energy-intensive as America, Australia and China.

Population growth does make a bigger difference to food. All things being equal, it is harder to feed 7 billion people than 6 billion. According to the World Bank, between 2005 and 2055 agricultural productivity will have to increase by two-thirds to keep pace with rising population and changing diets. Moreover, according to the bank, if the population stayed at 2005 levels, farm productivity would have to rise by only a quarter, so more future demand comes from a growing population than from consumption per person.

Increasing farm productivity by a quarter would obviously be easier than boosting it by two-thirds. But even a rise of two-thirds is not as much as it sounds. From 1970-2010 farm productivity rose far more than this, by over three-and-a-half times. The big problem for agriculture is not the number of people, but signs that farm productivity may be levelling out. The growth in agricultural yields seems to be slowing down. There is little new farmland available. Water shortages are chronic and fertilisers are over-used. All these—plus the yield-reductions that may come from climate change, and wastefulness in getting food to markets—mean that the big problems are to do with supply, not demand.

None of this means that population does not matter. But the main impact comes from relative changes—the growth of one part of the population compared with another, for example, or shifts in the average age of the population—rather than the absolute number of people. Of these relative changes, falling fertility is most important. The fertility rate is the number of children a woman can expect to have. At the moment, almost half the world’s population—3.2 billion—lives in countries with a fertility rate of 2.1 or less. That number, the so-called replacement rate, is usually taken to be the level at which the population eventually stops growing.

The world’s decline in fertility has been staggering (see chart 2). In 1970 the total fertility rate was 4.45 and the typical family in the world had four or five children. It is now 2.45 worldwide, and lower in some surprising places. Bangladesh’s rate is 2.16, having halved in 20 years. Iran’s fertility fell from 7 in 1984 to just 1.9 in 2006. Countries with below-replacement fertility include supposedly teeming Brazil, Tunisia and Thailand. Much of Europe and East Asia have fertility rates far below replacement levels.

The fertility fall is releasing wave upon wave of demographic change. It is the main influence behind the decline of population growth and, perhaps even more important, is shifting the balance of age groups within a population.

When gold turns to silver

A fall in fertility sends a sort of generational bulge surging through a society. The generation in question is the one before the fertility fall really begins to bite, which in Europe and America was the baby-boom generation that is just retiring, and in China and East Asia the generation now reaching adulthood. To begin with, the favoured generation is in its childhood; countries have lots of children and fewer surviving grandparents (who were born at a time when life expectancy was lower). That was the situation in Europe in the 1950s and in East Asia in the 1970s.

But as the select generation enters the labour force, a country starts to benefit from a so-called “demographic dividend”. This happens when there are relatively few children (because of the fall in fertility), relatively few older people (because of higher mortality previously), and lots of economically active adults, including, often, many women, who enter the labour force in large numbers for the first time. It is a period of smaller families, rising income, rising life expectancy and big social change, including divorce, postponed marriage and single-person households. This was the situation in Europe between 1945 and 1975 (“les trente glorieuses”) and in much of East Asia in 1980-2010.

But there is a third stage. At some point, the gilded generation turns silver and retires. Now the dividend becomes a liability. There are disproportionately more old people depending upon a smaller generation behind them. Population growth stops or goes into reverse, parts of a country are abandoned by the young and the social concerns of the aged grow in significance. This situation already exists in Japan. It is arriving fast in Europe and America, and soon after that will reach East Asia.

A demographic dividend tends to boost economic growth because a large number of working-age adults increases the labour force, keeps wages relatively low, boosts savings and increases demand for goods and services. Part of China’s phenomenal growth has come from its unprecedentedly low dependency ratio—just 38 (this is the number of dependents, children and people over 65, per 100 working adults; it implies the working-age group is more than twice as large as the rest of the population put together). One study by Australia’s central bank calculated that a third of East Asia’s GDP growth in 1965-90 came from its favourable demography. About a third of America’s GDP growth in 2000-10 also came from its increasing population.
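The dependency-ratio arithmetic in the paragraph above can be made explicit. A small sketch (the function name is mine; the definition is the one the article gives):

```python
# Dependency ratio as the article defines it: dependents (children plus
# people over 65) per 100 working-age adults.
def dependency_ratio(children, over_65, working_age):
    return 100 * (children + over_65) / working_age

# China's figure of 38 means every 100 workers support 38 dependents,
# so the working-age group is 100 / 138, roughly 72%, of the population.
working_share = 100 / (100 + 38)
```

The same formula makes sense of the world figures quoted below: a fall from 75 dependents per 100 workers in 1970 to 52 in 2010 means the working-age share rose from about 57% to about 66% of the total.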

The world as a whole reaped a demographic dividend in the 40 years to 2010. In 1970 there were 75 dependents for every 100 adults of working age. In 2010 the number of dependents dropped to just 52. Huge improvements were registered not only in China but also in South-East Asia and north Africa, where dependency ratios fell by 40 points. Even “ageing” Europe and America ended the period with fewer dependents than at the beginning.

A demographic dividend does not automatically generate growth. It depends on whether the country can put its growing labour force to productive use. In the 1980s Latin America and East Asia had similar demographic patterns. But while East Asia experienced a long boom, Latin America endured its “lost decade”. One of the biggest questions for Arab countries, which are beginning to reap their own demographic dividends, is whether they will follow East Asia or Latin America.

But even if demography guarantees nothing, it can make growth harder or easier. National demographic inheritances therefore matter. And they differ a lot.

Where China loses

Hania Zlotnik, the head of the UN’s Population Division, divides the world into three categories, according to levels of fertility (see map). About a fifth of the world lives in countries with high fertility—3 or more. Most are Africans. Sub-Saharan Africa, for example, is one of the fastest-growing parts of the world. In 1975 it had half the population of Europe. It overtook Europe in 2004, and by 2050 there will be just under 2 billion people there compared with 720m Europeans. About half of the 2.3 billion increase in the world’s population over the next 40 years will be in Africa.

The rest of the world is more or less equally divided between countries with below-replacement fertility (less than 2.1) and those with intermediate fertility (between 2.1 and 3). The first group consists of Europe, China and the rest of East Asia. The second comprises South and South-East Asia, the Middle East and the Americas (including the United States).

The low-fertility countries face the biggest demographic problems. The elderly share of Japan’s population is already the highest in the world. By 2050 the country will have almost as many dependents as working-age adults, and half the population will be over 52. This will make Japan the oldest society the world has ever known. Europe faces similar trends, less acutely. It has roughly half as many dependent children and retired people as working-age adults now. By 2050 it will have three dependents for every four adults, so will shoulder a large burden of ageing, which even sustained increases in fertility would fail to reverse for decades. This has disturbing policy implications for the provision of pensions and health care, which rely on continuing healthy tax revenues from the working population.

At least these countries are rich enough to make such provision. Not so China. With its fertility artificially suppressed by the one-child policy, it is ageing at an unprecedented rate. In 1980 China’s median age (the point where half the population is older and half younger) was 22 years, a developing-country figure. China will be older than America as early as 2020 and older than Europe by 2030. This will bring an abrupt end to its cheap-labour manufacturing. Its dependency ratio will rise from 38 to 64 by 2050, the sharpest rise in the world. Add in the country’s sexual imbalances—after a decade of sex-selective abortions, China will have 96.5m men in their 20s in 2025 but only 80.3m young women—and demography may become the gravest problem the Communist Party has to face.

Many countries with intermediate fertility—South-East Asia, Latin America, the United States—are better off. Their dependency ratios are not deteriorating so fast and their societies are ageing more slowly. America’s demographic profile is slowly tugging it away from Europe. Though its fertility rate may have fallen recently, it is still slightly higher than Europe’s. In 2010 the two sides of the Atlantic had similar dependency rates. By 2050 America’s could be nearly ten points lower.

But the biggest potential beneficiaries are the two other areas with intermediate fertility—India and the Middle East—and the high-fertility continent of Africa. These places have long been regarded as demographic time-bombs, with youth bulges, poverty and low levels of education and health. But that is because they are moving only slowly out of the early stage of high fertility into the one in which lower fertility begins to make an impact.

At the moment, Africa has larger families and more dependent children than India or Arab countries and is a few years younger (its median age is 20 compared with their 25). But all three areas will see their dependency ratios fall in the next 40 years, the only parts of the world to do so. And they will keep their median ages low—below 38 in 2050. If they can make their public institutions less corrupt, keep their economic policies outward-looking and invest more in education, as East Asia did, then Africa, the Middle East and India could become the fastest-growing parts of the world economy within a decade or so.

Here’s looking at you

Demography, though, is not only about economics. Most emerging countries have benefited from the sort of dividend that changed Europe and America in the 1960s. They are catching up with the West in terms of income, family size and middle-class formation. Most say they want to keep their cultures unsullied by the social trends—divorce, illegitimacy and so on—that also affected the West. But the growing number of never-married women in urban Asia suggests that this will be hard.

If you look at the overall size of the world’s population, then, the picture is one of falling fertility, decelerating growth and a gradual return to the flat population level of the 18th century. But below the surface societies are being churned up in ways not seen in the much more static pre-industrial world. The earth’s population may never need a larger island than Maui to stand on. But the way it arranges itself will go on shifting for centuries to come.

Those fast-talking Japanese! And Spanish! (The Christian Science Monitor)

By Ruth Walker / October 13, 2011

It is the universal experience of anyone having a first serious encounter in a language he or she is learning: “Those people talk so fast I will never be able to understand them, let alone hold my own in a conversation.”

The learner timidly poses a carefully rehearsed question about the availability of tickets for tonight’s performance or directions to the museum or whatever, and the response all but gallops out of the mouth of the native speaker like a runaway horse.

Now researchers at the University of Lyon in France have presented findings that provide language learners some validation for their feelings – but only some. The team found that, objectively, some languages are spoken faster than others, in terms of syllables per minute. But there’s a trade-off: Some languages pack more meaning into their syllables.

The key element turns out to be what the researchers call “density.”

Time magazine published a widely reproduced article on the Lyon research, which originally came out in Language, the journal of the Linguistic Society of America. The team in Lyon recruited several dozen volunteers, each a native speaker of one of several common languages: English, French, German, Italian, Japanese, Mandarin Chinese, or Spanish. Vietnamese was used as a sort of control language.

The volunteers read a series of 20 different texts in their respective native tongues into a recorder. The researchers then counted all the syllables in each of the recordings to determine how many syllables per second were spoken in each language. That’s a lot of counting.

Then they analyzed all these syllables for their information density. To mention Time’s examples: “A single-syllable word like bliss, for example, is rich with meaning – signifying not ordinary happiness but a particularly serene and rapturous kind. The single-syllable word to is less information-dense. And a single syllable like the short ‘i’ sound, as in the word jubilee, has no independent meaning at all.”

Here’s where Vietnamese comes in: It turns out to be the gold standard for information density. Who knew? The researchers assigned an arbitrary value of 1 to Vietnamese syllables, and compared other syllables against that standard.

English turns out to have a density of .91 (91 percent as dense as Vietnamese, in other words) and an average speed of 6.19 syllables per second. Mandarin is slightly denser (.94) but has an average speed of 5.18, which makes it the slowest of the group studied.

At the other end of the scale were Spanish, with a density of .63 and a speed of 7.82, and Japanese, with a density of only .49 but a speed of 7.84.

So what makes a language more or less dense? The number of sounds, for one thing. Some languages make do with relatively few consonants and vowels, and so end up with a lot of long words: Hawaiian, for example, whose alphabet has just 13 letters.

English, on the other hand, has a relatively large number of vowels – a dozen, although that varies according to dialect. Chinese uses tones, which help make it a “denser” language. And some languages use more inflections – special endings to indicate gender, number, or status – which English, for instance, largely dispenses with.

The researchers concluded that across the board, speakers of the languages they studied conveyed about the same amount of meaning in the same amount of time, whether by speaking faster or packing more meaning into their syllables.
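That conclusion can be checked with a bit of arithmetic: multiplying each language’s density by its speed gives a rough “information rate” per second. A minimal sketch, using only the figures quoted above (treating information rate as density × speed is an illustration of the trade-off, not the researchers’ exact metric):

```python
# Back-of-the-envelope check of the density/speed trade-off.
# Figures are the densities (relative to Vietnamese = 1) and speeds
# (syllables per second) quoted in the article.
languages = {
    "English":  (0.91, 6.19),
    "Mandarin": (0.94, 5.18),
    "Spanish":  (0.63, 7.82),
    "Japanese": (0.49, 7.84),
}

for name, (density, speed) in languages.items():
    # Information conveyed per second, in "Vietnamese-syllable equivalents".
    rate = density * speed
    print(f"{name}: {rate:.2f}")
```

The resulting rates cluster in a fairly narrow band: fast-but-light Spanish and Japanese end up in the same neighbourhood as slow-but-dense English and Mandarin.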

MP3 Players ‘Shrink’ Our Personal Space (Science Daily)

Science Daily (Oct. 12, 2011) — How close could a stranger come to you before you start feeling uncomfortable? Usually, people start feeling uneasy when unfamiliar people come within an arm’s reach. But take the subway (underground rail) during rush hour and you have no choice but to get up close and personal with complete strangers.

Researchers at Royal Holloway, University of London wanted to find out whether there is a way to make this intrusion more tolerable. Their results, published in the journal PLoS One, reveal that listening to music through headphones can change people’s margins of personal space.

Dr Manos Tsakiris, from the Department of Psychology at Royal Holloway, said: “This distance we try to maintain between ourselves and others is a comfort zone surrounding our bodies. Everyone knows where the boundaries of their personal space are even though they may not consciously dictate them. Of course personal space can be modified for example in a number of relationships including family members and romantic partners, but on a busy tube or bus you can find complete strangers encroaching in this space.”

The study, led by Dr Tsakiris and Dr Ana Tajadura-Jiménez from Royal Holloway, involved asking volunteers to listen to positive or negative emotion-inducing music through headphones or through speakers. At the same time, a stranger started walking towards them and the participants were asked to say “stop” when they started feeling uncomfortable.

The results showed that when participants were listening to music that evoked positive emotions through headphones, they let the stranger come closer to them, indicating a change in their own personal space. Dr Tajadura-Jiménez explains: “Listening to music that induces positive emotions delivered through headphones shifts the margins of personal space. Our personal space “shrinks,” allowing others to get closer to us.”

Dr Tsakiris added: “So next time you are ready to board a packed train, turn on your mp3 player and let others come close to you without fear of feeling invaded.”

Some People’s Climate Beliefs Shift With Weather (Columbia University)

Study Shows Daily Malleability on a Long-Term Question

2011-04-06
Thermometer. Photo by domediart, Flickr

Social scientists are struggling with a perplexing earth-science question: as the evidence for manmade global warming grows stronger, why do opinion polls suggest that public belief in the findings is wavering? Part of the answer may be that some people are too easily swayed by the easiest, most irrational piece of evidence at hand: their own estimation of the day’s temperature.

In three separate studies, researchers affiliated with Columbia University’s Center for Research on Environmental Decisions (CRED) surveyed about 1,200 people in the United States and Australia, and found that those who thought the current day was warmer than usual were more likely to believe in and feel concern about global warming than those who thought the day was unusually cold. A new paper describing the studies appears in the current issue of the journal Psychological Science.

“Global warming is so complex, it appears some people are ready to be persuaded by whether their own day is warmer or cooler than usual, rather than think about whether the entire world is becoming warmer or cooler,” said lead author Ye Li, a postdoctoral researcher at the Columbia Business School’s Center for Decision Sciences, which is aligned with CRED. “It is striking that society has spent so much money, time and effort educating people about this issue, yet people are still so easily influenced.” The study says that “these results join a growing body of work showing that irrelevant environmental information, such as the current weather, can affect judgments. … By way of analogy, when asked about the state of the national economy, someone might look at the amount of money in his or her wallet, a factor with only trivial relevance.”

Ongoing studies by other researchers have already provided strong evidence that opinions on climate and other issues can hinge on factors unrelated to scientific observations. Most pointedly, repeated polls have shown that voters identifying themselves as political liberals or Democrats are far more likely to believe in human-influenced climate change than those who identify themselves as conservatives or Republicans. Women believe more than men, and younger people more than older ones. Other, yet-to-be published studies at four other universities have looked at the effects of actual temperature—either the natural one outside, or within a room manipulated by researchers—and show that real-time thermometer readings can affect people’s beliefs as well. These other studies involve researchers at New York University, Temple University, the University of Chicago and the University of California, Berkeley.

In the current paper, respondents were fairly good at knowing whether the day was unusually hot or cold (their perceptions correlated with reality three quarters of the time), and that perception exerted a powerful pull on their attitudes. As expected, politics, gender and age all had the predicted influences: for instance, on the researchers’ 1-to-4 scale of belief in global warming, Democrats were 1.5 points higher than Republicans. On the whole, though, after controlling for the other factors, the researchers found that perceived temperature still had nearly two-thirds as much power as political belief, and six times as much as gender, to push someone a notch along the scale one way or the other. (The coming NYU/Temple study suggests that those with no strong political beliefs and lower education are the most easily swayed.)

In one of the studies described in the paper, the researchers tried to test the earnestness of the responses by seeing how many of those getting paid $8 for the survey were willing to donate to a real-life charity, Clean Air-Cool Planet. The correlation was strong; those who said it was warmer donated an average of about $2; those who felt it was cooler gave an average of 48 cents.

The researchers say the study not only points to how individuals’ beliefs can change literally with the wind. Li says it is possible that weather may have influenced recent large-scale public opinion polls showing declining faith in climate science. Administered at different times, future ones might turn out differently, he said. These polls, he pointed out, include the national elections, which always take place in November, when things are getting chilly and thus may be empowering conservative forces at a time when climate has become a far more contentious issue than in the past. (Some politicians subsequently played up the heavy snows and cold of winter 2009-2010 as showing global warming was a hoax—even though scientists pointed out that such weather was probably controlled by short-term atmospheric mechanisms, and consistent with long-term warming.) “I’m not sure I’d say that people are manipulated by the weather. But for some percentage of people, it’s certainly pushing them around,” said Li.

The other authors are Eric J. Johnson, co-director of the Center for Decision Sciences; and Lisa Zaval, a Columbia graduate student in psychology.

Original link: http://www.earth.columbia.edu/articles/view/2794

The great difficulty with good hypotheses

“There is one great difficulty with a good hypothesis. When it is completed and rounded, the corners smooth and the content cohesive and coherent, it is likely to become a thing in itself, a work of art. It is then like a finished sonnet or a painting completed. One hates to disturb it. Even if subsequent information should shoot a hole in it, one hates to tear it down because it once was beautiful and whole.”

From The Log from the Sea of Cortez, by John Steinbeck.

I am, therefore I’m right (Christian Science Monitor)

By Jim Sollisch / July 29, 2011

If you’ve ever been on a jury, you might have noticed that a funny thing happens the minute you get behind closed doors. Everybody starts talking about themselves. They say what they would have done if they had been the plaintiff or the defendant. They bring up anecdote after anecdote. It can take hours to get back to the points of law that the judge has instructed you to consider.

Being on a jury (I recently served on my fourth) reminds me why I can’t stomach talk radio. We Americans seem to have lost the ability to talk about anything but our own experiences. We can’t seem to generalize without stereotyping or to consider evidence that goes against our own experience.

I heard a doctor on a radio show the other day talking about a study that found that exercise reduces the incidence of Alzheimer’s. And caller after caller couldn’t wait to make essentially the opposite point: “Well, my grandmother never exercised and she lived to 95, sharp as a tack.” We are in an age summed up by the aphorism: “I experience, therefore I’m right.”

This isn’t a new phenomenon, except by degree. Historically, the hallmarks of an uneducated person were the lack of ability to think critically, to use deductive reasoning, to distinguish the personal from the universal. Now that seems an apt description of many Americans. The culture of “I” is everywhere you look, from the iPod/iPhone/iPad to the fact that memoir is the fastest growing literary genre.

How’d we get here? The same way we seem to get everywhere today: the Internet. The Internet has allowed us to segregate ourselves based on our interests. All cat lovers over here. All people who believe President Obama wasn’t born in the United States over there. For many of us, what we believe has become the most important organizing element in our lives. Once we all had common media experiences: Walter Cronkite, Ed Sullivan, a large daily newspaper. Now each of us can create a personal media network – call it the iNetwork – fed by the RSS feeds of our choosing.

But the Internet doesn’t just cordon us off in our own little pods. It also makes us dumber, as Nicholas Carr points out in his excellent book, “The Shallows: What the Internet is Doing to our Brains.” He argues that the way we consume media changes our brains, not just our behaviors. The Internet rewards shallow thinking: One search leads to thousands of results that skim over the surface of a subject.

Of course, we could dive deeply into any one of the listings, but we don’t. Studies show that people skim online; they don’t read. The experience has been designed to reward speed and variety, not depth. And there is tangible evidence, based on studies of brain scans, that the medium is changing our physical brains, strengthening the synapses and areas used for referential thinking while weakening the areas used for critical thinking.

And when we diminish our ability to think critically, we, in essence, become less educated. Less capable of reflection and meaningful conversation. Our experience, reinforced by a web of other gut instincts and experiences that match our own, becomes evidence. Case in point: the polarization of our politics. Exhibit A: the debt ceiling impasse.

Ironically, the same medium that helped mobilize people in the Arab world this spring is helping create a more rigid, dysfunctional democracy here: one that’s increasingly polarized, where each side is isolated and capable only of sound bites that skim the surface, a culture where deep reasoning and critical thinking aren’t rewarded.

The challenge for most of us isn’t to go backwards: We can’t disconnect from the Internet. Nor would we want to. But we can work harder to make “search” the metaphor it once was: to discover, not just to skim. The Internet lets us find facts in an instant. But it doesn’t stop us from finding insight, if we’re willing to really search.

Jim Sollisch is creative director at Marcus Thomas Advertising.

Dilma Rousseff – a favela with a presidential name (The Guardian)

Renaming of Brazilian shantytown puts spotlight on problems facing country’s 16 million citizens living in extreme poverty

Tom Phillips in Rio de Janeiro ; guardian.co.uk, Monday 27 June 2011 16.58 BST

Three-month old Karen da Silva – the youngest resident of Dilma Rousseff – with her mother, 23-year-old Maria da Paixao Sequeira da Silva. Photograph: Tom Phillips

They call her Dilma Rousseff’s daughter: a dribbling three-month-old girl, coated in puppy fat and smothered by cooing relatives.

But Karen da Silva is no relation of Brazil’s first-ever female president. She is the first child to be born into one of the country’s newest favelas – the Comunidade Dilma Rousseff, a roadside shantytown on the western outskirts of Rio de Janeiro that was recently re-baptised with the name of the most powerful woman in the country.

“She’s Dilma’s baby,” said Vagner Gonzaga dos Santos, a 33-year-old bricklayer-cum-evangelical preacher and the brains behind the decision to change the name of this hitherto unknown favela.

Last month, just as Rousseff was about to complete six months in power, Santos says he received a heaven-sent message suggesting the renaming.

“God lit up my heart,” he said. “The idea was to pay homage to the president and also to get the attention of the government, of our leaders, so they look to us and help the families here. The poor are God’s children too.”

Until recently, the 30-odd shacks that flank the Rio-Sao Paulo highway were known simply as “kilometre 31”. But the transition to Dilma Rousseff has not been entirely smooth.

At first, locals plastered A4 posters on the area’s walls and front doors, announcing the new name. But the posters referred to the Comunidade “Roussef” – one “f” short of the president’s Bulgarian surname. In May a sign was erected welcoming visitors to their shantytown, but again spelling proved an issue. This time the name given was “Dilma Rusself.”

That mistake has now been corrected, after an intervention from the preacher’s wife, who took a pot of red nail varnish to the sign. Locals say the name-change is starting to pay off.

“It’s been good having the president’s name,” said Marlene Silva de Souza, a 57-year-old mother of five and one of the area’s oldest residents. “Now we can say our community’s name with pride. Before we didn’t have a name at all.”

Dozens of Brazilian newspapers have flocked to the community – poking fun at its misspelt sign but also drawing attention to the poor living conditions inside the favela.

“It has brought us a lot of attention … The repercussion has been marvellous. Today things are starting to take shape, things are improving,” said Santos, who hopes local authorities will now formally recognise the favela, bringing public services such as electricity and rubbish collection.

Still, problems abound. Raw sewage trickles out from the houses, through a patchwork of wooden shacks, banana and mango trees and an allotment where onions sprout amid piles of rubbish. Rats and cockroaches proliferate in the wasteland that encircles the area.

Ownership is also an issue. Dilma Rousseff is built on private land – “The owners are Spanish, I think,” says Santos – and on paper the community does not officially exist. Without a fixed abode Karen “Rousseff” da Silva – the favela’s firstborn child – has yet to be legally registered.

Last month the Brazilian government launched a drive to eradicate extreme poverty, unveiling programmes that will target 16 million of Brazil’s poorest citizens.

“My government’s most determined fight will be to eradicate extreme poverty and create opportunities for all,” Rousseff said in her inaugural address in January. “I will not rest while there are Brazilians who have no food on their tables, while there are desperate families on the streets [and] while there are poor children abandoned to their own fate.”

Residents of Rousseff’s namesake, who scratch a living selling biscuits and drinks to passing truck drivers, hope such benefits will soon reach them.

A visit from the president herself may also be on the cards, after Santos launched an appeal in the Brazilian media.

“We dream of her coming one day,” said the preacher, perched on a wooden bench outside his redbrick church, the House of Prayers. “It might be impossible for man to achieve, but for God everything is possible.”

Naming a community

Tear-jerking soap operas, political icons, stars of stage and screen – when it comes to baptising a Brazilian favela, all are fair game. The north-eastern city of Recife is home to favelas called Ayrton Senna, Planet of the Apes and Dancing Days, the title of a popular 1970s telenovela.

In the 1980s residents of a shantytown in Belo Horizonte named their community Rock in Rio – a tribute to the Brazilian rock festival that has played host to acts such as Neil Young, David Bowie and Queen.

Rio de Janeiro is home to the Boogie Woogie favela, the Kinder Egg favela and one community called Disneylandia. Vila Kennedy – a slum in west Rio – was named after the American president John F Kennedy and features a three-metre tall replica of the Statue of Liberty. Nearby, locals christened another hilltop slum Jorge Turco or Turkish George. Jorge was reputedly a benevolent gangster who ruled the community decades ago.

Paul Virilio: “My foreign language is speed, the acceleration of the real” (L.M. DIPLOMATIQUE Brasil)

June 3, 2011

INTERVIEW

“My foreign language is speed, the acceleration of the real”

by Guilherme Soares dos Santos

One of the leading figures in France today, the philosopher and urbanist Paul Virilio occupies a prominent place on the intellectual scene. A prolific writer, he has produced a succession of books, exhibitions and articles, among them Speed and Politics, War and Cinema, The Critical Space, The Vision Machine and, recently, The Great Accelerator [not yet translated into Portuguese], in which he develops a critical, some say “catastrophist”, reading of modern technologies and their accelerating effects on our behaviour and on our perception of the world, at a moment when the world economy depends ever more on investment in technology.

Paul Virilio received the Brazilian philosopher Guilherme Soares dos Santos in Paris and spoke exclusively to Le Monde Diplomatique Brasil about his theses on the race and the logic of speed; seen by many as either a reactionary or a visionary, he also talks about his own life.

VIRILIO: There is one fundamental thing that perhaps explains the “catastrophist” streak I am accused of. I am a child of the war, a war baby, and that is an element that has not been sufficiently understood, because there has been an attempt to make the war forgotten. There are two important moments in the Second World War (I lived through it; I was 10 years old, I was born in 1932). First there was the lightning war, the Blitzkrieg. That aspect was censored, and some historians deny the Blitz, that is, the fact that speed was at the root of the great ruin, first of Poland and then of France. And then that Blitzkrieg exhausted itself in the countries of the east and in the Soviet Union, because there the depth of field made it possible to absorb the blow. I am, therefore, a child of the Blitzkrieg; I would even say that I am perhaps the only one, the only one who since then has never ceased to be marked by the power of speed. It is not only the power of transport, the tanks against the Polish cavalry drawing their sabres against the panzers… There is also the war of the airwaves, of which I am the son: “Pom, pom, pom, pom”… Radio London, which I listened to in the dark with my father. There are two capital moments: the Blitzkrieg and the deportation, which goes, moreover, together with the same movement of invasion. If the war of 1914 was a war of position in which the armies exterminated one another in the same place for years, with the deportation it led to the Shoah in the course of the Second World War.

DIPLOMATIQUE: You are deeply moved by the tragedies of our time. You even wanted a “Museum of Accidents” to be created, after the exhibition on that theme at the Fondation Cartier for Contemporary Art. You have put this idea forward, so far without success with the institutions, and now you propose the creation of a “University of Disaster”. What exactly is that? Might one not think that the war was, for you, a university of disaster?

VIRILIO: Indeed. When I speak of a “University of Disaster” it is not at all the disaster of the university; it is the opposite: I mean that “the worst brings out the best”. The European university appeared in Bologna and elsewhere around 1100, 1200, after the “great fear” of the year 1000, in opposition to the great barbarism. And that university was a Judeo-Christian, Greco-Latin and Arab collective. Some deny the great fear of the first millennium, just as some today deny the Blitz. There is something there that, to my mind, is part of the secret of speed. If “time is money”, speed is power. I recall that for bankers, for there to be surplus value, there must be speed of exchange. The question of speed is a masked question; not masked by a conspiracy, but masked by its simplicity. Wealth and speed are linked. The link between wealth and power is well known, as is the law of the strongest. But the law of the strongest is the law of the fastest. The question of “dromology” is the question of speed, which today has changed in nature. In the beginning, speed is the treasure of the pharaohs; it is hoarding, that is, accumulation, and then, very quickly, it becomes speculation. And there the movement of accumulation passes into acceleration. The two are linked. Accumulation of treasure, which becomes the public treasury, then speculation, and today financialisation, with the automatic high-frequency trading systems that make the stock market blow up. You see, we are facing something extraordinary: we do not know what speed is in our day. People tell me that we need a political economy of speed, and indeed we do, but first we need a dromology, that is, to reveal in political life, in the broad sense of power, the nature of speed in our time. That speed has changed in nature. Essentially it was the revolution of transport. Until the twentieth century, until the Blitz, we saw that the revolution of wealth is a revolution of transport: the horse, the ship, the train, the aeroplane, mechanical signals.

DIPLOMATIQUE: At the end of the twentieth century, we move on not to the revolution of transport but to that of instantaneous transmissions.

VIRILIO: During the war, still a little boy, I took part in the Resistance with my parents thanks to the war of the airwaves, which was already an electromagnetic war. A war of the speed of waves. Marconi and his invention were also a revolution of speed. The speed of electromagnetic waves, that is, of waves travelling at the speed of light, was beginning to be put to work. And, of course, with television, computers and the Internet, we have entered a phase that today reaches its limit: the speed of light, at which human time, the time of negotiation, of speculation, at which the intelligence of man, of the speculator, of the traders, is overtaken by automatisms. Indeed, when the stock market crashed on Wall Street on 6 May last year, there were 23,000 operations within a few thousandths of a second; the system broke down and billions were lost in ten minutes.

DIPLOMATIQUE: What worries you are the limits of human time?

VIRILIO: Yes, we must work on the nature of the power of speed today, because the speed of light is an absolute and is the limit of human time. We are in “machine time”; human time is sacrificed as slaves were once sacrificed in the solar cults of old. I say it: we are in a new Enlightenment in which the speed of light is a cult. It is an absolute power that hides behind progress, and that is why I affirm that speed is the propaganda of progress. I have nothing against progress. When I say that we must “go more slowly”, some mock me. They think I condemn the revolution of transport, of trains, cars and aeroplanes, that I am against computers and against the Internet. That is not the level at which things are at stake…

DIPLOMATIQUE: What you are fighting is the acceleration of the real, which calls into question the perception of sensible appearances and of what phenomenology calls “being-in-the-world”?

VIRILIO: Yes, what the revolution of transport was to the acceleration of history and to migratory movements, the revolution of instantaneous transmissions is to the acceleration of perceived reality. It is a hallucinatory, stupefying event. Speed is an inebriation. An intoxication that can be “scopic” or sonic (hence, moreover, the breaking of the sound barrier). With telecommunications, the impact force of acceleration is used to push through things that are not in public reality, that is, in real public space, but in private reality, or rather are transmitted in real time by private companies. To such a point that the question of imagination, and the philosophical question of “being-in-the-world”, of the here and now, become central. We are thus in the midst of a crisis of science, of what I call the “accident of knowledge”, the “accident of substances” and the “accident of distances”. This question of speed, since Einstein, lies at the heart of relativity, once special and today general, which is about to crash into a wall, what I call the “wall of time”. What happened on Wall Street interests me greatly, because the people of Wall Street crashed into the wall of time and the wall of money. It is a major political phenomenon at a moment when algorithms and computer programs dominate economic life, and I maintain that “special relativity” should be a problem taken up by the State. If the twentieth century was the century of the conquest of air and space, I think the twenty-first century should question itself not only about nanotechnologies but also about nanochronologies, that is, about infinitesimal time, about the conquest of the “infinitely small of time”.

DIPLOMATIQUE: It seems to me that, in your texts, the style always seeks to echo the subject under study, and when you think about speed, it is the writing itself that must go fast.

VIRILIO: Absolutely. And in this sense, as Proust shows, every true writer writes "in a kind of foreign language". My foreign language is speed, the acceleration of the real. As for the speed of my writing, it is the inheritance of the Futurists, and I experience dromology as a musicology. The problem is neither to accelerate nor to decelerate, but to follow a melodic line.

DIPLOMATIQUE: Some say that you practice a "rhapsodic writing".

VIRILIO: It is a matter of rhythm. Ancient societies were rhythmological societies. There was the calendar, the liturgy, the feasts that structured the melodic line of this or that society. Rhythms are very important, you know, for they are a matter of breath. When Bergson and Einstein met, they did not understand each other on this point. The former spoke of "duration", of the living; the latter, of the void and the swift. Know, however, that it will be necessary to reconcile them; otherwise the future of the twenty-first century will be a global chaos worse than Nazism or Communism, one that has nothing to do with anarchy. No, a global chaos worse than anything!

DIPLOMATIQUE: Is that why your thought has become more and more dramatic and religious in recent times?

VIRILIO: I am a Catholic who converted as an adult; that is important. My father was a Communist and my mother a Catholic. It so happens that I knew Abbé Pierre and worker-priests. But I remain alone on my path.

DIPLOMATIQUE: Some criticize you for describing situations that would be fantastical exaggerations, as when you describe this fear of solitude generated by telecommunications, notably by the Internet or the mobile phone. Might you not be engaging in a speculation about "possible worlds", following a kind of "transcendental" method of investigation?

VIRILIO: I want to reunite what has been separated, I mean philosophy and physics. It is fundamentally a matter of a philosophical reinvention to confront this mathematization of the world, this rapidity that outstrips consciousness. I feel myself on the threshold of a philosophy without equal. Like Heraclitus or Parmenides, we are here at the origin; hence the University of Disaster. Its entire work would be a questioning of the "disaster of success". What has just been described is the success of technoscience. Now, it is imperative to reconcile and to launch "philoscience". What is at stake here is the life or death of humanity. If man can no longer speak and transfers the power of enunciation to devices, we then find ourselves facing a tyranny without equal. Physicists who are friends of mine are conscious of this, of what is taking shape, an "accident of knowledge". This leads us to the tree of life, whose only reference is at the origin of Genesis, that is, the myth of life... And here I say it as a Christian and as a writer: the accident of knowledge is original sin.

DIPLOMATIQUE: What does it mean in our time to "be wise", when we are forced into an ever-increasing specialization, as well as into the search for an "automated answer" in search engines and databases that far exceed what individual memory can encompass in a lifetime? How can we cultivate our lucidity in this maddening tide of information?

VIRILIO: As for what touches me most directly, I am an urbanist, which means I work on the habitat. And what is proper to an urbanist is to work on inhabiting, on "being-here". On "being-here-together". That is the habitat: it is the place of our habits. The two maintain a very close bond, namely the possibility of enduring; habit is what reproduces itself. It is "being-together". Not simply the "being-together" of the socius, but the "being-together" of nature in the common habitat with our sister the rain and our brother the sun, as the Franciscans would say... That is architecture! It is sheltering the living.

DIPLOMATIQUE: Faced with the contemporary excess of information, and with the ever-accelerating speed at which events unfold worldwide in the images on our screens, are we not witnessing a veritable deconstruction of general culture?

VIRILIO: You used the word "deconstruction" in your question. I believe Derrida was right for the end of the twentieth century. The beginning of the twentieth century is destruction pure and simple, through the ruin of cities, through the ruin of bodies. It is destruction; one cannot say that Auschwitz or Hiroshima are "deconstructions"... They are pure destructions. Now, I believe, and I say it and write it, that the twenty-first century will be disorientation, that is, the loss of all references, if humanity continues this way, and it will not continue. Therefore I do not believe at all in the end of the world. What I mean is disorientation: we no longer know where we are, neither in space nor in time. And here, the geometer that I am, the architect that I am, knows what orientation is. Architecture comes first; it is composition; it is the common habitat between beings and things. For to be is to "be-in-the-world", and that is what I say: the problem is not to be, but to be-in-the-world, in other words, to be-in-the-territorial-body. This has nothing to do with nationalism. One simply cannot be without "being-in-the-world". In our time, however, the essential takes place in the void. If you look today, power is no longer geopolitical, bound to the soil; it is aeropolitical: waves, airplanes and rockets trace the future. History has transferred itself from earth to sky, with all the mystical dimension this supposes, of adoration of the cosmos, of the great sidereal void, of the waves that propagate, and so on. Historical societies were geopolitical societies, that is, inscribed in places. The event, as I say, "takes place"; hence there is a nature of the place that pertains to the event. And this relation to "taking place" has been occulted. It is such a banal notion... It means, therefore, that I cannot be without having a place. It is not a problem of identity; no, it is a matter of being situated, oriented, in situ, hic et nunc, here and now.

DIPLOMATIQUE: One last question. Do you think the economic system is already being upended by ecology?

VIRILIO: Henceforth economics and ecology must merge, because the world is finite, because the world is too small for progress. We have exhausted the matter of the world, we have polluted its substance and we have polluted its distances. And we stand before the imminent fusion of ecology and economics. One sees clearly the difficulties surrounding the climate summit in Copenhagen. One sees clearly how difficult it is, in the midst of an economic crisis, in the United States and in the world, to take ecological measures. Therefore, inevitably, the fact that the Earth is too small for speed, for the speed of progress, demands the fusion of the two. Hence the importance of a University of Disaster and the reinvention of a thought, of a collective intellectual, as the university of the origins once was. It is indispensable. Until now it does not exist; we are at the origin of a new world. And I would like to be younger so as to live through this New World that will be born in the pain of confrontation. But it is indispensable. No man, whatever his political color, is equal to this event, which resembles the Italian Renaissance... And at the same time it is so exciting! It is marvelous! As Karl Kraus said: "What a great epoch!"

Guilherme Soares dos Santos is a philosopher with a master's degree in political philosophy and ethics from the Université Paris-Sorbonne; he is currently a doctoral candidate in contemporary philosophy at the Université Paris 8, where he studies the thought of Gilles Deleuze.

Photo and translation: Guilherme Soares dos Santos

Keywords: Paul Virilio, speed, transformation, race, culture, theory