
Scientists struggle with limits – and risks – of advocacy (eenews.net)

Monday, July 9, 2012

Paul Voosen, E&E reporter

Jon Krosnick has seen the frustration etched into the faces of climate scientists.

For 15 years, Krosnick has charted the rising public belief in global warming. Yet, as the field’s implications became clearer, action has remained elusive. Science seemed to hit the limits of its influence. It is a result that has prompted some researchers to cross their world’s no man’s land — from advice to activism.

As Krosnick has watched climate scientists call for government action, he began pondering a recent small dip in the public’s belief. And he wondered: Could researchers’ move into the political world be undermining their scientific message?

Jon Krosnick
Stanford’s Jon Krosnick has been studying the public’s belief in climate change for 15 years, but only recently did he decide to probe their reaction to scientists’ advocacy. Photo courtesy of Jon Krosnick.

“What if a message involves two different topics, one trustworthy and one not trustworthy?” said Krosnick, a communication and psychology professor at Stanford University. “Can the general public detect crossing that line?”

His results, not yet published, would seem to say they can.

Using a national survey, Krosnick has found that, among low-income and low-education respondents, climate scientists suffered damage to their trustworthiness and credibility when they veered from describing science into calling viewers to ask the government to halt global warming. And not only did trust in the messenger fall — even the viewers’ belief in the reality of human-caused warming dropped steeply.

It is a warning that, even as the frustration of inaction mounts and the politicization of climate science deepens, researchers must be careful in getting off the political sidelines.

“The advice that comes out of this work is that all of us, when we claim to have expertise and offer opinions on matters [in the world], need to be guarded about how far we’re willing to go,” Krosnick said. Speculation, he added, “could compromise everything.”

Krosnick’s survey is just the latest social science revelation that has reordered how natural scientists understand their role in the world. Many of these lessons have stemmed from the public’s and politicians’ reactions to climate change, which has provided a case study of how science communication works and doesn’t work. Complexity, these researchers have found, does not stop at their discipline’s verge.

For decades, most members of the natural sciences held a simple belief that the public stood lost, holding out empty mental buckets for researchers to fill with knowledge, if they could only get through to them. But, it turns out, not only are those buckets already full of a mix of ideology and cultural belief, but it is incredibly fraught, and perhaps ineffective, for scientists to suggest where those contents should be tossed.

It’s been a difficult lesson for researchers.

“Many of us have been saddened that the world has done so little about it,” said Richard Somerville, a meteorologist at the Scripps Institution of Oceanography and former author of the United Nations’ authoritative report on climate change.

“A lot of physical climate scientists, myself included, have in the past not been knowledgeable about what the social sciences have been saying,” he added. “People who know a lot about the science of communication … [are] on board now. But we just don’t see that reflected in the policy process.”

While not as outspoken as NASA’s James Hansen, who has taken a high-profile moral stand alongside groups like 350.org and Greenpeace, Somerville has been a leader in bringing scientists together to call for greenhouse gas reductions. He helped organize the 2007 Bali declaration, a pointed letter from more than 200 scientists urging negotiators to limit global CO2 levels well below 450 parts per million.

Such declarations, in the end, have done little, Somerville said.

“If you look at the effect this has had on the policy process, it is very, very small,” he said.

This failed influence has spurred scientists like Somerville to partner closely with social scientists, seeking to understand why their message has failed. It is an effort that received a seal of approval this spring, when the National Academy of Sciences, the nation’s premier research body, hosted a two-day meeting on the science of science communication. Many of those sessions pivoted on public views of climate change.

It’s a discussion that’s been long overdue. When it comes to how the public learns about expert opinions, assumptions mostly rule in the sciences, said Dan Kahan, a professor of law and psychology at Yale Law School.

“Scientists are filled with conjectures that are plausible about how people make sense about information,” Kahan said, “only some fraction of which [are] correct.”

Shifting dynamic

Krosnick’s work began with a simple, hypothetical scene: NASA’s Hansen, whose scientific work on climate change is widely respected, walks into the Oval Office.

As he has since the 1980s, Hansen rattles off the incontrovertible, ever-increasing evidence of human-caused climate change. It’s a stunning litany, authoritative in scope, and one the fictional president — be it a Bush or an Obama — must judge against Hansen’s scientific credentials, backed by publications and institutions of the highest order. If Hansen stops there, one might think, the case is made.

But he doesn’t stop. Hansen continues, arguing, as a citizen, for an immediate carbon tax.

“Whoa, there!” Krosnick’s president might think. “He’s crossed into my domain, and he’s out of touch with how policy works.” And if Hansen is willing to offer opinions where he lacks expertise, the president starts to wonder: “Can I trust any of his work?”

Richard Somerville
Part of Scripps’ legendary climate team — Charles David Keeling was an early mentor — Richard Somerville helped organize the 2007 Bali declaration by climate scientists, calling for government action on CO2 emissions. Photo by Sylvia Bal Somerville.

Researchers have studied the process of persuasion for 50 years, Krosnick said. Over that time, a few vital truths have emerged, including that trust in a source matters. But looking back over past work, Krosnick found no answer to this question. The treatment was simplistic. Messengers were either trustworthy or not. No one had considered the case of two messages, one trusted and one shaky, from the same person.

The advocacy of climate scientists provided an excellent path into this shifting dynamic.

Krosnick’s team hunted down video of climate scientists first discussing the science of climate change and then, in the same interview, calling for viewers to pressure the government to act on global warming. (Out of fears of bruised feelings, Krosnick won’t disclose the specific scientists cited.) They cut the video in two edits: one showing only the science, and one showing the science and then the call to arms.

Krosnick then showed a nationally representative sample of 793 Americans one of three videos: the science-only cut, the science-plus-politics cut, or a control video about baking meatloaf (the latter being closer to politics than Krosnick might admit). The viewers were then asked a series of questions about both their opinion of the scientist’s credibility and their overall beliefs on global warming.

For a cohort of 548 respondents who either had a household income under $50,000 or no more than a high school diploma, the results were stunning and statistically significant. Across the board, the move into politics undermined the science.

The viewers’ trust in the scientist dropped 16 percentage points, from 48 to 32 percent. Their belief in the scientist’s accuracy fell from 47 to 36 percent. Their overall trust in all scientists went from 60 to 52 percent. Their belief that government should “do a lot” to stop warming fell from 62 to 49 percent. And their belief that humans have caused climate change fell 14 percentage points, from 81 to 67 percent.
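The figures above are percentage-point drops; as a quick sanity check, the arithmetic can be laid out in a few lines of Python (the before/after values come from the article, but the dictionary labels are mine, not the study’s own variable names):

```python
# Before/after percentages for the low-income/low-education cohort,
# as reported in the article. Labels are illustrative, not the study's.
results = {
    "trust in the scientist": (48, 32),
    "belief in the scientist's accuracy": (47, 36),
    "trust in all scientists": (60, 52),
    "government should 'do a lot'": (62, 49),
    "humans have caused climate change": (81, 67),
}

for measure, (before, after) in results.items():
    drop = before - after  # drop in percentage points
    print(f"{measure}: {before}% -> {after}% (down {drop} points)")
```

Each drop is a difference in percentage points, not a percent change: trust falling from 48 to 32 percent is a 16-point drop, though it is a one-third decline in relative terms.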

Krosnick is quick to note the study’s caveats. First, educated or wealthy viewers had no significant reaction to the political call and seemed able to parse the difference between science and a personal political view. The underlying reasons for the drop are far from clear, as well — it could simply be a function of climate change’s politicization. And far more testing needs to be done to see whether this applies in other contexts.

With further evidence, though, the implications could be widespread, Krosnick said.

“Is it the case that the principle might apply broadly?” he asked. “Absolutely.”

‘Fraught with misadventure’

Krosnick’s study is likely rigorous and useful — he is known for his careful methods — but it still carries with it a simple, possibly misleading frame, several scientists said.

Most of all, it remains hooked to a premise that words float straight from the scientist’s lips to the public’s ears. The idea that people learn directly from scientists, or that they are simply misunderstanding scientific conclusions, is not how reality works, Yale’s Kahan said.

“The thing that goes into the ear is fraught with misadventure,” he said.

Kahan has been at the forefront of charting how the empty-bucket theory of science communication — called the deficit model — fails. People interpret new information within the context of their own cultural beliefs, peers and politics. They use their reasoning to pick the evidence that supports their views, rather than the other way around. Indeed, recent work by Kahan found that higher-educated respondents were more likely to be polarized than their less-educated peers.

Krosnick’s study will surely spur new investigations, Kahan said, though he resisted definite remarks until he could see the final work. Even if the study’s conditions aren’t fully realistic, a simple model can have “plenty of implications for all kinds of ways [in] which people become exposed to science,” he said.

The survey sits well with other research in the field and carries an implication about what role scientists should play in scientific debates, added Matthew Nisbet, a communication professor at American University.

“As soon as you start talking about a policy option, you’re presenting information that is potentially threatening to people’s values or identity,” he said. The public, he added, doesn’t “view scientists and scientific information in a vacuum.”

The deficit model has remained an enduring frame for scientists, many of whom are just becoming aware of social science work on the problem. Kahan compares it to the stages of grief. The first stage was the belief that the truth just needs to be broadcast to change minds. The second, still influential in the scientific world, is that if the message is just simplified and the right images used, then the deficit will be filled.

“That too, I think, is a stage of misperception about how this works,” Kahan said.

Take the hand-wringing about science education that accompanied a recent poll finding that 46 percent of Americans believed in a creationist origin for humans. It’s a result that speaks to belief, not an understanding of evolution. Many surveyed who believed in evolution would still fail to explain natural selection, mutation or genetic variance, Kahan said, just as they don’t have to understand relativity to use their GPS.

Much of science doesn’t run up against the public’s belief systems and is accepted with little fuss. It’s not as if Louis Pasteur had to sell pasteurization by using slick images of children getting sick; for nearly all of society, it was simply a useful tool. People want to defer to the experts, as long as they don’t have to concede their beliefs on the way.

“People know what’s known without having a comprehension of why that’s the truth,” Kahan said.

There remains a danger in the emerging consensus that all scientific knowledge is filtered by the motivated reasoning of political and cultural ideology, Nisbet added. Not all people can be sorted by two, or even four, variables.

“In the new ideological deficit model, we tend to assume that failures in communication are caused by conservative media and conservative psychology,” he said. “The danger in this model is that we define the public in exclusively binary terms, as liberals versus conservatives, deniers versus believers.”

‘Crossing that line’

So why do climate scientists, more than most fields, cross the line into advocacy?

Most of all, it’s because their scientific work tells them the problem is so pressing, and so time-dependent, given the centuries-long life span of CO2 emissions, Somerville said.

“You get to the point where the emissions are large enough that you’ve run out of options,” he said. “You can no longer limit [it]. … We may be at that point already.”

There may also be less friction for scientists to suggest communal solutions to warming because, as Nisbet’s work has found, scientists tend to skew more liberal than the general population, with more than 50 percent of one U.S. science society self-identifying as “liberal.” Given this outlook, they are more likely to accept efforts like cap and trade, a policy that, in implying a “cap” on activity, rubbed conservatives wrong.

Dan Kahan
A prolific law professor and psychologist at Yale, Dan Kahan has been charting how the public comes to, and understands, science. Photo courtesy of Dan Kahan.

“Not a lot of scientists would question if this is an effective policy,” Nisbet said.

It is not that scientists are unaware that they are moving into policy prescription, either. Most would intuitively know the line between their work and its political implications.

“I think many are aware when they’re crossing that line,” said Roger Pielke Jr., an environmental studies professor at the University of Colorado, Boulder, “but they’re not aware of the consequences [of] doing so.”

This willingness to cross into advocacy could also stem from the fact that it is the next logical skirmish. The battle for public opinion on the reality of human-driven climate change is already over, Pielke said, “and it’s been won … by the people calling for action.”

While there are slight fluctuations in public belief, in general a large majority of Americans side with what scientists say about the existence and causes of climate change. It’s not unanimous, he said, but it’s larger than the numbers who supported actions like the Montreal Protocol, the bank bailout or the Iraq War.

What has shifted has been its politicization: As more Republicans have begun to disbelieve global warming, Democrats have rallied to reinforce the science. And none of it is about the actual science, of course, a fact Scripps’ Somerville now understands. The debate is a code, expressing fear of the policies that could follow if the science is accepted.

Doubters of warming don’t just hear the science. A policy is attached to it in their minds.

“Here’s a fact,” Pielke said. “And you have to change your entire lifestyle.”

For all the focus on how scientists talk to the public — whether Hansen has helped or hurt his cause — Yale’s Kahan ultimately thinks the discussion will mean very little. Ask most of the public who Hansen is, and they’ll mention something about the Muppets. It can be hard to accept, for scientists and journalists, but their efforts at communication are often of little consequence, he said.

“They’re not the primary source of information,” Kahan said.

‘A credible voice’

Like many of his peers, Somerville has suffered for his acts of advocacy.

“We all get hate email,” he said. “I’ve given congressional testimony and been denounced as an arrogant elitist hiding behind a discredited organization. Every time I’m on national news, I get a spike in ugly email. … I’ve received death threats.”

There are also pressures within the scientific community. As an elder statesman, Somerville does not have to worry about his career. But he tells young scientists to keep their heads down, working on technical papers. There is peer pressure to stay out of politics, a tension felt even by Somerville’s friend, the late Stephen Schneider, also at Stanford, who was long one of the country’s premier speakers on climate science.

He was publicly lauded, but many in the climate science community grumbled, Somerville said, that Schneider should “stop being a motormouth and start publishing technical papers.”

But there is a reason tradition has sustained the distinction between advising policymakers and picking solutions, one Krosnick’s work seems to ratify, said Michael Mann, a climatologist at Pennsylvania State University and a longtime target of climate contrarians.

“It is thoroughly appropriate, as a scientist, to discuss how our scientific understanding informs matters of policy, but … we should stop short of trying to prescribe policy,” Mann said. “This distinction is, in my view, absolutely critical.”

Somerville still supports the right of scientists to speak out as concerned citizens, as he has done, and as his friend, NASA’s Hansen, has done more stridently, protesting projects like the Keystone XL pipeline. As long as great care is taken to separate the facts from the political opinion, scientists should speak their minds.

“I don’t think being a scientist deprives you of the right to have a viewpoint,” he said.

Somerville often returns to a quote from the late Sherwood Rowland, a Nobel laureate from the University of California, Irvine, who discovered the threat chlorofluorocarbons posed to ozone: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

Somerville asked Rowland several times whether the same held for global warming.

“Yes, absolutely,” he replied.

It’s an argument that Krosnick has heard from his own friends in climate science. But often this fine distinction gets lost in translation, as advocacy groups present the scientist’s personal message as the message of “science.” It’s tempting to offer advice — Krosnick feels it himself when reporters call — but restraint may need to rule.

“In order to preserve a credible voice in public dialogue,” Krosnick said, “it might be that scientists such as myself need to restrain ourselves from speaking as public citizens.”

Broader efforts of communication, beyond scientists, could still mobilize the public, Nisbet said. Leave aside the third of the population who are in denial or alarmed about climate change, he said, and figure out how to make it relevant to the ambivalent middle.

“We have yet to really do that on climate change,” he said.

Somerville is continuing his efforts to improve communication from scientists. Another Bali declaration is unlikely, though. What he’d really like to do is get trusted messengers from different moral realms beyond science — leaders like the Dalai Lama — to speak repeatedly on climate change.

It’s all Somerville can do. It would be too painful to accept the other option, that climate change is like racism, war or poverty — problems the world has never abolished.

“[It] may well be that it is a problem that is too difficult for humanity to solve,” he said.

Irony Seen Through the Eye of MRI (Science Daily)

ScienceDaily (Aug. 3, 2012) — In the cognitive sciences, the capacity to interpret the intentions of others is called “Theory of Mind” (ToM). This faculty is involved in the understanding of language, in particular by bridging the gap between the meaning of the words that make up a statement and the meaning of the statement as a whole.

In recent years, researchers have identified the neural network dedicated to ToM, but no one had yet demonstrated that this set of neurons is specifically activated by the process of understanding of an utterance. This has now been accomplished: a team from L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) has shown that the activation of the ToM neural network increases when an individual is reacting to ironic statements.

Published in Neuroimage, these findings represent an important breakthrough in the study of Theory of Mind and linguistics, shedding light on the mechanisms involved in interpersonal communication.

In our communications with others, we are constantly thinking beyond the basic meaning of words. For example, if asked, “Do you have the time?” one would not simply reply, “Yes.” The gap between what is said and what it means is the focus of a branch of linguistics called pragmatics. In this science, “Theory of Mind” (ToM) gives listeners the capacity to fill this gap. In order to decipher the meaning and intentions hidden behind what is said, even in the most casual conversation, ToM relies on a variety of verbal and non-verbal elements: the words used, their context, intonation, “body language,” etc.

Within the past 10 years, researchers in cognitive neuroscience have identified a neural network dedicated to ToM that includes specific areas of the brain: the right and left temporal parietal junctions, the medial prefrontal cortex and the precuneus. To identify this network, the researchers relied primarily on non-verbal tasks based on the observation of others’ behavior[1]. Today, researchers at L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) have established, for the first time, the link between this neural network and the processing of implicit meanings.

To identify this link, the team focused their attention on irony. An ironic statement usually means the opposite of what is said. In order to detect irony in a statement, the mechanisms of ToM must be brought into play. In their experiment, the researchers prepared 20 short narratives in two versions, one literal and one ironic. Each story contained a key sentence that, depending on the version, yielded an ironic or literal meaning. For example, in one of the stories an opera singer exclaims after a premiere, “Tonight we gave a superb performance.” Depending on whether the performance was in fact very bad or very good, the statement is or is not ironic.

The team then carried out functional magnetic resonance imaging (fMRI) analyses on 20 participants who were asked to read 18 of the stories, chosen at random, in either their ironic or literal version. The participants were not aware that the test concerned the perception of irony. The researchers had predicted that the participants’ ToM neural networks would show increased activity in reaction to the ironic sentences, and that was precisely what they observed: as each key sentence was read, the network activity was greater when the statement was ironic. This shows that this network is directly involved in the processes of understanding irony, and, more generally, in the comprehension of language.
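The design described above (18 of 20 stories per participant, each shown at random in either its literal or ironic version) can be sketched as a simple randomization routine. This is an illustrative reconstruction of the assignment logic, not the L2C2 team’s actual code:

```python
import random

def assign_stories(n_stories=20, n_shown=18, seed=None):
    """Pick n_shown of n_stories at random and give each a random version.

    Returns a list of (story_index, version) pairs, where version is
    either "literal" or "ironic".
    """
    rng = random.Random(seed)
    chosen = rng.sample(range(n_stories), n_shown)  # no story repeats
    return [(story, rng.choice(["literal", "ironic"])) for story in chosen]

# One participant's assignment: 18 distinct stories, mixed versions.
assignment = assign_stories(seed=42)
print(assignment[:3])
```

Because each participant sees any given story in only one version, the literal and ironic readings of the same key sentence can be compared across participants rather than within one.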

Next, the L2C2 researchers hope to expand their research on the ToM network in order to determine, for example, whether test participants would be able to perceive irony if this network were artificially inactivated.

Note:

[1] For example, Grèzes, Frith & Passingham (J. Neuroscience, 2004) showed a series of short (3.5 second) films in which actors came into a room and lifted boxes. Some of the actors were instructed to act as though the boxes were heavier (or lighter) than they actually were. Having thus set up deceptive situations, the experimenters asked the participants to determine if they had or had not been deceived by the actors in the films. The films containing feigned actions elicited increased activity in the rTPJ (right temporal parietal junction) compared with those containing unfeigned actions.

Journal Reference:

Nicola Spotorno, Eric Koun, Jérôme Prado, Jean-Baptiste Van Der Henst, Ira A. Noveck. Neural evidence that utterance-processing entails mentalizing: The case of irony. NeuroImage, 2012; 63 (1): 25. DOI: 10.1016/j.neuroimage.2012.06.046

Multiple Husbands Serve as Child Support and Life Insurance in Some Cultures (Science Daily)

ScienceDaily (Aug. 2, 2012) — Marrying multiple husbands at the same time, or polyandry, creates a safety net for women in some cultures, according to a recent study by a University of Missouri researcher. Extra husbands ensure that women’s children are cared for even if their fathers die or disappear. Although polyandry is taboo and illegal in the United States, certain legal structures, such as child support payments and life insurance, fill the same role for American women that multiple husbands do in other cultures.

Marrying multiple husbands at the same time, or polyandry, creates a safety net for women in some cultures, according to a recent study by Kathrine Starkweather, anthropology doctoral student in MU’s Department of Anthropology. (Credit: Image courtesy of University of Missouri-Columbia)

“In America, we don’t meet many of the criteria that tend to define polyandrous cultures,” said Kathrine Starkweather, doctoral student in MU’s Department of Anthropology in the College of Arts and Science. “However, some aspects of American life mirror polyandrous societies. Child support payments provide for offspring when one parent is absent. Life insurance allows Americans to provide for dependents in the event of death, just as secondary husbands support a deceased husband’s children in polyandrous societies.”

Starkweather and her co-author, Raymond Hames, professor of anthropology at the University of Nebraska, examined 52 cultures with traditions of polyandry from all continents except Europe. They found that similar conditions seemed to influence cultures toward polyandry. Males frequently outnumbered females in these cultures, as a result of high mortality prior to adulthood. Although males out-numbered females, they also were more likely to die in warfare or hunting and fishing accidents or to be absent for other economic reasons. Polyandrous cultures also tended to be small scale and egalitarian.

In approximately half of the cultures studied, the other husbands were closely related to the first husband, a practice with economic repercussions. In previously studied polyandrous cultures, especially those of Nepal, Tibet and India, inheritance traditions called for land to be divided evenly among male offspring after a parent’s passing. That practice would have resulted in land being sub-divided into useless parcels too small to provide enough crops to feed a family. However, if several brothers married the same wife, the family farm would stay intact. In the small egalitarian cultures Starkweather studied, land and property ownership was unusual. In these societies, younger brothers in the marriage often protected and provided food for the family in the absence of the older brother, who was often the primary husband.

“This research shows that humans are capable of tremendous variability and adaptability in their behaviors,” said Starkweather. “Human marriage structures aren’t written in stone; throughout history, people have adapted their societal norms to ensure the survival and well-being of their children.”

Journal Reference:

Katherine E. Starkweather, Raymond Hames. A Survey of Non-Classical Polyandry. Human Nature, 2012; 23 (2): 149. DOI: 10.1007/s12110-012-9144-x

*   *   *

Multiple Fathers Prevalent in Amazonian Cultures, Study Finds

ScienceDaily (Nov. 11, 2010) — In modern culture, it is not considered socially acceptable for married people to have extramarital sexual partners. However, in some Amazonian cultures, extramarital sexual affairs were common, and people believed that when a woman became pregnant, each of her sexual partners would be considered part-biological father.

Now, a new University of Missouri study published in the journal Proceedings of the National Academy of Sciences has found that up to 70 percent of Amazonian cultures may have believed in the principle of multiple paternity.

“In these cultures, if the mother had sexual relations with multiple men, people believed that each of the men was, in part, the child’s biological father,” said Robert Walker, assistant professor of Anthropology in the College of Arts and Science. “It was socially acceptable for children to have multiple fathers, and secondary fathers often contributed to their children’s upbringing.”

Walker says sexual promiscuity was normal and acceptable in many traditional South American societies. He says married couples typically lived with the wife’s family, which he says increased their sexual freedom.

“In some Amazonian cultures, it was bad manners for a husband to be jealous of his wife’s extramarital partners,” Walker said. “It was also considered strange if you did not have multiple sexual partners. Cousins were often preferred partners, so it was especially rude to shun their advances.”

Previous research had uncovered the existence of multiple paternity in some Amazonian cultures. However, anthropologists did not realize how many societies held the belief. Walker’s team analyzed ethnographies (descriptive accounts of particular cultures) of 128 societies across lowland South America, which includes Brazil and many of the surrounding countries. Multiple paternity is reported to appear in 53 societies, and singular paternity is mentioned in 23 societies. Ethnographies for 52 societies do not mention conception beliefs.

Walker’s team has several hypotheses on the benefits of multiple paternity. Women believed that by having multiple sexual partners they gained the benefit of larger gene pools for their children. He says women benefited from the system because secondary fathers gave gifts and helped support the child, which has been shown to increase child survival rates. In addition, brutal warfare was common in ancient Amazonia, and should the mother become a widow, her child would still have a father figure.

Men benefited from the multiple paternity system because they were able to formalize alliances with other men by sharing wives. Walker hypothesizes that multiple paternity also strengthened family bonds, as brothers often shared wives in some cultures.

Walker collaborated with Mark Flinn, professor in the MU Department of Anthropology, and Kim Hill, professor in Arizona State University’s School of Human Evolution and Social Change.

Journal Reference:

R. S. Walker, M. V. Flinn, K. R. Hill. Evolutionary history of partible paternity in lowland South America. Proceedings of the National Academy of Sciences, 2010; 107 (45): 19195. DOI: 10.1073/pnas.1002598107

Mapping the Future of Climate Change in Africa (Science Daily)

ScienceDaily (Aug. 2, 2012) — Our planet’s changing climate is devastating communities in Africa through droughts, floods and myriad other disasters.

Children in the foothills of Drakensberg mountains in South Africa who still live in traditional rondavels on family homesteads. (Credit: Todd G. Smith, CCAPS Program)

Using detailed regional climate models and geographic information systems, researchers with the Climate Change and African Political Stability (CCAPS) program developed an online mapping tool that analyzes how climate and other forces interact to threaten the security of African communities.

The program was piloted by the Robert S. Strauss Center for International Security and Law at The University of Texas at Austin in 2009 after receiving a $7.6 million five-year grant from the Minerva Initiative with the Department of Defense, according to Francis J. Gavin, professor of international affairs and director of the Strauss Center.

“The first goal was to look at whether we could more effectively identify what were the causes and locations of vulnerability in Africa, not just climate, but other kinds of vulnerability,” Gavin said.

CCAPS comprises nine research teams focusing on various aspects of climate change, their relationship to different types of conflict, the government structures that exist to mitigate them, and the effectiveness of international aid in intervening. Although most CCAPS researchers are based at The University of Texas at Austin, the Strauss Center also works closely with Trinity College Dublin, the College of William and Mary, and the University of North Texas.

“In the beginning these all began as related, but not intimately connected, topics,” Gavin said, “and one of the really impressive things about the project is how all these different streams have come together.”

Africa is particularly vulnerable to the effects of climate change due to its reliance on rain-fed agriculture and the inability of many of its governments to help communities in times of need.

The region is of increasing importance for U.S. national security, according to Gavin, because of the growth of its population, economic strength and resource importance, and also due to concerns about non-state actors, weakening governments and humanitarian disasters.

Although these issues are too complex to yield a direct causal link between climate change and security concerns, he said, understanding the levels of vulnerability that exist is crucial in comprehending the full effect of this changing paradigm.

The vulnerability mapping program within CCAPS is led by Joshua Busby, assistant professor at the Lyndon B. Johnson School of Public Affairs.

To determine the vulnerability of a given location based on changing climate conditions, Busby and his team looked at four different sources: 1) the degree of physical exposure to climate hazards, 2) population size, 3) household or community resilience, and 4) the quality of governance or presence of political violence.

The first source records the different types of climate hazards which could occur in the area, including droughts, floods, wildfires, storms and coastal inundation. However, their presence alone is not enough to qualify a region as vulnerable.

The second source — population size — determines the number of people who will be impacted by these climate hazards. More people create more demand for resources, potentially making the entire population more vulnerable.

The third source looks at how resilient a community is to adverse effects, analyzing the quality of their education and health, as well as whether they have easy access to food, water and health care.

“If exposure is really bad, it may exceed the capacity of local communities to protect themselves,” Busby said, “and then it comes down to whether or not the governments are going to be willing or able to help them.”

The final source accounts for the effectiveness of a given government, the amount of accountability present, how integrated it is with the international community, how politically stable it is, and whether there is any political violence present.

Busby and his team combined the four sources of vulnerability and gave them each equal weight, adding them together to form a composite map. Their scores were then divided into a ranking of five equal parts, or quintiles, going from the 20 percent of regions with the lowest vulnerability to the 20 percent with the highest.
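The equal-weight composite and quintile ranking described above can be sketched as follows. This is a minimal illustration: the region names and scores are invented, and the actual CCAPS normalization of its four data sources is more involved than a simple average.

```python
# Sketch of an equal-weight composite vulnerability index with quintile
# ranking, loosely following the CCAPS description. All scores invented.

def composite(sources):
    """Average four equally weighted vulnerability sources, each on 0-1."""
    exposure, population, resilience, governance = sources
    return (exposure + population + resilience + governance) / 4.0

# Hypothetical regions: (climate exposure, population pressure,
# lack of household resilience, poor governance), each normalized to 0-1.
regions = {
    "A": (0.9, 0.8, 0.7, 0.9),
    "B": (0.2, 0.3, 0.1, 0.2),
    "C": (0.5, 0.6, 0.4, 0.5),
    "D": (0.7, 0.2, 0.9, 0.6),
    "E": (0.1, 0.1, 0.3, 0.2),
}

composites = {name: composite(s) for name, s in regions.items()}

# Sort by composite score and split into five equal parts (quintiles);
# quintile 5 holds the most vulnerable 20 percent of regions.
ranked = sorted(composites, key=composites.get)
quintile = {name: (i * 5) // len(ranked) + 1 for i, name in enumerate(ranked)}
```

On this invented data, region A lands in the top (most vulnerable) quintile and region E in the bottom one.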

The researchers gathered information for the tool from a variety of sources, including historic models of physical exposure from the United Nations Environment Programme (UNEP), population estimates from LandScan, as well as household surveys and governance assessments from the World Bank’s World Development and Worldwide Governance Indicators.

This data reflects past and present vulnerability, but to understand which places in Africa would be most vulnerable to future climate change, Busby and his team relied on the regional climate model simulations designed by Edward Vizy and Kerry Cook, both members of the CCAPS team from the Jackson School of Geosciences.

Vizy and Cook ran three 20-year nested simulations of the African continent’s climate at regional scales of 90 and 30 kilometers, using a derivation of the Weather Research and Forecasting Model of the National Center for Atmospheric Research. One was a control simulation representative of the years 1989-2008, and the others represented the climate as it may exist in 2041-2060 and 2081-2100.

“We’re adjusting the control simulation’s CO2 concentration, model boundary conditions, and sea surface temperatures to increased greenhouse gas forcing scenario conditions derived from atmosphere-ocean global climate models. We re-run the simulation to understand how the climate will operate under a different, warmer state at spatial resolutions needed for regional impact analyses,” Vizy said.

Each simulation took two months to complete on the Ranger supercomputer at the Texas Advanced Computing Center (TACC).

“We couldn’t run these simulations without the high-performance computing resources at TACC; it would just take too long. If it takes two months running with 200 processors, I can’t fathom doing it with one processor,” Vizy said.

Researchers input data from these vulnerability maps into an online mapping tool developed by the CCAPS program to integrate its various lines of climate, conflict and aid research. CCAPS’s current mapping tool is based on a prototype developed by the team to assess conflict patterns in Africa with the help of researchers at the TACC/ACES Visualization Laboratory (Vislab), according to Ashley Moran, program manager of CCAPS.

“The mapping tool is a key part of our effort to produce new research that could support policy making and the work of practitioners and governments in Africa,” Moran said. “We want to communicate this research in ways that are of maximum use to policymakers and researchers.”

The initial prototype of the mapping tool used the ArcGIS platform to project data onto maps. Working with its partner Development Gateway, CCAPS expanded the system to incorporate conflict, vulnerability, governance and aid research data.

After completing the first version of their model, Busby and his team carried out the process of ground truthing their maps by visiting local officials and experts in several African countries, such as Kenya and South Africa.

“The experience of talking with local experts was tremendously gratifying,” Busby said. “They gave us confidence that the things we’re doing in a computer lab setting in Austin do pick up on some of the ground-level expert opinions.”

Busby and his team complemented their maps with local perspectives on the kind of impact climate was already having, leading to new insights that could help perfect the model. For example, local experts felt the model did not address areas with chronic water scarcity, an issue the researchers then corrected upon returning home.

According to Busby, the vulnerability maps serve as focal points that invite further analysis of the issues they illustrate.

Some of the countries most vulnerable to climate change include Somalia, Sierra Leone, Guinea, Sudan and parts of the Democratic Republic of Congo. Knowing this allows local policymakers to develop security strategies for the future, including early warning systems against floods, investments in drought-resistant agriculture, and alternative livelihoods that might facilitate resource sharing and help prevent future conflicts. The next iteration of the online mapping tool to be released later this year will also incorporate the future projections of climate exposure from the models developed by Vizy and Cook.

The CCAPS team publishes its research in journals such as Climate Dynamics and The International Studies Review, carries out regular consultations with the U.S. government and governments in Africa, and participates in conferences sponsored by concerned organizations such as the United Nations and the United States Africa Command.

“What this project has shown us is that many of the real challenges of the 21st century aren’t always in traditional state-to-state interactions, but are transnational in nature and require new ways of dealing with them,” Gavin said.

Teen Survival Expectations Predict Later Risk-Taking Behavior (Science Daily)

ScienceDaily (Aug. 1, 2012) — Some young people’s expectations that they will not live long, healthy lives may actually foreshadow such outcomes.

New research published August 1 in the open access journal PLOS ONE reports that, for American teens, the expectation of death before the age of 35 predicted increased risk behaviors, including substance abuse and suicide attempts, later in life and a doubling to tripling of mortality rates in young adulthood.

The researchers, led by Quynh Nguyen of Northeastern University in Boston, found that one in seven participants in grades 7 to 12 reported perceiving a 50-50 chance or less of surviving to age 35. Upon follow-up interviews over a decade later, the researchers found that low expectations of longevity at young ages predicted increased suicide attempts and suicidal thoughts as well as heavy drinking, smoking, and use of illicit substances later in life relative to their peers who were almost certain they would live to age 35.

“The association between early survival expectations and detrimental outcomes suggests that monitoring survival expectations may be useful for identifying at-risk youth,” the authors state.

The study compared data collected from 19,000 adolescents in 1994-1995 to follow-up data collected from the same respondents 13-14 years later. The cohort was part of the National Longitudinal Study of Adolescent Health (Add Health), conducted by the Carolina Population Center and funded by the National Institutes of Health and 23 other federal agencies and foundations.

Journal Reference:

Quynh C. Nguyen, Andres Villaveces, Stephen W. Marshall, Jon M. Hussey, Carolyn T. Halpern, Charles Poole. Adolescent Expectations of Early Death Predict Adult Risk Behaviors. PLoS ONE, 2012; 7 (8): e41905. DOI: 10.1371/journal.pone.0041905

Brain Imaging Can Predict How Intelligent You Are: ‘Global Brain Connectivity’ Explains 10 Percent of Variance in Individual Intelligence (Science Daily)

ScienceDaily (Aug. 1, 2012) — When it comes to intelligence, what factors distinguish the brains of exceptionally smart humans from those of average humans?

New research suggests as much as 10 percent of individual variances in human intelligence can be predicted based on the strength of neural connections between the lateral prefrontal cortex and other regions of the brain. (Credit: WUSTL Image / Michael Cole)

As science has long suspected, overall brain size matters somewhat, accounting for about 6.7 percent of individual variation in intelligence. More recent research has pinpointed the brain’s lateral prefrontal cortex, a region just behind the temple, as a critical hub for high-level mental processing, with activity levels there predicting another 5 percent of variation in individual intelligence.

Now, new research from Washington University in St. Louis suggests that another 10 percent of individual differences in intelligence can be explained by the strength of neural pathways connecting the left lateral prefrontal cortex to the rest of the brain.
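As a rough guide to what “10 percent of variance” means here: for a single linear predictor, the share of variance explained is the squared Pearson correlation between predictor and outcome. The snippet below is a generic illustration with invented numbers, not the study’s actual analysis.

```python
# "Percent of variance explained" for a single linear predictor is the
# squared Pearson correlation between predictor and outcome. The figures
# below are invented for illustration; this is not the study's data.

from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A correlation of about 0.32 between connectivity strength and test
# scores corresponds to roughly 10 percent of variance explained,
# because 0.32 squared is about 0.10.
variance_explained = 0.32 ** 2  # ~0.10
```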

Published in the Journal of Neuroscience, the findings establish “global brain connectivity” as a new approach for understanding human intelligence.

“Our research shows that connectivity with a particular part of the prefrontal cortex can predict how intelligent someone is,” suggests lead author Michael W. Cole, PhD, a postdoctoral research fellow in cognitive neuroscience at Washington University.

The study is the first to provide compelling evidence that neural connections between the lateral prefrontal cortex and the rest of the brain make a unique and powerful contribution to the cognitive processing underlying human intelligence, says Cole, whose research focuses on discovering the cognitive and neural mechanisms that make human behavior uniquely flexible and intelligent.

“This study suggests that part of what it means to be intelligent is having a lateral prefrontal cortex that does its job well; and part of what that means is that it can effectively communicate with the rest of the brain,” says study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences and of neuroscience and radiology in the School of Medicine. Braver is a co-director of the Cognitive Control and Psychopathology Lab at Washington University, in which the research was conducted.

One possible explanation of the findings, the research team suggests, is that the lateral prefrontal region is a “flexible hub” that uses its extensive brain-wide connectivity to monitor and influence other brain regions in a goal-directed manner.

“There is evidence that the lateral prefrontal cortex is the brain region that ‘remembers’ (maintains) the goals and instructions that help you keep doing what is needed when you’re working on a task,” Cole says. “So it makes sense that having this region communicating effectively with other regions (the ‘perceivers’ and ‘doers’ of the brain) would help you to accomplish tasks intelligently.”

While other regions of the brain make their own special contribution to cognitive processing, it is the lateral prefrontal cortex that helps coordinate these processes and maintain focus on the task at hand, in much the same way that the conductor of a symphony monitors and tweaks the real-time performance of an orchestra.

“We’re suggesting that the lateral prefrontal cortex functions like a feedback control system that is used often in engineering, that it helps implement cognitive control (which supports fluid intelligence), and that it doesn’t do this alone,” Cole says.

The findings are based on an analysis of functional magnetic resonance brain images captured as study participants rested passively and also when they were engaged in a series of mentally challenging tasks associated with fluid intelligence, such as indicating whether a currently displayed image was the same as one displayed three images ago.

The study replicated previous findings relating lateral prefrontal cortex activity to performance on challenging tasks. The researchers then assessed connectivity while participants rested, and associated the estimated connectivity with performance on additional tests of fluid intelligence and cognitive control collected outside the brain scanner.

Results indicate that levels of global brain connectivity with a part of the left lateral prefrontal cortex serve as a strong predictor of both fluid intelligence and cognitive control abilities.

Although much remains to be learned about how these neural connections contribute to fluid intelligence, new models of brain function suggested by this research could have important implications for the future understanding — and perhaps augmentation — of human intelligence.

The findings also may offer new avenues for understanding how breakdowns in global brain connectivity contribute to the profound cognitive control deficits seen in schizophrenia and other mental illnesses, Cole suggests.

Other co-authors include Tal Yarkoni, PhD, a postdoctoral fellow in the Department of Psychology and Neuroscience at the University of Colorado at Boulder; Grega Repovs, PhD, professor of psychology at the University of Ljubljana, Slovenia; and Alan Anticevic, an associate research scientist in psychiatry at Yale University School of Medicine.

Funding from the National Institute of Mental Health supported the study (National Institutes of Health grants MH66088, NR012081, MH66078, MH66078-06A1W1, and 1K99MH096801).

Bureaucracy and invisible violence (Canal Ibase)

Renzo Taddei – Canal Ibase columnist

August 2, 2012

Last week’s cover story of Time magazine calls attention to striking figures on suicide among U.S. military personnel. Since 2004, more American service members have died by suicide than have been killed in combat in Afghanistan. On average, one active-duty American soldier commits suicide every day. Among veterans, a suicide occurs every 80 minutes. Between 2004 and 2008, the military suicide rate grew 80 percent; in 2012 alone it has already grown 18 percent. Suicide has overtaken car accidents as the leading cause of death for service members outside combat.

Photo: Matthew C. Moeller (Flickr)

The American army, understandably worried, is trying to identify the causes of the problem – so far without success. The problem is far from obvious, however. A third of those who killed themselves never went to Afghanistan or Iraq. Forty-three percent were deployed only once. Only 8.5 percent were deployed three times or more. And most were married. In other words, not all of the suicides are related to battlefield trauma.

As one might expect, the military bureaucracy looks for a bureaucratic diagnosis, so that the solution can be bureaucratic – and so that it is not necessary to dig very deep into the question. The American army does not have enough psychiatrists and social workers. Many soldiers kill themselves during the long wait for a psychiatric appointment; others do so after being prescribed sleeping pills and officially diagnosed as “not a danger to themselves or others.” Military culture stigmatizes displays of weakness, so many avoid seeking help in time. Widows accuse the army of negligence; military officers say the soldiers kill themselves over marital problems.

While I was reflecting on the subject, someone recommended a book called Days of Destruction, Days of Revolt, by the American journalist Chris Hedges. The book describes the situation of some of the poorest cities in the United States and concludes that their poverty has nothing to do with the idea of underdevelopment, but rather with what might be called counter-development: these are cities that were destroyed by capitalist exploitation.

One of those cities, Camden, New Jersey, is an old acquaintance of mine: during my doctorate in the United States I worked as a photographer to supplement my income, and I went to Camden several times. The explicit signs of the place’s decay always struck me: people living in ruined buildings; public facilities falling apart; drug dealing in broad daylight. Now I learn that it is nothing less than the city with the lowest per capita income in the country.

Hedges calls such cities capitalism’s sacrifice zones. That is, so that capitalist exploitation can proceed unimpeded, capital moves from one place to another as soon as resources or opportunities are exhausted, leaving behind ghost towns, unemployment and depression. The logic of this pattern of exploitation has been well known at least since Marx. What Hedges does, with the help of the graphic artist and fellow journalist Joe Sacco, is give new visibility to a problem that the official bureaucracy and the media make a point of not seeing.

What is the relation between the military suicides and urban poverty in the United States? I realized that there is a fundamental analogy between the two cases: both combine the fact that, for the system to work – and we are talking about a different system in each case – someone has to be sacrificed, with the requirement that this sacrifice and its victims remain invisible to most of the population. The United States’ effort to maintain its military hegemony systematically produces the death of an immense number of people, among Americans and their supposed enemies. And, so that profitability stays high, forests, cities and jobs are also systematically destroyed. One of the expressions the social sciences use to describe this state of affairs is structural violence.

The invisibility of these things is indispensable – only then can well-intentioned people of good faith take part in the perverse system without seeing its perversity. That is why, for example, the administration of George H. W. Bush arranged a pact with the American press not to publish photos of the coffins of soldiers killed in combat in the first Gulf War. The pact remained in force for almost twenty years, until it was undone by Obama in 2009.

But the most common, and most effective, way of producing the forms of structural violence that invisibly reproduce inequality is bureaucracy. As David Graeber reminds us, this is because it is bureaucracy’s function to ignore the minutiae of everyday life and reduce everything to mechanical and statistical formulas. That allows us to focus our energies on a smaller number of variables, and thus accomplish grand and incredible things – for good and for ill. The role bureaucracy plays in producing the invisibility that keeps structural violence running can be illustrated by the use of statistics in public policy. One of today’s most important official support programs for the rural population of the Brazilian Northeast, Garantia Safra – in which small farmers buy insurance and are indemnified if their harvest fails – systematically excludes farmers through bureaucratic myopia. For the farmers of a municipality to receive the indemnity, the program’s rules require a 50 percent loss of the harvest of the entire municipality. Yet one need only look at the size and contours of Brazilian municipalities to conclude that there is no necessary relation between municipal boundaries and meteorological phenomena. Some municipalities are so vast that they show dramatic climatic variation within their borders. In such cases, it is common for many farmers with large losses to receive no indemnity at all, if other regions of the municipality had smaller losses. Why must the municipality be taken as the unit of reference here? Because there is a municipal bureaucratic apparatus to manage the program, and there are no official bureaucratic levels at any smaller scale. In other words, the system is stupid even if no one in it is, and the farmers suffer the consequences.
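The Garantia Safra payout rule discussed above can be sketched in a few lines. The district names and loss figures below are invented for illustration; the point is only that averaging over the whole municipality can mask a severe local loss.

```python
# Sketch of the Garantia Safra trigger described above: indemnity requires
# a 50 percent crop loss averaged over the whole municipality, so severe
# losses in one district can be masked. District figures are invented.

THRESHOLD = 0.50  # municipal-level loss fraction required for payout

# Fractional crop loss per district of one hypothetical municipality.
districts = {"north": 0.90, "center": 0.30, "south": 0.20}

# The program evaluates the municipality as a single unit.
municipal_loss = sum(districts.values()) / len(districts)
payout = municipal_loss >= THRESHOLD

# Farmers in the north lost 90 percent of their crop, yet nobody is
# indemnified: the municipal average (about 47 percent) misses the cutoff.
```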

Correspondingly, national or state indices of unemployment, GDP growth and GDP per capita are central units of reference for current public policy, even though they are averages that do not take into account the extreme situations where socioeconomic vulnerability actually exists. It is as if the saying that “the rope always breaks at its weakest point” were systematically ignored. The vulnerability of any system – a machine, for example – is defined by its most fragile component. Any engineer knows this; in fact, the idea is so obvious that anyone knows it. This is where bureaucracy comes in: in this context, it matters little what people know or do not know, because they will not be able to identify how bureaucracy produces inconsistencies and structural violence unless they are directly affected. In this way, cities like Camden stay systematically off the radar, camouflaged by state- or national-level statistics.

All of this is related to another news item in last week’s papers: Brazil’s position in the UN debates on regulating the global arms trade. Despite evidence that weapons manufactured in Brazil were and continue to be sold to governments with a record of human rights violations, Brazil set itself squarely against regulation and against the creation of mechanisms that would bring transparency to this market. The justification, as could only be the case, is bureaucratic: disseminating information about military capability “could expose the resources and the capacity of countries […] to sustain a prolonged conflict.” Putting this forward as an argument that takes precedence over the need to protect human rights is a scandal. Behind this flimsy excuse lies the intention to protect Brazil’s lucrative arms industry. What makes the whole story harder to swallow is the fact that Dilma Rousseff was herself a victim of torture during the period when Brazil was ruled by the military bureaucracy. How can the same president who created the Truth Commission condone an industry and a market stained with blood?

This episode shows that, in ethical terms, there is less difference between the United States and Brazil than Brazilians like to believe. To protect capitalism – no longer on a field of ideological struggle, as in the Cold War era, but in the form of the real, specific private interests of American companies – the United States has become a danger not only to vulnerable non-aligned nations but to itself, as the epidemic of military suicides reveals. In the same way, and for the same reasons – that is, on its march toward consolidation as an imperialist power – Brazil worries about its political dead while strategically pretending not to see that, to fatten its GDP and keep its arms industry prosperous, an immense number of lives – in Africa, in the Middle East, in southern Pará and in Rio’s hillside favelas – is being sacrificed.

Renzo Taddei is a professor at the School of Communication of the Federal University of Rio de Janeiro. He holds a doctorate in anthropology from Columbia University, in New York, and works in the social studies of science and technology.

Apple’s bad example (Mundo Sustentável)

Technology
August 1, 2012 – 9:36 am

by André Trigueiro*

Electronic waste: Apple loses the environmental certification of 39 computer models.

The fastest-growing category of waste in the world is electronic waste, that is, batteries and everything that needs electricity to work (computers, televisions, stereos, and so on). The obsessive launching of new products and the shortening of planned obsolescence (equipment designed not to last) are responsible for a “tsunami” of e-waste that has already surpassed 50 million tons a year worldwide.

To reduce the volume of waste – and make it easier to reuse or recycle components – the United States created an environmental certification for electronic products (EPEAT) that rewards, among other things, energy efficiency, easier disassembly of equipment after disposal, and safe segregation of toxic components.

According to a Wall Street Journal report, the U.S. government requires that 95 percent of electronic products purchased with public funds be certified to EPEAT standards. Large companies such as Ford and HSBC also follow the certification, and 222 of the most important American universities give preference to EPEAT-certified computers.

The same report says that at the end of June an Apple employee told EPEAT’s executive director, Robert Frisbee, that the company’s design direction was no longer compatible with EPEAT’s requirements, and for that reason asked that 39 desktop computers, monitors and laptops (including some MacBook Pro and MacBook Air models) be removed from the list of products subject to environmental certification.

It was the second time in less than three months that Apple disappointed those of its followers most attuned to sustainability issues. In April the company appeared as a villain in a Greenpeace report that assessed the energy sources most used by the IT giants. The report, How Clean Is Your Cloud?, found that more than half of the energy that keeps Apple’s infrastructure running comes from fossil fuels, chiefly coal.

Completing the wave of bad news reaching Steve Jobs’s apple, a recent report by the Center for Disposal and Reuse of Electronic Waste at the University of São Paulo (USP) assessed the efforts of companies operating in Brazil to recover equipment discarded as waste. Under the current National Solid Waste Law, these companies are required to carry out reverse logistics, that is, to recover these products when they are discarded. Apple (together with Samsung, Sony, IBM, Proviews and Brother) appears on the report’s blacklist, precisely among the companies that neither offer to pick up the waste (when the user is ready to discard it) nor accept it when it is brought to one of their stores.

It is a pity to learn all of this after already owning an iPhone.

If Apple does not demonstrate its commitment to social and environmental values convincingly, it will be my last apple-flavored tablet.

PS: This space is entirely available for Apple to make whatever remarks it wishes.

André Trigueiro is a journalist with a graduate degree in Environmental Management from Coppe-UFRJ, where he now teaches Environmental Geopolitics; professor and creator of the Environmental Journalism course at PUC-RJ; author of the book Mundo Sustentável – Abrindo Espaço na Mídia para um Planeta em Transformação; and editorial coordinator and co-author of the books Meio Ambiente no Século XXI and Espiritismo e Ecologia, the latter published by Editora FEB at the Rio de Janeiro International Book Biennial in 2009. He anchors Jornal das Dez and is editor-in-chief of the program Cidades e Soluções on Globo News. He is also a commentator for Rádio CBN and a volunteer contributor to Rádio Rio de Janeiro.

* Originally published on the Mundo Sustentável website.

Bill bans the use of animals in research that causes suffering (Agência Câmara)

JC e-mail 4551, July 31, 2012.

Under analysis in Brazil’s Chamber of Deputies is Bill 2905/11, by Deputy Roberto De Lucena (PV-SP), which bans the use of animals in research when they would be subjected to any kind of physical or psychological suffering.

The ban applies to studies related to the production of cosmetics, perfumes, personal hygiene products, household cleaners, laundry products, office supplies, sunscreens, and vitamins and supplements.

Currently the Environmental Crimes Law (9,605/98), which defines punishments for activities harmful to the environment, criminalizes only painful or cruel experiments on live animals, even for educational or scientific purposes, when alternative methods exist.

Under the bill, whoever fails to comply will be subject to the penalties of the environmental crimes law. Anyone who causes animal suffering during research could face three months to a year in prison, plus a fine.

Universal Declaration – The bill’s author notes that the Universal Declaration of Animal Rights, established by the United Nations Educational, Scientific and Cultural Organization (UNESCO) in 1978, holds that experiments causing physical or psychological suffering violate animal rights, and that alternative methods must be developed and systematically implemented.

“The ideal would be to have alternative techniques to the use of animals in all teaching and research activity. The cure for many diseases depends on medical research that uses animals and cannot yet be done by alternative methods. But what can be said, however, of research related, for example, to the production of cosmetics? Cosmetics are not products essential to human life and health. There is, in this case, no justification for tolerating the suffering of thousands of animals,” the lawmaker said.

Procedure – The bill is proceeding jointly with Bill 4548/98 and eight other proposals, which are ready for a floor vote in the Chamber.

Psychological Abuse and Youth Anxiety and Depression (Science Daily)

Psychological Abuse Puts Children at Risk

ScienceDaily (July 30, 2012) — Child abuse experts say psychological abuse can be as damaging to a young child’s physical, mental and emotional health as a slap, punch or kick.

While difficult to pinpoint, it may be the most challenging and prevalent form of child abuse and neglect, experts say in an American Academy of Pediatrics (AAP) position statement on psychological maltreatment in the August issue of the journal Pediatrics.

Psychological abuse includes acts such as belittling, denigrating, terrorizing, exploiting, emotional unresponsiveness, or corrupting a child to the point a child’s well-being is at risk, said Dr. Harriet MacMillan, a professor in the departments of psychiatry and behavioural neurosciences and pediatrics of McMaster University’s Michael G. DeGroote School of Medicine and the Offord Centre for Child Studies. One of three authors of the position statement, she holds the David R. (Dan) Offord Chair in Child Studies at McMaster.

“We are talking about extremes and the likelihood of harm, or risk of harm, resulting from the kinds of behavior that make a child feel worthless, unloved or unwanted,” she said, giving the example of a mother leaving her infant alone in a crib all day or a father involving his teenager in his drug habit.

A parent raising their voice to a strident pitch after asking a child for the eighth time to put on their running shoes is not psychological abuse, MacMillan said. “But, yelling at a child every day and giving the message that the child is a terrible person, and that the parent regrets bringing the child into this world, is an example of a potentially very harmful form of interaction.”

Psychological abuse was described in the scientific literature more than 25 years ago, but it has been under-recognized and under-reported, MacMillan said, adding that its effects “can be as harmful as other types of maltreatment.”

The report says that because psychological maltreatment interferes with a child’s development path, the abuse has been linked with disorders of attachment, developmental and educational problems, socialization problems and disruptive behaviour. “The effects of psychological maltreatment during the first three years of life can be particularly profound.”

This form of mistreatment can occur in many types of families, but is more common in homes with multiple stresses, including family conflict, mental health issues, physical violence, depression or substance abuse.

Although there are few studies reporting the prevalence of psychological abuse, the position statement says large population-based, self-report studies in Britain and the United States found that approximately eight to nine per cent of women and four per cent of men reported exposure to severe psychological abuse during childhood.

The statement says pediatricians need to be alert to the possibility of psychological abuse even though there is little evidence on potential strategies that might help. It suggests collaboration among pediatric, psychiatric and child protective service professionals is essential for helping the child at risk.

Funders for the paper’s development included the Family Violence Prevention Unit of the Public Health Agency of Canada.

Along with MacMillan, the statement was prepared by Indiana pediatrician Dr. Roberta Hibbard, an expert on child abuse and neglect; Jane Barlow, professor of Public Health in the Early Years at the University of Warwick; as well as the Committee on Child Abuse and Neglect and the American Academy of Child and Adolescent Psychiatry, Child Maltreatment and Violence Committee.

*   *   *

Emotion Detectives Uncover New Ways to Fight Off Youth Anxiety and Depression

ScienceDaily (July 30, 2012) — Emotional problems in childhood are common. Approximately 8 to 22 percent of children suffer from anxiety, often combined with other conditions such as depression. However, most existing therapies are not designed to treat coexisting psychological problems and are therefore not very successful in helping children with complex emotional issues.

To develop a more effective treatment for co-occurring youth anxiety and depression, University of Miami psychologist Jill Ehrenreich-May and her collaborator Emily L. Bilek analyzed the efficacy and feasibility of a novel intervention created by the researchers, called Emotion Detectives Treatment Protocol (EDTP). Preliminary findings show a significant reduction in the severity of anxiety and depression after treatment, as reported by the children and their parents.

“We are very excited about the potential of EDTP,” says Ehrenreich-May, assistant professor of psychology in the College of Arts and Sciences at UM and principal investigator of the study. “Not only could the protocol better address the needs of youth with commonly co-occurring disorders and symptoms, it may also provide additional benefits to mental health professionals,” she says. “EDTP offers a more unified approach to treatment that, we hope, will allow for an efficient and cost-effective treatment option for clinicians and clients alike.”

The Emotion Detectives Treatment Protocol is an adaptation of two treatment protocols developed for adults and adolescents, the Unified Protocols. The program implements age-appropriate techniques that deliver education about emotions and how to manage them, strategies for evaluating situations, problem-solving skills, behavioral activation (a technique to reduce depression), and parent training.

In the study, 22 children ages 7 to 12 with a principal diagnosis of anxiety disorder and secondary issues of depression participated in a 15-session weekly group therapy of EDTP. Among participants who completed the protocol (18 out of 22), 14 no longer met criteria for anxiety disorder at post-treatment. Additionally, among participants who were assigned a depressive disorder before treatment (5 out of 22), only one participant continued to meet such criteria at post-treatment.

Unlike results from previous studies, the presence of depressive symptoms did not predict poorer treatment response. The results also show a high percentage of attendance. The findings imply that EDTP may offer a better treatment option for children experiencing anxiety and depression.

“Previous research has shown that depressive symptoms tend to weaken treatment response for anxiety disorders. We were hopeful that a broader, more generalized approach would better address this common co-occurrence,” says Bilek, doctoral candidate in clinical psychology at UM and co-author of the study. “We were not surprised to find that the EDTP had equivalent outcomes for individuals with and without elevated depressive symptoms, but we were certainly pleased to find that this protocol may address this important issue.”

The study, titled “The Development of a Transdiagnostic, Cognitive Behavioral Group Intervention for Childhood Anxiety Disorders and Co-Occurring Depression Symptoms,” is published online ahead of print in the journal Cognitive and Behavioral Practice. The next step is for the team to conduct a randomized controlled trial comparing the EDTP to another group treatment protocol for anxiety disorder.

Mário Scheffer: “We are living through an unprecedented crisis in the response to the HIV/AIDS epidemic” (viomundo.com.br)

July 31, 2012

by Conceição Lemes

Mário Scheffer: “The leadership is conservative and outdated. Creativity, boldness and permanent dialogue with civil society have given way to arrogance”

The 19th International AIDS Conference ended this Friday in Washington, United States. Brazil’s National STD/AIDS Program, until then celebrated and held up as a model for the world, drew criticism from specialists throughout the week.

“The success story of the Brazilian AIDS program has gone into decline due to factors such as the withdrawal of international funding and the weakening of the relationship between the government and civil society,” says Eduardo Gomez, a researcher at Rutgers University–Camden, in New Jersey. “Historically, the Brazilian AIDS program had a strong connection with NGOs, but now they lack resources and motivation. The government needs them to raise awareness among hard-to-reach populations.”

“Increasing pressure from religious groups and the scaling back of prevention campaigns aimed at the highest-risk populations are the greatest threat to the Brazilian anti-AIDS program,” says Massimo Ghidinelli, HIV/AIDS coordinator at the Pan American Health Organization (PAHO). “It seems that in recent years religious groups have grown stronger, and the program addresses issues of homophobia and sexuality with less intensity.”

Yesterday, Thursday the 26th, Brazilian activists attending the 19th International AIDS Conference in Washington protested in front of the Ministry of Health’s booth against what they describe as a “retreat in the response to the epidemic.” Their goal, they said, was to show the world that the country “is no longer the same” and “lives off past success” in fighting the disease.

“Until now, the criticism came mainly from Brazilian NGOs and activists. Now it comes from renowned foreign specialists,” observes Mário Scheffer, president of Grupo Pela Vidda-SP. “The Brazilian AIDS program has stood still in time and is no longer a source of national pride. We have suffered a series of accumulated losses. We are living through an unprecedented crisis in the response to the HIV/AIDS epidemic.”

An activist for more than 20 years and a professor in the Department of Preventive Medicine at the University of São Paulo (USP) School of Medicine, Mário has followed the HIV/AIDS epidemic since its beginning in the 1980s. Beyond his sharp eye and his public health expertise, he knows the entire trajectory of the National STD/AIDS Program well. Hence this interview:

Viomundo – The 19th International AIDS Conference began on Sunday (22) and ended today (27) in Washington. Over the course of the week, a number of criticisms were made of the current state of the Brazilian AIDS program. Do you agree with them?

Mário Scheffer – Absolutely. Until now, the criticism came mainly from Brazilian NGOs. Now it comes from renowned foreign specialists. That is the clearest proof that the Brazilian program is no longer the leading international reference: we have lost our leadership and our originality, and we no longer dare to mount the exceptional responses that marked our history of fighting AIDS.

Viomundo – AIDS NGOs always had good channels of dialogue with the Ministry of Health. What happened?

Mário Scheffer – The pioneering NGOs and activists, who are obviously more critical, are no longer heard. The government now chooses the interlocutors that suit it best and delegitimizes many of those who made historic contributions.

A sign that things are not going well here at all is that both the criticism of the program and the recognition of Brazilian NGOs and activists have to come from abroad.

In fact, World Bank president Jim Yong Kim, in his speech opening the International AIDS Conference last Sunday in Washington, vigorously praised activists and specifically mentioned Brazilian NGOs. He said that if it is possible today to speak of controlling the epidemic and to glimpse its end, that is fundamentally due to the actions of these activists.

Viomundo – AIDS NGOs are closing their doors in Brazil. Why?

Mário Scheffer – Several reasons: staffing, financial and sustainability crises. They have no physical headquarters, no money to pay rent and phone bills, and have to fill out boards with only three people because no one else is available. Nor can they still assemble teams to carry out projects and reach vulnerable populations, something only NGOs are able to do.

In other words: some NGOs are closing their doors, as you said. But all of them are also scaling back their activities.

Viomundo – But aren’t the criticisms due to more than just the financial and staffing crisis of the AIDS NGOs?

Mário Scheffer – That is only one facet of the unprecedented crisis in the Brazilian response to the epidemic, which has also lost ground technically. Moreover, the government lacks the sensitivity and determination to recognize the NGOs’ crisis and help overcome it. Quite the opposite: there is now a political crisis in the relationship, and even contempt for the NGOs’ history. The federal government has opted, and not only in the AIDS field, for a parochial relationship with civil society, a policy of co-optation and makeshift arrangements. There is no longer criticism or qualified debate of ideas. We have suffered a series of accumulated losses.

Viomundo – Which ones?

Mário Scheffer – First, we lost the strength of volunteer work, through which people took part in our NGOs, expressed their solidarity, and donated time, labor and talent to the fight against AIDS. It is no longer a mobilizing cause, and that has to do with the image cultivated by the government that we have the best program in the world and that everything here is solved.

Second, with the rise of sham and crooked NGOs, created to feed corruption in various ministries, prejudice grew and additional barriers were imposed on serious organizations, which already had difficulty accessing public funds.

Provided it is done with clear criteria, transparency, public bidding and rigorous accountability, NGOs should have the right to access public funds in order to exercise oversight of, and participation in, public policy, as happens in many democracies.

Third, given the image that Brazil is now a rich country and has solved its AIDS problem (which is not true), international support for Brazilian AIDS NGOs has dried up.

The result: without help from communities and companies, and with a cause that no longer touches the hearts of donors and volunteers, we face growing difficulty in securing the institutional resources needed to keep the NGOs going. As a consequence, our activism and our oversight of public policy have waned.

Viomundo – And government funding tied to specific projects?

Mário Scheffer – It is part of an exhausted model in which AIDS NGOs were reduced to cheap labor for delivering services that the Ministry of Health and the state and municipal health departments cannot provide. On top of that, states and municipalities often fail to pass those funds on to the NGOs, and when they do, there is no continuity and no evaluation of the effectiveness of the funded actions.

Viomundo – A little earlier you said the Brazilian AIDS program has lost ground technically. In what way?

Mário Scheffer – There has been no renewal or updating of the technical staff. Today’s challenges are different, but the leadership is conservative and outdated. Creativity, boldness and permanent dialogue with civil society have given way to arrogance. Without the strength and autonomy of the past, the AIDS programs, the national one and several state and municipal ones, are isolated and politically weakened within their governments.

In São Paulo, for example, many municipal AIDS services have no doctors, while the state services are overcrowded, are being privatized and are closing beds, and the AIDS programs have no say over any of it.

The national program no longer even calls the shots on the domestic production of generic antiretrovirals; today the process lacks transparency. The Ministry of Health does not take a single step without the blessing of the Casa Civil and of the religious fundamentalists in the governing coalition, which stalls AIDS prevention programs.

Viomundo – What do AIDS NGOs and activists want?

Mário Scheffer – We want to be respected and heard, but on a new footing. No one has given up the fight. Our NGOs want to keep working on their many fronts: prevention, care in support houses, legal aid, and the defense of the rights of people living with HIV. We want to carry on the same activism that won universal access to medicines, overturned patents, fought the exclusion of coverage by private health plans, reached the vulnerable and raised them to the condition of citizens.

The same activism that leads us to point out that, contrary to what is claimed, access to antiretrovirals in Brazil is not universal, since late diagnosis is extremely common and occasional supply shortages still occur. That leads us to say there is no prevention policy suited to an epidemic concentrated in certain populations, such as gay men, currently the group most neglected by AIDS prevention in Brazil.

Today, the essential principles that forged the fight against AIDS in Brazil, a fight that once broke down barriers and taboos, are under threat. That necessary boldness has given way to a lifeless, cowardly program that practices self-censorship and aligns itself with retrograde forces, as in the recent case of the campaign aimed at gay men.

A program that dwells on past glories and displays real incapacity, slowness, and a loss of technical and political capability. It has not managed to respond to the new dynamics and challenges of the epidemic, and the international community has begun to notice.

At this moment of great change, with concrete hope of curing and controlling AIDS, new tools for prevention, and the need to expand testing and treatment to everyone infected, Brazil is paralyzed, its indicators of mortality and of new HIV infections stagnant. The Brazilian AIDS program has stood still in time and is no longer a source of national pride.

Modern culture emerged in Africa 20,000 years earlier than thought (L.A.Times)

By Thomas H. Maugh II

July 30, 2012, 1:54 p.m.

Border Cave artifacts: Objects found in the archaeological site called Border Cave include a) a wooden digging stick; b) a wooden poison applicator; c) a bone arrow point decorated with a spiral incision filled with red pigment; d) a bone object with four sets of notches; e) a lump of beeswax; and f) ostrich eggshell beads and marine shell beads used as personal ornaments. (Francesco d’Errico and Lucinda Backwell / July 30, 2012)

Modern culture emerged in southern Africa at least 44,000 years ago, more than 20,000 years earlier than anthropologists had previously believed, researchers reported Monday.

That blossoming of technology and art occurred at roughly the same time that modern humans were migrating from Africa to Europe, where they soon displaced Neanderthals. Many of the characteristics of the ancient culture identified by anthropologists are still present in hunter-gatherer cultures of Africa today, such as the San culture of southern Africa, the researchers said.

The new evidence was provided by an international team of researchers excavating at an archaeological site called Border Cave in the foothills of the Lebombo Mountains on the border of KwaZulu-Natal in South Africa and Swaziland. The cave shows evidence of occupation by human ancestors going back more than 200,000 years, but the team reported in two papers in the Proceedings of the National Academy of Sciences that they were able to accurately date their discoveries to 42,000 to 44,000 years ago, a period known as the Later Stone Age or the Upper Paleolithic Period in Europe.

Among the organic — and thus datable — artifacts the team found in the cave were ostrich eggshell beads, thin bone arrowhead points, wooden digging sticks, a gummy substance called pitch that was used to attach bone and stone blades to wooden shafts, a lump of beeswax likely used for the same purpose, worked pig tusks that were probably used for planing wood, and notched bones used for counting.

“They adorned themselves with ostrich egg and marine shell beads, and notched bones for notational purposes,” said paleoanthropologist Lucinda Backwell of the University of the Witwatersrand in South Africa, a member of the team. “They fashioned fine bone points for use as awls and poisoned arrowheads. One point is decorated with a spiral groove filled with red ochre, which closely parallels similar marks that San make to identify their arrowheads when hunting.”

The very thin bone points are “very good evidence” for the use of bows and arrows, said co-author Paola Villa, a curator at the University of Colorado Museum of Natural History. Some of the bone points were apparently coated with ricinoleic acid, a poison made from the castor bean. “Such bone points could have penetrated thick hides, but the lack of ‘knock-down’ power means the use of poison probably was a requirement for successful kills,” she said.

The discovery also represents the first time pitch-making has been documented in South Africa, Villa said. The process requires burning peeled bark in the absence of air. The Stone Age residents probably dug holes in the ground, inserted the bark, lit it on fire, and covered the holes with stones, she said.

Otávio Velho calls for questioning the Eurocentrism that marks Brazilian thought (Jornal da Ciência)

Clarissa Vasconcellos – JC e-mail 4550, July 30, 2012

He cites ideas from Tim Ingold, Aníbal Quijano and Ashis Nandy and discusses new trends seen from the perspective of anthropology, in a lecture delivered on the last day of the 64th Annual Meeting of the SBPC.

A lecture with the feel of a keynote address, given by anthropologist Otávio Velho, was one of the highlights of the last day of the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), which ended on Friday (27) in São Luís. A speaker who might be called “hors concours” (if there were any ranking among the roster of lecturers), Velho presented the session “Contradiction or complementarity: new trends in thought seen from the perspective of anthropology.”

Eurocentrism, decolonization, openness: these were some of the keywords the anthropologist used to question the social thought prevailing in the country, which still turns its back on what is happening in the social and scientific fields of Southern Hemisphere nations.

Velho began by stating that anthropology as practiced in Brazil suffers from “excessive schooling, a repetitive tendency and perhaps a lack of attention to fieldwork.” “We need to try to open up horizons; research has to be the core of the activity,” he argues.

Two trends – The lecture was structured around two trends from different lines of anthropology. The first is being rediscovered in the figure of Gregory Bateson, an anthropologist active between the 1930s and 1960s who pointed toward interdisciplinarity, even flirting with biology. Velho focused on one extension of Bateson’s line, established by Tim Ingold.

He explains that for Bateson the focus was the social agent: basically, individuals in communication and interaction. Ingold shifts that focus to the field as a whole, “the agent of life,” rather than to individuals. “This implies a critique of the individualist ideology that permeates our theoretical unconscious; the focus is the system as a whole. The idea of movement becomes important, and within it the great unity of life, understood in a more holistic and global sense,” he explains.

The second trend is the critique of Eurocentrism, which can operate on several planes. “We are more attached to these references than researchers in the first world themselves. This displacement of Eurocentrism works almost like a paradigm shift,” he says, proposing that these thinkers be reread and contextualized, and that we open ourselves to others. “We cite European and American authors and do not know Latin American production,” he notes, citing the Peruvian sociologist Aníbal Quijano.

Differences – He warns against overuse of the ideas of difference and diversity, emphases commonly employed in anthropology. And he recalls that the discipline “has its origins in European colonialism” and that difference, among other things, was used to “show that other peoples were incapable of making technological advances.”

“Because Latin America became independent some time ago, before the countries of Africa and Asia, colonialism seems distant to us, something our frameworks of difference do not contemplate,” he stresses, noting that even within Marxism a strong Eurocentrism can be found. “It almost suggests that the colonizer is an agent of progress,” he offers as an example.

Velho insists on treating the question as something that does not belong to the past, since it has ramifications today, and he again cites Quijano, who uses the concept of “coloniality” to refer to something that goes beyond the historical phenomenon and persists. “As happens in that certain mimicry of ours, the Eurocentrism of intellectuals,” he adds. Another example is the Eurocentric idea of dividing the world into peoples “with or without history.” He laments that in Brazil the study of the territory before 1500 is still in its infancy.

He notes, however, that some important moves are being made within Latin American anthropology, such as the Mercosur regional meetings. Even so, South-South exchange needs to be intensified. “Are we sending Ciências sem Fronteiras scholarship holders to India?” he asks, pointing to Eurocentric influence in scientific and technical development as well.

New axes – Velho points out that India is one of the places where the critique of Eurocentrism has advanced furthest, highlighting the name of Ashis Nandy. And he goes further, arguing that it is not healthy either to set the so-called “traditional populations,” a term often used to mark regional differences, apart from the global context.

“Difference is very important, but the emphasis should not be on conflict, which can be paralyzing for the movement,” he argues. And he stresses the need not to “hegemonize.” “The Indians speak of domination without hegemony, to mark the strength of those traditions that are not necessarily hegemonized by the colonizer,” he notes.

He calls attention to the idea of “accentuating new axes and new articulations” that “do not mean an exacerbated cultural relativism.” And he proposes building universes from new perspectives that “likewise do not claim to be absolute or dominant,” without excluding other possibilities. “There is another West. We have to be open to unexpected encounters,” he says.

Velho believes that the economic protagonism of countries like the BRICs, boosted by the crisis in Europe, will not immediately lead to a protagonism “of thought” as well. He warns of the risk that ideas will be “mimicked,” and says emerging countries must not yield to the temptation of becoming “new ethnocentrics.” And he cites Nandy, who says that anthropologism “is not the cure for ethnocentrism” but rather helps to “pluralize.”

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
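Muller’s curve-comparison step, fitting the observed record against each candidate driver and keeping the one with the smallest residual, can be sketched with ordinary least squares. This is a toy illustration on synthetic data, not the Berkeley Earth code; the record shapes, parameters and noise level below are invented for the example:

```python
import numpy as np

def fit_score(predictor, temperature):
    """Fit temperature ~ a*predictor + b by least squares; return the residual sum of squares."""
    X = np.column_stack([predictor, np.ones_like(predictor)])
    coeffs, *_ = np.linalg.lstsq(X, temperature, rcond=None)
    residuals = temperature - X @ coeffs
    return float(residuals @ residuals)

rng = np.random.default_rng(0)
t = np.arange(259.0)  # years 1753-2011, as an offset from 1753

# Hypothetical stand-ins for the candidate records (shapes invented for illustration):
log_co2 = np.log(280.0 + 0.002 * t**2)               # accelerating CO2 concentration
sunspots = 50.0 + 40.0 * np.sin(2 * np.pi * t / 11)  # ~11-year solar cycle
linear = t - t.mean()                                # a plain linear trend

# A temperature-like series driven, in this toy setup, by log CO2 plus noise.
temperature = 2.0 * (log_co2 - log_co2.mean()) + rng.normal(0.0, 0.05, t.size)

# Score each candidate; the smallest residual sum of squares is the best match.
scores = {name: fit_score(p, temperature)
          for name, p in [("log CO2", log_co2), ("sunspots", sunspots), ("linear", linear)]}
best = min(scores, key=scores.get)
print(best)  # the candidate with the smallest residual error
```

Here the CO2 curve wins because the series was built from it; the point is only the mechanics of scoring alternative explanations against the same record, the bar Muller describes for any competing hypothesis.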

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.

Can scientists in California end the war on climate change?
Study finds no grounds for climate sceptics’ concerns
Video: Berkeley Earth tracks climate change
Are climate sceptics more likely to be conspiracy theorists?

The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis is now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Climate Change and the Next U.S. Revolution (ZNet)

Thursday, July 26, 2012

The U.S. heat wave is slowly shaking the foundations of American politics. It may take years for the deep rumble to evolve into an above ground, institution-shattering earthquake, but U.S. society has changed for good.

The heat wave has helped convince tens of millions of Americans that climate change is real, overpowering the corporate-funded fake science and right-wing media campaign to convince them otherwise.

Republicans and Democrats alike also erect roadblocks to understanding climate change. The politicians’ complete lack of action on the issue strengthened the “climate change is fake” movement, since Americans presumed that any sane government would be actively trying to address an issue that had the potential to destroy civilization.

But working people have finally made up their mind. A recent poll showed that 70 percent of Americans now believe that climate change is real, up from 52 percent in 2010. And a growing number of people are recognizing that the warming of the planet is caused by human activity.

Business Week explains: “A record heat wave, drought and catastrophic wildfires are accomplishing what climate scientists could not: convincing a wide swath of Americans that global temperatures are rising.”

This means that working class families throughout the Midwest and southern states simply don’t believe what their media and politicians are telling them.

It also implies that these millions of Americans are being further politicized in a deeper sense.

Believing that climate change exists implies that you are somewhat aware about the massive consequences to humanity if the global economy doesn’t drastically change, and fast.

This awareness has revolutionary implications. As millions of Americans watch the environment being destroyed – for their grandchildren or for themselves – while politicians do absolutely nothing in response, or make tiny token gestures, a growing number of Americans will demand political alternatives, and fight to see them created. The American political system as it exists today cannot cope with this inevitability.

The New York Times explains why: “…the American political system is not ready to agree to a [climate] treaty that would force the United States, over time, to accept profound changes in its energy [coal, oil], transport [trucking and airline industry] and manufacturing [corporate] sectors.”

In short, the U.S. government will not force corporations to make less profit by behaving more eco-friendly. This is the essence of the problem.

In order for humanity to survive climate change, the economy must be radically transformed; massive investments must be made in renewable energy, public transportation, and recycling, while dirty energy sources must be quickly swept into the dustbin of history.

But the economy is currently owned by giant, privately run corporations that will continue destroying the earth as long as it earns them huge profits, and that make massive “contributions” to political parties to ensure this remains so. It’s becoming increasingly obvious that government inaction on climate change is directly linked to the “special interests” of corporations that dominate these governments.

This fact of U.S. politics is present in every other capitalist country as well, which means that international agreements on reducing greenhouse gasses will remain impossible: as each country’s corporations vie for market domination, reducing pollution simply puts them at a competitive disadvantage.

This dynamic has already caused massive delays in the UN’s already inadequate efforts at addressing climate change. The Kyoto climate agreement was the by-product of years of cooperation and planning between many nations that included legally binding agreements to reduce greenhouse gasses. The Bush and Obama administrations helped destroy these efforts.

For example, instead of building upon the foundation of the Kyoto Protocol, the Obama administration demanded a whole new structure, something that would take years to achieve. The Kyoto framework (itself insufficient) was abandoned because it included legally binding agreements and was based on multilateral, agreed-upon reductions of greenhouse gasses.

In an article by the Guardian entitled “US Planning to Weaken Copenhagen Climate Deal,” the Obama administration’s UN position is exposed, as it dismisses the Kyoto Protocol by proposing that “…each country set its own rules and to decide unilaterally how to meet its target.”

Obama’s proposal came straight from the mouth of U.S. corporations, who wanted to ensure that there was zero accountability, zero oversight, zero climate progress, and therefore no dent to their profits. Instead of using its massive international leverage for climate justice, the U.S. has used it to promote divisiveness and inaction, to the potential detriment of billions of people globally.

The stakes are too high to hold out any hope that governments will act boldly. The Business Week article below explains the profound changes happening to the climate:

“The average temperature for the U.S. during June was 71.2 degrees Fahrenheit (21.7 Celsius), which is 2 degrees higher than the average for the 20th century, according to the National Oceanic and Atmospheric Administration. The June temperatures made the preceding 12 months the warmest since record-keeping began in 1895, the government agency said.”

Activists who are radicalized by this global problem face a crisis of what to do about it. It is difficult to put forth a positive climate change demand, since the problem is global. Demanding that governments “act boldly” to address climate change hasn’t worked, and lesser demands seem inadequate.

The environmental rights movement continues to go through a variety of phases: individual and small group eco-“terrorism,” causing property damage to environmentally damaging companies; corporate campaigns that target especially bad polluters with high-profile direct action; and massive education programs that have been highly successful, but fall short when it comes to winning change.

Ultimately, climate activists must come face to face with political and corporate power. Corporate-owned governments are the ones with the power to adequately address the climate change issue, and they will not be swayed by good science, common sense, basic decency, or even a torched planet.

Those in power only respond to power, and the only power capable of displacing corporate power is when people unite and act collectively, as was done in Egypt, Tunisia, and is still developing throughout Europe.

Climate groups cannot view their issue as separate from other groups that are organizing against corporate power. The social movements that have emerged to battle austerity measures are natural allies, as are anti-war and labor activists. The climate solution will inevitably require revolutionary measures, which first requires that alliances and demands are put forward that unite Labor, working people in general, community, and student groups towards collective action.

One possible immediate demand is for environmental activists to unite with Labor groups over a federal jobs program, paid for by taxing the rich, that makes massive investments in jobs that are climate related, such as solar panel production, transportation, building recycling centers, home retro-fitting, etc.

Another demand could be to insist that the government convene the most knowledgeable scientists in the area of clean energy. These scientists should be given all the resources they need in order to collectively create alternative sources of clean energy that would allow for a realistic alternative to the current polluting and toxic sources of energy.

However, any type of immediate demand will meet giant corporate resistance from both political parties. Fighting for a uniting demand will thus strengthen the movement, and for this reason it is important to link climate solutions to the creation of jobs, which are the number one concern of most Americans. This unity will in turn lead allies toward a deeper understanding of the problem, and therefore deeper solutions will emerge that challenge the whole economic structure that is deaf to the needs of humans and the climate and sacrifices everything to the private profit of a few.

Shamus Cooke is a social service worker, trade unionist, and writer for Workers Action (www.workerscompass.org). He can be reached at shamuscooke@gmail.com

http://www.businessweek.com/news/2012-07-18/record-heat-wave-pushes-u-dot-s-dot-belief-in-climate-change-to-70-percent

http://www.nytimes.com/2009/12/13/weekinreview/13broder.html

http://www.guardian.co.uk/environment/2009/sep/15/europe-us-copenhagen

Computers Can Predict Effects of HIV Policies, Study Suggests (Science Daily)

ScienceDaily (July 27, 2012) — Policymakers in the fight against HIV/AIDS may have to wait years, even decades, to know whether strategic choices among possible interventions are effective. How can they make informed choices in an age of limited funding? A reliable, well-calibrated, predictive computer simulation would be a great help.

A visualization generated by an agent-based model of New York City’s HIV epidemic shows the risky interactions of unprotected sex or needle sharing among injection drug users (red), non-injection drug users (blue) and non-users (green). (Credit: Brandon Marshall/Brown University)

Policymakers struggling to stop the spread of HIV grapple with “what if” questions on the scale of millions of people and decades of time. They need a way to predict the impact of many potential interventions, alone or in combination. In two papers to be presented at the 2012 International AIDS Society Conference in Washington, D.C., Brandon Marshall, assistant professor of epidemiology at Brown University, will unveil a computer program calibrated to model accurately the spread of HIV in New York City over a decade and to make specific predictions about the future of the epidemic under various intervention scenarios.

“It reflects what’s seen in the real world,” said Marshall. “What we’re trying to do is identify the ideal combination of interventions to reduce HIV most dramatically in injection drug users.”

In an analysis that he’ll present on July 27, Marshall projects that with no change in New York City’s current programs, the infection rate among injection drug users will be 2.1 per 1,000 in 2040. Expanding HIV testing would drop the rate only 12 percent to 1.9 per 1,000; increasing drug treatment would reduce the rate 26 percent to 1.6 per 1,000; providing earlier delivery of antiretroviral therapy and better adherence would drop the rate 45 percent to 1.2 per 1,000; and expanding needle exchange programs would reduce the rate 34 percent to 1.4 per 1,000. Most importantly, doing all four of those things would cut the rate by more than 60 percent, to 0.8 per 1,000.

Virtual reality, real choices

The model is unique in that it creates a virtual reality of 150,000 “agents,” a programming term for simulated individuals, who in the case of the model, engage in drug use and sexual activity like real people.

Like characters in an all-too-serious video game, the agents behave in a world governed by biological rules, such as how often the virus can be transmitted through encounters such as unprotected gay sex or needle sharing.

With each run of the model, agents accumulate a detailed life history. For example, in one run, agent 89,425, who is male and has sex with men, could end up injecting drugs. He participates in needle exchanges, but according to the built-in probabilities, in year three he shares needles multiple times with another injection drug user with whom he is also having unprotected sex. In the last of those encounters, agent 89,425 becomes infected with HIV. In year four he starts participating in drug treatment and in year five he gets tested for HIV, starts antiretroviral treatment, and reduces the frequency with which he has unprotected sex. Because he always takes his HIV medications, he never transmits the virus further.

That level of individual detail allows for a detailed examination of transmission networks and how interventions affect them.
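The agent-based approach described above can be illustrated with a minimal sketch. Everything here is a hypothetical toy — the population size, probabilities, and behavioral rules are invented for illustration and bear no relation to the actual parameters of Marshall's model:

```python
import random

# Toy agent-based model: agents are dicts with an infection and
# treatment status; each simulated year, susceptible agents may have
# a risky contact (unprotected sex or needle sharing), and infected
# agents may start treatment, which sharply cuts onward transmission.
random.seed(42)

N_AGENTS = 1000
P_RISKY_CONTACT = 0.05         # chance of a risky contact per agent per year
P_TRANSMIT = 0.3               # transmission chance per risky contact with an infected partner
P_START_TREATMENT = 0.1        # chance an infected agent starts treatment per year
TREATED_TRANSMIT_FACTOR = 0.1  # treatment reduces transmission probability tenfold

agents = [{"infected": i < 30, "treated": False} for i in range(N_AGENTS)]

def step(agents):
    """Advance the simulated population by one year."""
    for a in agents:
        if not a["infected"] and random.random() < P_RISKY_CONTACT:
            partner = random.choice(agents)
            if partner["infected"]:
                p = P_TRANSMIT * (TREATED_TRANSMIT_FACTOR if partner["treated"] else 1.0)
                if random.random() < p:
                    a["infected"] = True
        elif a["infected"] and not a["treated"] and random.random() < P_START_TREATMENT:
            a["treated"] = True

for year in range(10):
    step(agents)

prevalence = sum(a["infected"] for a in agents) / N_AGENTS
print(f"prevalence after 10 years: {prevalence:.3f}")
```

An intervention scenario is then just a change to one of the probabilities (for example, raising the treatment uptake rate) followed by re-running the simulation many times and comparing outcome distributions — the same logic, at vastly larger scale and with calibrated parameters, that the real model applies.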

“With this model you can really look at the microconnections between people,” said Marshall, who began working on the model as a postdoctoral fellow at Columbia University and has continued to develop it since coming to Brown in January. “That’s something that we’re really excited about.”

To calibrate the model, Marshall and his colleagues found the best New York City data they could about how many people use drugs, what percentage of people were gay or lesbian, the probabilities of engaging in unprotected sex and needle sharing, viral transmission, access to treatment, treatment effectiveness, participation in drug treatment, progression from HIV infection to AIDS, and many more behavioral, social and medical factors. They also continuously calibrated it until the model could faithfully reproduce the infection rates among injection drug users that were known to occur in New York between 1992 and 2002.

And they don’t just run the simulation once. They run it thousands of times on a supercomputer at Brown to be sure the results they see are reliable.

Future applications

At Brown, Marshall is continuing to work on other aspects of the model, including an analysis of the cost effectiveness of each intervention and their combinations. Cost is, after all, another fact of life that policymakers and public health officials must weigh.

And then there’s the frustrating insight that the infection rate, even with four strengthened interventions underway, didn’t reduce the projected epidemic by much more than half.

“I actually expected something larger,” Marshall said. “That speaks to how hard we have to work to make sure that drug users can access and benefit from proven interventions to reduce the spread of HIV.”

Marshall’s collaborators on the model include Magdalena Paczkowski, Lars Seemann, Barbara Tempalski, Enrique Pouget, Sandro Galea, and Samuel Friedman.

The National Institutes of Health and the Lifespan/Tufts/Brown Center for AIDS Research provide financial support for the model’s continued development.

Climate Change Could Open Trade Opportunities for Some Vulnerable Nations (Science Daily)

ScienceDaily (July 26, 2012) — Tanzania is one developing country that could actually benefit from climate change by increasing exports of corn to the U.S. and other nations, according to a study by researchers at Stanford University, the World Bank and Purdue University.

The study, published in the Review of Development Economics, shows the African country better known for safaris and Mt. Kilimanjaro has the potential to substantially increase its maize exports and take advantage of higher commodity prices with a variety of trading partners due to predicted dry and hot weather that could affect those countries’ usual sources for the crop. In years that major consumer countries such as the U.S., China and India are forecast to experience severe dry conditions, Tanzania’s weather will likely be comparatively wet. Similarly, in the relatively few years this century that it is expected to have severe dry weather, Tanzania could import corn from trading partners experiencing better growing conditions.

“This study highlights how government policies can influence the impact that we experience from the climate system” said study co-author Noah Diffenbaugh, an assistant professor of environmental Earth system science at Stanford’s School of Earth Sciences and a center fellow at the Stanford Woods Institute for the Environment. “Tanzania is a particularly interesting case, as it has the potential to benefit from climate change if climate model predictions of decreasing drought in East Africa prove to be correct, and if trade policies are constructed to take advantage of those new opportunities.”

Tightening restrictions on crop exports during times of climate instability may seem like a logical way to ensure domestic food availability and price stability. In fact, the study warns, trade restrictions such as those Tanzania has instituted several times in recent years prevent the country from buffering its poor citizens in bad climate years and from taking advantage of economic opportunities in good ones.

The study, the most long-range and detailed of its kind to date, uses economic, climatic and agricultural data and computational models to forecast the occurrence of severe dry years during the next nine decades in Tanzania and its key trading partners. The authors began by analyzing historical years in which Tanzania experienced grain surpluses or deficits. They found that a closed trade policy enhanced poverty in both kinds of years, by limiting the ability to offset shortfalls with imports during deficit years and limiting the ability to profit from exports during surplus years.

The authors then attempted to predict how often Tanzania and key trading partners will experience severely dry years in response to continued global warming. Among the predictions: during an average of 96 percent of the years that the U.S. and China are predicted to have extremely dry conditions, Tanzania will not experience similarly dry weather. For India, that percentage increases to 97 percent. Similarly, the study’s climate models suggest that Tanzania is likely to have adequate growing season moisture in most of the years that its key African trading partners experience severe dry weather.

Among Tanzania’s trading partners, the U.S., China, Canada and Russia are most likely to consistently experience adequate growing conditions in years when Tanzania does not. When compared with all of its key trading partners, Tanzania’s dry years during the 21st century will often coincide with non-dry years in the other countries. Having a diverse mix of trading partners could help hedge against a coincidence of severe dry weather within and outside of Africa, the study’s results suggest.

The findings are relevant to grain-growing countries around the world. Those countries stand to profit from exports in years when trading partners are enduring severe dry and/or hot weather. Likewise, they can buffer themselves against bad growing weather at home by importing from grain-rich regions less affected by such weather in a given year.

“This study highlights the importance of trade in either buffering or exacerbating the effects of climate stresses on the poor,” says Diffenbaugh. “We find that these effects are already taking place in the current climate, and that they could become even more important in the future as the co-occurrence of good and bad years between different regions changes in response to global warming.”

Science and culture: what do they have in common? (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The question was the theme of the round table “Divulgação da Ciência e da Cultura” (Communicating Science and Culture), held at the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), which ends today (27) in São Luís.

For Ildeu de Castro Moreira, director of Popularization and Diffusion of Science and Technology at the Ministry of Science, Technology and Innovation (MCTI) and an SBPC councilor, the debate on the relationship between science and art is very important because they are two fundamental facets of human culture. “Science, art and culture share the creativity inherent to human beings,” he said. He explained that art and science are human and social activities grounded in creativity and curiosity.

A physicist and science communicator, Ildeu spoke about the “scientific imagination present in the minds of artists” and explained that science also has aesthetic concerns and bears similarities to art. For him, there is beauty in scientific theories. “Mathematical equations and physical formulas are beautiful. They may seem boring in the classroom, but with the help of an artist’s eye it is possible to show that beauty. We need to learn to see the beauty of science, just as we have to learn how to look at much of contemporary art,” he said.

For Ildeu, the connections between science and art are important for making science communication reach the public more easily. In his talk, he showed artistic works that speak of science, with examples from poetry, music, samba school themes, popular sayings and cordel literature.

Children’s audiences – In her presentation at the round table, Luisa Medeiros Massarani, a journalist and head of Fiocruz’s Museu da Vida in Rio de Janeiro, spoke about science communication initiatives aimed at children. “Experience has shown great receptiveness among children, greater than among adults and adolescents, mainly because of children’s curiosity; they are regarded as ‘natural scientists,’” she explained.

Luisa spoke about the growth of science museums in Brazil, which now number around 200, although they are still concentrated in a few regions. “Museums have incredible appeal for children and are also important for the communicator, who sees the child’s reaction on the spot,” she said. Although children make up a large share of museum audiences, Luisa said it is necessary to design spaces specifically for them, from smaller furniture to suitable interactive activities.

She argues that the child should be seen as an important social actor in the science communication process. “Communicating science to children is not about talking about science unilaterally; the child must be an important actor and protagonist in the process,” she explained, adding that the experience of a science fair or a museum visit stays in a child’s memory and can influence their education, as well as spark an interest in science.

The head of the Museu da Vida cited exhibitions, books and publications aimed at children, and stressed the importance of evaluating these experiences with the children afterwards, to know which path to follow.

Ildeu took the opportunity to suggest that artists take part more actively in SBPC meetings, not only in parallel events such as SBPC Cultural, but as members of panels and debates with scientists. The idea is to take advantage of the meeting’s audience, which reaches 15,000 to 20,000 people, to discuss this relationship.

(Jornal da Ciência)

A reading by anthropologists and sociologists on the future of the Amazon (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The weakening of multilateral international cooperation agencies is beginning to threaten conservation policies for the Legal Amazon. The claim comes from the president of the Nova Cartografia Social Program, Alfredo Wagner de Almeida, who delivered a lecture yesterday (the 26th) at the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), held at the Federal University of Maranhão (UFMA) in São Luís.

Under the theme "Traditional peoples and communities affected by military projects," the anthropologist warned of moves by seven states seeking to shrink the Legal Amazon through bills now before the legislature. Among them is Mato Grosso, which intends to withdraw its territory from the Legal Amazon, as does Rondônia, which wants to remove that designation from its lands in the region. Other states, such as Maranhão and Tocantins, want to strip the designation from all of their areas classified as Legal Amazon.

The region covers roughly 5,217,423 km², equivalent to about 61% of Brazilian territory. It was established to define the geographic boundaries of the political region eligible for tax incentives to promote regional development.

"This is a first attempt to shrink the Legal Amazon, since these states no longer enjoy the benefits granted by the international multilateral agencies," said Almeida, who is also an SBPC councillor and a professor at the Amazonas State University (UEA).

According to the researcher, international organizations had until then been sources of funding for Amazon protection programs, such as the Integrated Project for the Protection of Indigenous Populations and Lands of the Legal Amazon (PPTAL), dedicated to the demarcation of indigenous lands and funded chiefly by the German government, and the PPG7 (Pilot Program for the Protection of Brazil's Tropical Forests). These policies strengthened the creation of the Ministry of the Environment. "Without the support of the multilateral agencies, policies for the Amazon have shrunk," he said, without citing figures.

In the anthropologist's view, the decision of the states that want to leave the Legal Amazon means, for them, freeing up more of the land they consider productive, at the expense of forest conservation.

The anthropologist's statements are based on the dossier "Amazônia: sociedade, fronteiras e políticas," produced by Edna Maria Ramos de Castro, a sociologist at the Center for Advanced Amazonian Studies of the Federal University of Pará (UFPA) and an SBPC director, who moderated the lecture. The full document was recently published in Bahia's Caderno CRH.

Indigenous lands – In the assessment of the dossier's author, these states' legal measures threaten indigenous lands, whose peoples are protagonists of biodiversity conservation and depend on nature to survive. "These are legal provisions, clearly set out in the Constitution, but this practice can lead society into an impasse," she said. Edna cited the controversial Belo Monte hydroelectric project, which has become an icon of Brazilian society's resistance.

Paradigm shift – The anthropologist offered a reading of Brazil's current political-administrative model. He sees a shift from a politics "of protection" to an "idea of protectionism." "The distinction between protection and protectionism reveals, first of all, the weakening of the international multilateral agencies," he said. According to him, protectionism arises outside the sphere of protection.

From Alfredo Wagner's point of view, the signs of change chiefly reflect the disagreements at the World Trade Organization (WTO) meeting in Geneva in December 2011. On that occasion there were signs of a rupture in international agreements, until then framed in terms of a common market. One example is the shelving of the so-called Doha Round, owing to disagreement among the parties over agricultural subsidies granted by developed countries.

Expansion of the military sphere and infrastructure – The anthropologist recalls that at the height of the multilateral organizations, the security sector, that is, the military, was not funded because it was not part of a single-market policy. He observes, however, a shift beginning in 2009, when the model was displaced and problems with the military began to surface as a result of the revival of militarized-border projects. "From then on, a chapter of conflicts begins."

Retreat from international funds and regulatory bodies – According to him, what stands out most in the "idea of protectionism" is the identification of strategic natural resources, such as agricultural commodities and minerals, which, under the argument of sustainable development, can be harnessed to drive large infrastructure projects.

"Everything comes to be interpreted as a national interest. The idea of the bloc loses force, which may explain the tensions within Mercosur itself, when Venezuela is brought into the bloc at a moment of crisis. These national interests begin to organize themselves in a disciplined way without going through the multilateral entities," the anthropologist argues.

According to him, the Brazilian state's current actions bypass multilateral entities. One reflection of this is its distancing from the International Monetary Fund (IMF) and from two foreign legal regimes. One of these is the international human rights system of the OAS (Organization of American States). He recalls that Brazil stopped investing "in that court" from the moment the Belo Monte hydroelectric project was condemned by the body. "Brazil takes a unilateral position, similar to that of the Americans in the Gulf War," the anthropologist observes. "The idea of protectionism comes through very strongly."

Alfredo Wagner also sees signs of a retreat from Convention 169, which requires prior consultation with communities harmed by large infrastructure projects, for example. According to him, Brazil has been condemned for six violations involving military projects. One of them is the construction of the Alcântara Launch Center (CLA) on quilombola communities' land in Maranhão, carried out without environmental licensing and without consulting the "affected" communities.

He also warns of four worrying measures under way that provide for the emergency construction of hydroelectric dams. One example is Provisional Measure 558 of January 18, 2012, which provides for reducing protected areas and forest conservation units under the argument of development. According to him, Ibama approved in just five days a draft terms of reference from Eletronorte for the construction of a hydroelectric plant at São Luiz do Tapajós. In practice, what was approved was the work plan submitted to assess the project. "With the emergency pace of these works, rights seem to be placed on hold."

Unconstitutionality challenges – That Provisional Measure was challenged by the Attorney General's Office through an ADIN (Direct Action of Unconstitutionality). The Federal Public Prosecutor's Office held that conservation units in hydroelectric areas are essential to minimize the projects' environmental impacts, and argued that any discussion of reducing those forest areas must take place in the National Congress rather than by decree. "Brazil today lives under the empire of Provisional Measures, which prevent broad discussion by society. This suggests a kind of authoritarian capitalism," the anthropologist said.

Privatization of land in the Amazon – He also warns of the privatization of public lands in the Amazon under the "euphemism" of land-title regularization, via the Terra Legal program established by Law 11.952 of July 2009. Submitted by the Presidency of the Republic, the measure provides for privatizing 70 million hectares of public land, a considerable volume relative to Brazil's total of 850 million hectares, according to the anthropologist. Alfredo Wagner warns of the speed with which the measure allows titles to be granted to large properties, to the detriment of smallholders.

Initially, the measure was challenged by the Public Prosecutor's Office through an ADIN, on the grounds that it establishes "unjustifiable privileges" in favor of land-grabbers who benefited from public lands in the past, leading to land concentration. "This measure is as cruel as the Sarney Land Law of 1969," the anthropologist said.

Judicialization of the state – Seeking to calm the packed audience of students, researchers, scientists, and others, estimated at about 140 people, some of whom feared a return of the military dictatorship, the anthropologist said of the current model: "It is not the same as the military dictatorship," attributing it instead to a "judicialization of the state" and to "something strange."

On the occasion, the anthropologist borrowed a phrase from sociologists to describe a crisis: "The old has not yet died and the new has not yet been born. But a transformation is under way."

(Viviane Monteiro – Jornal da Ciência)

Listening to Tinnitus: Roles of Media When Hearing Breaks Down (Sounding Out!)

http://soundstudiesblog.com – 16 July 2012


Editor’s Note: Welcome to the third installment in our month-long exploration of listening in observation of World Listening Day on July 18, 2012.  For the full introduction to the series click here.  To peep the previous posts, click here. Otherwise, prepare yourself to listen carefully as Mack Hagood contemplates how sound studies scholars can help tinnitus sufferers (and vice versa).  –JSA

—-

 One January morning in 2006, Joel Styzens woke up and life sounded different. Superimposed over the quiet ambience of his Chicago apartment was a cluster of sounds: pure, high-pitched tones like those of a hearing test. Loud, steady, and constant, they weren’t going away.  He walked to the bathroom to wash his face. “As soon as I turned on the water on the faucet,” he told me in an interview, “the left ear was crackling… like, a speaker, you know, being overdriven.” Joel was 24 and a professional musician, someone who made his living through focused and detailed listening.

As days passed, he grew more fearful and depressed. For two months, he barely left the house. The air brakes of a city bus or a honking horn were painful and caused his heart to race. His sense of himself, his environment, and his identity as a musician were all undermined. This man who lived through his ears now faced the prospect of a life of tinnitus (ringing or other “phantom sounds”) and its frequent companion, hyperacusis (sound sensitivity sometimes accompanied by distortion). Joel could even identify the dominant pitch of his torment: it was A sharp.

We humanistic and qualitative sound scholars—particularly those of us focused on media and technology—can learn a lot from listening to tinnitus and the people who have it. Scholars of science and technology studies (STS) often utilize moments of technological breakdown to reveal the processes and mechanisms that constitute things we take for granted. Tinnitus and hyperacusis are, in the words of anthropologist Stefan Helmreich, “moments when hearing and listening break down” (629). Because sound scholars understand sound, hearing, and listening not only as the material effects of physics and physiology, but also as culturally and technologically emergent phenomena, we can potentially contribute much to the growing public conversation around tinnitus.

“Tinnitus” by Merrick Brown

And there is a lot at stake. Tinnitus affects 10-15% of adults and is the top service-related disability affecting U.S. veterans returning from Iraq and Afghanistan. Tinnitus and hyperacusis are also fairly common among musicians who work in loud performance and media production environments. It is perhaps ironic, then, that mediated sound and music are audiologists’ primary tools in helping people recover from these conditions.

My own study of tinnitus centers on its articulation with audio-spatial media—devices such as bedside sound machines, white noise generators, and noise-canceling headphones, all used to fabricate a desired sense of space through sound. People with tinnitus are among the most avid users of these devices, carefully mediating their aural-spatial relations as tinnitus becomes more evident in quiet spaces and hyperacusis flares up in noisy ones. During my fieldwork in audiology clinics and conferences, tinnitus support groups, and online forums, I observed that audio media were being deployed as medicine and technologies of self-care. Gradually, I came to the realization that the experience, discourse, and treatment of tinnitus is always bound up in mediation. In fact, I believe that tinnitus signals the highly mediated nature of our most intimate perceptions of sound and self. Below, I sketch just a few of the places I think aural media scholarship could go in conversation with tinnitus and hyperacusis.

The sound of media aftermath

Hearing experts do not consider subjective tinnitus to be a disease, but rather a condition in which individuals experience the normal, random neuronal firing of their auditory system as sound. Although it may be tied to various diseases and disorders, tinnitus itself is benign and does not inherently signal progressive hearing loss nor any other malignant condition.

Image by Flickr User Phil Edmonds

Nevertheless, research shows a frequent association between tinnitus and reduced auditory input, comparable to a sound engineer turning up the volume on a weak signal and thus amplifying the mixing board’s inherent noise. This “automatic gain control” theory neatly explains a classic 1953 study, in which 94 percent of “normal hearing” people experienced tinnitus in the dead silence of an anechoic chamber. Unfortunately, it also helps confirm the fear that the ringing heard after a night of loud music is due to hearing loss, known clinically as “temporary threshold shift.”

As Joel’s case suggests, when repeated, such threshold shifts lead to permanent damage. Audiologists increasingly see media-induced hearing loss and tinnitus as an epidemic, with ubiquitous earbuds often positioned as the main culprits. I have heard clinicians express dismay at encountering more young people with “old ears” in their offices, and youth education programs are beginning to proliferate. These apparent relations between aural pleasure and self-harm are an intriguing and socially significant area for sound and media scholarship, but they should also be considered within the context of moral panics that have historically accompanied the emergence of new media.

Objectifying phantom sound

For both clinicians and sufferers, one of the most frustrating and confounding aspects of tinnitus is how hard it is to objectify, either as a subject of research and treatment or as a condition worthy of empathy and activism. For both clinicians and sufferers, media are the primary tools for converting tinnitus into a manageable object.

Media marketed to protect musicians against Tinnitus, Image by Flickr User Jochen Wolters

Although media scholars haven’t yet studied it as such, the audiologist’s clinic is a center of media production and consumer electronics retail. Having audio production experience, I felt a sense of recognition on seeing the mixer-like audiometer in the control room of Joel’s audiologist, Jill Meltzer, separated by a pane of glass from the soundproofed booth where her patients sit. It was a studio where Meltzer recorded hearing rather than sound, as she attempted the tricky work of matching the pitch, volume, and sensitivity levels of tinnitus and hyperacusis. Since medication and surgery are not effective treatment options, the remedies for sale are media prosthetics and palliatives such as wearable sound generators, “fractal tone” hearing aids, Neuromonics, and sound machines that help distract, calm, and habituate patients to the ringing. Meltzer and other clinicians consistently told me that they have only two tinnitus tools at their disposal—counseling and sound.

Audiometer and testing booth, Image by the author

The subjectivity of tinnitus is most frustrating for sufferers, however, who often encounter impatience and misunderstanding from family, friends, bosses, and even their doctors. Again, media serve to externalize and objectify the sound. Joel did this through music: “A Sharp,” Styzens’ first post-tinnitus composition, represents tinnitus with chordal dissonance and hyperacusis with a powerful change of dynamics on a guitar. He eventually recorded an entire album that explored his condition and raised awareness.


Other individuals, in an attempt to communicate the aural experience that drives their sleeplessness, depression, anxiety, or lack of concentration, create YouTube videos designed to recreate the subjective experience of tinnitus.


The American Tinnitus Association, an advocacy group, has used broadcast and social media to raise awareness and research funding, as we see in this PSA from 1985.


However, such dramatic uses of media may be in some ways too powerful. In fact, “raising awareness of tinnitus” might be as bad as it literally sounds.

Communicable dis-ease

In the process of externalizing their experience for others to hear, people with tinnitus can make their own perception of the sound grow stronger. They may also generate anxiety in others, encouraging them to notice and problematize their own, previously benign tinnitus.

Neuroscientist Pawel Jastreboff’s groundbreaking and influential neurophysiological model of tinnitus postulates that tinnitus becomes bothersome only when the auditory cortex forms networks with other areas in the brain, resulting in a vicious circle of increasing perception and fear. The implication of this model, now substantiated by clinical research, is that the way people think about tinnitus is a much greater predictor of suffering than the perceived volume of the sound. As Jastreboff told me in an interview, “Incorrect information can induce bothersome tinnitus.” Information, of course, circulates through media. It may be productive, then, to think of tinnitus suffering as a communicable dis-ease, one strengthened in circulation through networks of neurons, discourse, and media.

I think there is both a need and an opportunity in tinnitus for an applied sound studies, one that intervenes in this mediated public discourse, works against moral panic and hyperawareness, and suggests the quieting possibilities that open up when we grasp the constructed nature of our aurality. Listening to tinnitus as a networked coproduction highlights the ways in which our most subjective aural perceptions are also social, cultural, and mediated—perhaps the fundamental insight of sound studies. My hope is that by listening to tinnitus we can speak to it as well.

__

*Featured Image Credit: A representation of Tinnitus by Flickr User Jason Rogers, called “Day 642/365–Myself is against me”

 __
Mack Hagood is a doctoral candidate at Indiana University’s Department of Communication and Culture, where he does ethnographic research in digital media, sound studies, and popular music. He has taught courses on sound cultures, global media, ethnographic methods, and audio production. He and his students won the Indiana Society of Professional Journalists’ 2012 Best Radio Use of Sound award for their documentary series “I-69: Sounds and Stories in the Path of a Superhighway.” His publications include studies of indie rock in Taiwan (Folklore Forum) and the use of noise-canceling headphones in air travel (American Quarterly). He recently completed an article on combat Foley in Fight Club and is now finishing his dissertation, titled “Sonic Technologies of the Self: Mediating Sound, Space, Self, and Sociality.” He hears crickets even in the dead of winter.

Stop bullying the ‘soft’ sciences (L.A.Times)

OP-ED

The social sciences are just that — sciences.

By Timothy D. Wilson

July 12, 2012

A student is seen at the UC Irvine archive doing research for her sociology dissertation. (Los Angeles Times / July 9, 2009)

Once, during a meeting at my university, a biologist mentioned that he was the only faculty member present from a science department. When I corrected him, noting that I was from the Department of Psychology, he waved his hand dismissively, as if I were a Little Leaguer telling a member of the New York Yankees that I too played baseball.

There has long been snobbery in the sciences, with the “hard” ones (physics, chemistry, biology) considering themselves to be more legitimate than the “soft” ones (psychology, sociology). It is thus no surprise that many members of the general public feel the same way. But of late, skepticism about the rigors of social science has reached absurd heights.

The U.S. House of Representatives recently voted to eliminate funding for political science research through the National Science Foundation. In the wake of that action, an opinion writer for the Washington Post suggested that the House didn’t go far enough. The NSF should not fund any research in the social sciences, wrote Charles Lane, because “unlike hypotheses in the hard sciences, hypotheses about society usually can’t be proven or disproven by experimentation.”

Lane’s comments echoed ones by Gary Gutting in the Opinionator blog of the New York Times. “While the physical sciences produce many detailed and precise predictions,” wrote Gutting, “the social sciences do not. The reason is that such predictions almost always require randomized controlled experiments, which are seldom possible when people are involved.”

This is news to me and the many other social scientists who have spent their careers doing carefully controlled experiments on human behavior, inside and outside the laboratory. What makes the criticism so galling is that those who voice it, or members of their families, have undoubtedly benefited from research in the disciplines they dismiss.

Most of us know someone who has suffered from depression and sought psychotherapy. He or she probably benefited from therapies such as cognitive behavioral therapy that have been shown to work in randomized clinical trials.

Problems such as child abuse and teenage pregnancy take a huge toll on society. Interventions developed by research psychologists, tested with the experimental method, have been found to lower the incidence of child abuse and reduce the rate of teenage pregnancies.

Ever hear of stereotype threat? It is the double jeopardy that people face when they are at risk of confirming a negative stereotype of their group. When African American students take a difficult test, for example, they are concerned not only about how well they will do but also about the possibility that performing poorly will reflect badly on their entire group. This added worry has been shown time and again, in carefully controlled experiments, to lower academic performance. But fortunately, experiments have also shown promising ways to reduce this threat. One intervention, for example, conducted in a middle school, reduced the achievement gap by 40%.

If you know someone who was unlucky enough to be arrested for a crime he didn’t commit, he may have benefited from social psychological experiments that have resulted in fairer lineups and interrogations, making it less likely that innocent people are convicted.

An often-overlooked advantage of the experimental method is that it can demonstrate what doesn’t work. Consider three popular programs that research psychologists have debunked: Critical Incident Stress Debriefing, used to prevent post-traumatic stress disorders in first responders and others who have witnessed horrific events; the D.A.R.E. anti-drug program, used in many schools throughout America; and Scared Straight programs designed to prevent at-risk teens from engaging in criminal behavior.

All three of these programs have been shown, with well-designed experimental studies, to be ineffective or, in some cases, to make matters worse. And as a result, the programs have become less popular or have changed their methods. By discovering what doesn’t work, social scientists have saved the public billions of dollars.

To be fair to the critics, social scientists have not always taken advantage of the experimental method as much as they could. Too often, for example, educational programs have been implemented widely without being adequately tested. But increasingly, educational researchers are employing better methodologies. For example, in a recent study, researchers randomly assigned teachers to a program called My Teaching Partner, which is designed to improve teaching skills, or to a control group. Students taught by the teachers who participated in the program did significantly better on achievement tests than did students taught by teachers in the control group.

Are the social sciences perfect? Of course not. Human behavior is complex, and it is not possible to conduct experiments to test all aspects of what people do or why. There are entire disciplines devoted to the experimental study of human behavior, however, in tightly controlled, ethically acceptable ways. Many people benefit from the results, including those who, in their ignorance, believe that science is limited to the study of molecules.

Timothy D. Wilson is a professor of psychology at the University of Virginia and the author of “Redirect: The Surprising New Science of Psychological Change.”

Scientific particles collide with social media to benefit of all (Irish Times)

The Irish Times – Thursday, July 12, 2012

Large Hadron Collider at Cern: the research body now has 590,000 followers on Twitter

MARIE BORAN

IN 2008 CERN switched on the Large Hadron Collider (LHC) in Geneva – around the same time it sent out its first tweet. Although the first outing of the LHC didn’t go according to plan, the Twitter account gained 10,000 followers within the first day, according to James Gillies, head of communications at Cern.

Speaking at the Euroscience Open Forum in Dublin this week, Gillies explained the role social media plays in engaging the public with the particle physics research its laboratory does. The Twitter account now has 590,000 followers and Cern broke important news via it in March 2010 by joyously declaring: “Experiment have seen collisions.”

“Why do we communicate at Cern? If you talk to the scientists who work there they will tell you it’s a good thing to do and they all want to do it,” Gillies said, adding that Cern is publicly funded so engaging with the people who pay the bills is important.

When the existence of the Higgs particle was announced last week, it wasn’t an exclusive press event. Live video was streamed across the web, questions were taken not only from journalists but also from Twitter followers, and Cern used this as a chance to announce jobs via Facebook.

While Cern appears to be the social media darling of the science world, other research institutes and scientists are still weighing up the pros and cons of platforms like Facebook, Twitter or YouTube.

There is a certain stigma attached to social networking sites, not just because much of the content is perceived as banal, but also because too much tweeting could be damaging to your image as a scientist.

Bora Zivkovic is blogs editor at Scientific American, organiser of the fast-growing science conference ScienceOnline and speaker at the social media panel this Saturday at the Euroscience Open Forum. He says the adoption of social media by scientists is slow but growing.

“Academics are quite risk-averse and are shy about trying new things that have a perceived potential to remove the edge they may have in the academic hierarchy, either through lost time or lost reputation.”

Zivkovic talks about fear of the “Sagan effect”, named after the late Carl Sagan. A talented astronomer and astrophysicist, he was loved by the public but snubbed by the science community.

“Many still see social media as self-promotion, which is still in some scientific circles viewed as a negative thing to do. The situation is reminiscent of the very slow adoption of email by researchers back in the early 1990s.

“Once the scientists figure out how to include social media in their daily workflow, realise it does not take away from their time but actually makes them more effective in reaching their academic goals, and realise that the ‘Sagan effect’ on reputation is a thing of the past, they will readily incorporate social media into their normal work.”

Many researchers still rely heavily on specialist mailing lists. The broadcast capability on social media is far greater and bespoke, claims Dr Matthew Rowe, research associate at the Knowledge Media Institute with the Open University.

“If I was to email people about some recent work I would presume that it would be marked as spam. However, if I was to announce the release of some work through social media, then a debate and conversation could evolve surrounding the topic; I have seen this happen many times on Facebook.”

Conversations on social media sites are often seen as trivial – for scientists, the end goal is “publish or perish”. Results must be published in a reputable academic journal and preferably cited by those in their area.

Twitter, it seems, can help. A 2011 paper from researcher Gunther Eysenbach found a correlation between Twitter activity and highly cited articles. The microblogging site may help citation rate or serve as a measure of how “citable” your paper may be.

In addition, a 2010 survey on Twitter found one-third of academics said they use it for sharing information with peers, communicating with students or as a real-time news source.

For some the argument for social media is the potential for connecting with volunteers and providing valuable data from the citizen scientist. Yolanda Melero Cavero’s MinkApp has connected locals with an effort to control the mink population in Scotland.

“The most interesting thing about MinkApp, for me, was the fact that the scientist was able to get 600 volunteers for her ecological study. Social media has the grassroots potential to engage with willing volunteers,” says Nancy Salmon, researcher at the department of occupational therapy at the University of Limerick.

Rowe gives some sage social media advice for academics: keep on topic and keep your language jargon-free.

But there’s always room for humour, as demonstrated by the Higgs boson jokes on Twitter and Facebook last week. As astronomer Phil Plait tweeted: “I’ve got 99.9999% problems, but a Higgs ain’t one.”

Bryan Fischer Blames ‘Liberals’ Way’ For Aurora Mass Shooting (The Huffington Post)

The Huffington Post  |  By Meredith Bennett-Smith Posted: 07/24/2012 2:51 pm Updated: 07/24/2012 9:12 pm


Pundits across the political spectrum have been quick to use the weekend’s tragic mass shooting at an Aurora, Colo., movie theater as a means of pushing various threads of partisan rhetoric.

Bryan Fischer, the oft-quoted mouthpiece of the American Family Association, was quick to jump on the bandwagon, tying the mass shooting first to a general breakdown in Judeo-Christian values, and most recently to the public school system’s teaching of evolution.

The Raw Story published comments made Monday by Fischer, the director of issues analysis for the fundamentalist Christian organization, during his daily radio show, “Focal Point.” In an impressive feat of extrapolation, Fischer linked the massacre to “the liberals’ way” of teaching the theory of evolution and preventing prayer in schools.

Fischer wondered aloud if bestselling author and California megachurch evangelical Rev. Rick Warren was referring to the alleged shooter, James Holmes, when he tweeted, “When students are taught they are no different from animals, they act like it.”

“If this tweet was connected to the shooting, to this James Holmes, to the one that killed the 12 and wounded the 58 in this theater, it would be appropriate,” Fischer said.

Fischer went on to blame Holmes’ murderous tendencies on Charles Darwin’s principle of survival of the fittest.

“[Holmes] sees himself as evolutionarily advanced just like he was taught in school about Darwin, that this is how natural selection works,” Fischer said.

Fischer then moved on to also blame the killings on the end of organized prayer in schools. The Supreme Court prohibited state-sponsored prayer in schools in two landmark cases in the early 1960s: Engel v. Vitale in 1962 and Abington School District v. Schempp one year later.

“We have spent 60 years telling God to get lost,” Fischer said. “What if every single day in [James Holmes’] educational process, there had been readings from the word of God … Who knows if things could have been different. But we’ve tried it the other way. The point of my column, we’ve tried it the liberals’ way for 60 years now. What do we got? We have massacres in Aurora.”

Fischer did not mention the fact that James Holmes’ family belonged to the Penasquitos Lutheran Church for about ten years, as originally reported by the Associated Press. Holmes’ mother still attends services there regularly.

The American Family Association is no stranger to controversy. In comments made during a segment of the AFA Journal program on Friday and reported by Right Wing Watch, AFA news director Fred Jackson, co-host Teddy James and guest Jerry Newcombe of the Truth in Action Ministries suggested that violent incidents in America, including in Aurora, were evidence of God’s judgment.

“The AFA Journal has been dealing with denominations that no longer believe in the God of the Bible,” Jackson said. “They no longer believe that Jesus is the only way of salvation, they teach that God is OK with homosexuality, this is just increasing more and more. It is mankind shaking its fist at the authority of God.”

“And God will not be silent when he’s mocked, and we need to remember that,” James said, to which Jackson replied, “We are seeing his judgment. You know, some people talk about ‘God’s judgment must be just around the corner,’ we are seeing it.”