Tag archive: Visualidade

Neil Armstrong Carried Argentine Soccer Team Pennant to the Moon (Fox News)

Published August 26, 2012


New York City welcomes the Apollo 11 crew with a shower of ticker tape down Broadway and Park Avenue on August 13, 1969, in a parade termed the largest in the city’s history. Pictured in the lead car, from the right, are astronauts Neil A. Armstrong, commander; Michael Collins, command module pilot; and Edwin E. Aldrin Jr., lunar module pilot. (Photo by NASA/Newsmakers)

Astronaut Neil Armstrong, who died this weekend, carried a pennant belonging to the Argentine soccer club Independiente de Avellaneda on his history-making 1969 trip to the Moon.

Armstrong, who passed away on Saturday at the age of 82 due to complications from recent heart surgery, confirmed during a November 1969 trip to Buenos Aires that he carried the souvenir to the Moon.

The first man to walk on the Moon visited Argentina’s capital along with Apollo 11 crewmates Edwin “Buzz” Aldrin and Michael Collins as part of a global tour organized by NASA.

Armstrong landed on the Moon with Aldrin on July 20, 1969, in the lunar module Eagle while Collins circled overhead aboard the command module Columbia.

The space pioneer said he had carried the pennant to the Moon, confirming statements by team officials that had been called into question by the public in Argentina at the time.

Hector Rodriguez, who served as Independiente’s public affairs chief at the time, proposed making Armstrong, Aldrin and Collins honorary members of the club before Apollo 11’s voyage to the Moon.

“If they are going to be the greatest heroes of the century, they have to be Independiente members,” Rodriguez said at the time.

Team management agreed, and the three astronauts were registered as members, with Aldrin as No. 80,399, Armstrong as No. 80,400 and Collins as No. 80,401.

Identification cards bearing photos provided by the U.S. Embassy in Buenos Aires were sent to the United States along with club pennants and gear for the astronauts’ children.

Armstrong thanked the team for its gesture in a May 1969 letter and said he wished to “be able to visit Buenos Aires soon and that circumstances will allow me to accept your invitation to visit the club,” an event that never took place.

Rodriguez was invited to a reception held for the astronauts in Buenos Aires by U.S. Ambassador to Argentina John Davis Lodge.

Armstrong said during the reception that the Independiente pennant brought the astronauts good luck on the trip to the Moon.

The story remains a point of pride for members and fans of Independiente, which has won a record seven Copa Libertadores titles.

The team, however, is currently struggling and could be relegated from Argentina’s First Division for the first time in its history.

Intriguing Habitats, and Careful Discussions of Climate Change (N.Y. Times)

THE ANIMAL LIFEBOAT

Gretchen Ertl for The New York Times. Pacific sea nettle jellyfish at the New England Aquarium in Boston. Zoos and aquariums are working to include educational elements about the environment without alienating visitors.

Published: August 26, 2012

BOSTON — Sitting on an artificial mangrove island in the middle of the ray and shark “touch tank,” Lindsay Jordan, a staff member at the New England Aquarium, explained the rays’ eating habits as children and their parents trailed fingers through the water. “Does anyone know how we touch these animals when we are not at the aquarium?” she asked.

The children’s faces turned up expectantly.

“The ocean absorbs one-third of the world’s carbon dioxide emissions,” Ms. Jordan said, explaining that the resulting acidification upsets the food chain. “When you turn on your car, it affects them.”

Downstairs, next to the jellyfish tanks, a rhyming video told how jellyfish populations are exploding in the wild because the animals thrive in warmer waters. In the main room, a staff member pointed to a rare blue lobster, saying that some lobsters have been scuttling out of Massachusetts and settling in cooler climes to the north.

With many zoos and aquariums now working with conservation organizations and financed by individuals who feel strongly about threatened habitats and species, managers have been wrestling with how aggressive to be in educating visitors on the perils of climate change.

Surveys show that American zoos and aquariums enjoy a high level of public trust and are ideally positioned to teach.

Yet many managers are fearful of alienating visitors — and denting ticket sales — with tours or wall labels that dwell bleakly on damaged coral reefs, melting ice caps or dying trees.

“You don’t want them walking away saying, ‘I paid to get in, I bought my kid a hot dog, I just want to show my kid a fish — and you are making me feel bad about climate change,’ ” said Paul Boyle, the senior vice president for conservation and education at the Association of Zoos and Aquariums.

Some zoos and aquariums have therefore held back, relegating the theme to, say, a sign about Arctic melting in the polar bear exhibit. But many have headed in the other direction, putting climate change front and center in a way that they hope will inspire a young generation of zoogoers.

Working with cognitive scientists and experts in linguistics and anthropology, a coalition of aquariums set out in 2008 to develop a patter that would intrigue rather than daunt or depress the average visitor. Once it was satisfied with the script, the group secured a grant of about $1 million last year from the National Science Foundation to train staffs across the nation. This month, the foundation awarded the group an additional $5.5 million for a five-year education effort.

Dr. Boyle said that most of the association’s 224 members now have some sort of climate message.

The form varies from subtle to pointed. The zoos in Cincinnati and Toledo, Ohio, for instance, have installed prominent solar arrays over their parking lots to power exhibits and set an example. The San Diego Zoo and the Brookfield Zoo near Chicago have made their exhibits of polar bears and other Arctic species more direct about the threats posed by global warming.

So far the feedback has largely been positive, officials at most zoos say.

Ariella Camera, a counselor with a summer program run by Boston Rising, an antipoverty group, said some of her charges recently took part in a game at the New England Aquarium that taught them what emits carbon dioxide (many factories, most cars) and what absorbs it (trees and the ocean). They were then challenged to balance the two.

Afterward the students struck up a lively conversation about their carbon footprints, Ms. Camera said. “It was a very engaging presentation,” she said.

Such anecdotes gratify Howard Ris, the aquarium’s president. “We would like as many people, if not everyone, to leave encouraged to take action,” he said.

Others are dubious that it will work. “Zoos have been making claims about their educational value for 150 years,” said Jeffrey Hyson, a cultural historian and the director of the American studies program at St. Joseph’s University in Philadelphia. The zoos “say a lot more about what they think they are doing than they can really demonstrate.”

Zoo managers acknowledge that they initially struggled with the challenge of delivering bad news.

In the 1980s and ’90s, Dr. Boyle noted, some zoos and aquariums made a big push to emphasize threats like the depletion of the earth’s ozone layer, the razing of rain forests by loggers and farmers, and the overfishing of the Pacific. Electronic boards toted up the number of acres being cleared, and enlarged photographs depicted denuded landscapes.

Surveys of visitors showed a backlash. “For lots of reasons, the institutions tended to approach the issues by talking about the huge scale of the problems,” Dr. Boyle said. “They wanted to attract people’s attention, but what we saw happening over time was that everyday people were overwhelmed.” It did not help that a partisan split had opened in the United States over whether global warming was under way, and whether human activity was the leading cause.

At the Georgia Aquarium in Atlanta, Brian Davis, the vice president for education and training, says that to this day his institution ensures that guests will not hear the term “global warming.” Visitors are “very conservative,” he said. “When they hear certain terms, our guests shut down. We’ve seen it happen.”

Such hesitancy inspired the group of leading aquariums to develop, test and refine their model, which comes off as casual and chatty.

Word choices matter, research showed. The FrameWorks Institute, a nonprofit organization that studies how people process abstract concepts, found the phrase “greenhouse gas effect” perplexed people. “They think it is a nice place for plants to grow,” said FrameWorks’ president, Susan Bales. So her group advised substituting “heat-trapping blanket” to describe the accumulation of gases in the atmosphere.

Today’s guides also make a point of encouraging groups to focus first on the animals, leaving any unpleasant message for later.

At the New England Aquarium’s giant reef tank, visitors peered over the side and watched sand tiger sharks, sea turtles and tropical fish swim around a giant coral reef. As a diver entered the tank to feed the fish, a guide explained that the smaller ones tend to hide in coral for safety.

A few minutes passed before she told the crowd that corals around the world are bleaching and dying because of a pronounced rise in ocean temperature and acidity.

Upon leaving, the visitors were briefed on positive steps they could take, like using public transportation or bikes and being cautious about energy consumption.

Yet sometimes, the zoo animals are so entrancing that a climate-related message may fall on deaf ears.

Leanne Gaffney, who recently brought four high school students from a summer enrichment program to the New England Aquarium, said they were fascinated by creatures like leafy sea dragons and tropical snakes, but not so much by how their habitats were faring.

“They are teenage boys,” she said. “Mostly they just wanted to see the anacondas.”

Climate Science as Culture War (Stanford Social Innovation Review)

ENVIRONMENT

The public debate around climate change is no longer about science—it’s about values, culture, and ideology.

By Andrew J. Hoffman | Fall 2012

South Florida Earth First members protest outside the Platts Coal Properties and Investment Conference in West Palm Beach. (Photo by Bruce R. Bennett/Zuma Press/Newscom)

In May 2009, a development officer at the University of Michigan asked me to meet with a potential donor—a former football player and now successful businessman who had an interest in environmental issues and business, my interdisciplinary area of expertise. The meeting began at 7 a.m., and while I was still nursing my first cup of coffee, the potential donor began the conversation with “I think the scientific review process is corrupt.” I asked what he thought of a university based on that system, and he said that he thought that the university was then corrupt, too. He went on to describe the science of climate change as a hoax, using all the familiar lines of attack—sunspots and solar flares, the unscientific and politically flawed consensus model, and the environmental benefits of carbon dioxide.

As we debated each point, he turned his attack on me, asking why I hated capitalism and why I wanted to destroy the economy by teaching environmental issues in a business school. Eventually, he asked if I knew why Earth Day was on April 22. I sighed as he explained, “Because it is Karl Marx’s birthday.” (I suspect he meant to say Vladimir Lenin, whose birthday is April 22, also Earth Day. This linkage has been made by some on the far right who believe that Earth Day is a communist plot, even though Lenin never promoted environmentalism and communism does not have a strong environmental legacy.)

I turned to the development officer and asked, “What’s our agenda here this morning?” The donor interrupted to say that he wanted to buy me a ticket to the Heartland Institute’s Fourth Annual Conference on Climate Change, the leading climate skeptics conference. I checked my calendar and, citing prior commitments, politely declined. The meeting soon ended.

I spent the morning trying to make sense of the encounter. At first, all I could see was a bait and switch; the donor had no interest in funding research in business and the environment, but instead wanted to criticize the effort. I dismissed him as an irrational zealot, but the meeting lingered in my mind. The more I thought about it, the more I began to see that he was speaking from a coherent and consistent worldview—one I did not agree with, but which was a coherent viewpoint nonetheless. Plus, he had come to evangelize me. The more I thought about it, the more I became eager to learn about where he was coming from, where I was coming from, and why our two worldviews clashed so strongly in the present social debate over climate science. Ironically, in his desire to challenge my research, he stimulated a new research stream, one that fit perfectly with my broader research agenda on social, institutional, and cultural change.

Scientific vs. Social Consensus

Today, there is no doubt that a scientific consensus exists on the issue of climate change. Scientists have documented that anthropogenic sources of greenhouse gases are leading to a buildup in the atmosphere, which leads to a general warming of the global climate and an alteration in the statistical distribution of localized weather patterns over long periods of time. This assessment is endorsed by a large body of scientific agencies—including every one of the national scientific agencies of the G8 + 5 countries—and by the vast majority of climatologists. The majority of research articles published in refereed scientific journals also support this scientific assessment. Both the US National Academy of Sciences and the American Association for the Advancement of Science use the word “consensus” when describing the state of climate science.

And yet a social consensus on climate change does not exist. Surveys show that the American public’s belief in the science of climate change has mostly declined over the past five years, with large percentages of the population remaining skeptical of the science. Belief declined from 71 percent to 57 percent between April 2008 and October 2009, according to an October 2009 Pew Research Center poll; more recently, belief rose to 62 percent, according to a February 2012 report by the National Survey of American Public Opinion on Climate Change. Such a significant number of dissenters tells us that we do not have a set of socially accepted beliefs on climate change—beliefs that emerge, not from individual preferences, but from societal norms; beliefs that represent those on the political left, right, and center as well as those whose cultural identifications are urban, rural, religious, agnostic, young, old, ethnic, or racial.

Why is this so? Why do such large numbers of Americans reject the consensus of the scientific community? With upwards of two-thirds of Americans not clearly understanding science or the scientific process and fewer able to pass even a basic scientific literacy test, according to a 2009 California Academy of Sciences survey, we are left to wonder: How do people interpret and validate the opinions of the scientific community? The answers to this question can be found, not from the physical sciences, but from the social science disciplines of psychology, sociology, anthropology, and others.

To understand the processes by which a social consensus can emerge on climate change, we must understand that people’s opinions on this and other complex scientific issues are based on their prior ideological preferences, personal experience, and values—all of which are heavily influenced by their referent groups and their individual psychology. Physical scientists may set the parameters for understanding the technical aspects of the climate debate, but they do not have the final word on whether society accepts or even understands their conclusions. The constituency that is relevant in the social debate goes beyond scientific experts. And the processes by which this constituency understands and assesses the science of climate change go far beyond its technical merits. We must acknowledge that the debate over climate change, like almost all environmental issues, is a debate over culture, worldviews, and ideology.

This fact can be seen most vividly in the growing partisan divide over the issue. Political affiliation, not scientific knowledge, is one of the strongest correlates of individual uncertainty about climate change.1 The percentage of conservatives and Republicans who believe that the effects of global warming have already begun declined from roughly 50 percent in 2001 to about 30 percent in 2010, while the corresponding percentage for liberals and Democrats increased from roughly 60 percent in 2001 to about 70 percent in 2010.2 (See “The Growing Partisan Divide over Climate Change,” below.)


Climate change has become enmeshed in the so-called culture wars. Acceptance of the scientific consensus is now seen as an alignment with liberal views consistent with other “cultural” issues that divide the country (abortion, gun control, health care, and evolution). This partisan divide on climate change was not the case in the 1990s. It is a recent phenomenon, following in the wake of the 1997 Kyoto Treaty that threatened the material interests of powerful economic and political interests, particularly members of the fossil fuel industry.3 The great danger of a protracted partisan divide is that the debate will take the form of what I call a “logic schism,” a breakdown in debate in which opposing sides are talking about completely different cultural issues.4

This article seeks to delve into the climate change debate through the lens of the social sciences. I take this approach not because the physical sciences have become less relevant, but because we need to understand the social and psychological processes by which people receive and understand the science of global warming. I explain the cultural dimensions of the climate debate as it is currently configured, outline three possible paths by which the debate can progress, and describe specific techniques that can drive that debate toward broader consensus. This goal is imperative, for without a broader consensus on climate change in the United States, Americans and people around the globe will be unable to formulate effective social, political, and economic solutions to the changing circumstances of our planet.

Cultural Processing of Climate Science

When analyzing complex scientific information, people are “boundedly rational,” to use Nobel Memorial Prize economist Herbert Simon’s phrase; we are “cognitive misers,” according to Princeton University psychologist Susan Fiske and UCLA psychologist Shelley Taylor, with limited cognitive ability to fully investigate every issue we face. People everywhere employ ideological filters that reflect their identity, worldview, and belief systems. These filters are strongly influenced by group values, and we generally endorse the position that most directly reinforces the connection we have with others in our referent group—what Yale Law School professor Dan Kahan refers to as “cultural cognition.” In so doing, we cement our connection with our cultural groups and strengthen our definition of self. This tendency is driven by an innate desire to maintain consistency in beliefs by giving greater weight to evidence and arguments that support preexisting beliefs, and by expending disproportionate energy trying to refute views or arguments that are contrary to those beliefs. Instead of investigating a complex issue, we often simply learn what our referent group believes and seek to integrate those beliefs with our own views.

Over time, these ideological filters become increasingly stable and resistant to change through multiple reinforcing mechanisms. First, we’ll consider evidence when it is accepted or, ideally, presented by a knowledgeable source from our cultural community; and we’ll dismiss information that is advocated by sources that represent groups whose values we reject. Second, we will selectively choose information sources that support our ideological position. For example, frequent viewers of Fox News are more likely to say that the Earth’s temperature has not been rising, that any temperature increase is not due to human activities, and that addressing climate change would have deleterious effects on the economy.5 One might expect the converse to be true of National Public Radio listeners. These cultural-processing and group-cohesion dynamics lead to two overriding conclusions about the climate change debate.

First, climate change is not a “pollution” issue. Although the US Supreme Court decided in 2007 that greenhouse gases were legally an air pollutant, in a cultural sense they are something far different. The reduction of greenhouse gases is not the same as the reduction of sulfur oxides, nitrogen oxides, carbon monoxide, or particulates. These forms of pollution are man-made, they are harmful, and they are the unintended waste products of industrial production. Ideally, we would like to eliminate their production through the mobilization of economic and technical resources. But the chief greenhouse gas, carbon dioxide, is both man-made and natural. It is not inherently harmful; it is a natural part of Earth’s systems; and we do not desire to eliminate its production. It is not a toxic waste or a strictly technical problem to be solved. Rather, it is an endemic part of our society and who we are. To a large degree, it is a highly desirable output, as it correlates with our standard of living. Greenhouse gas emissions rise with a rise in a nation’s wealth, something all people want. To reduce carbon dioxide requires an alteration in nearly every facet of the economy, and therefore nearly every facet of our culture. To recognize greenhouse gases as a problem requires us to change a great deal about how we view the world and ourselves within it. And that leads to the second distinction.

Climate change is an existential challenge to our contemporary worldviews. The cultural challenge of climate change is enormous and threefold, each facet leading to the next. The first facet is that we have to think of a formerly benign, even beneficial, material in a new way—as a relative, not absolute, hazard. Only in an imbalanced concentration does it become problematic. But to understand and accept this, we need to conceive of the global ecosystem in a new way.

This challenge leads us to the second facet: Not only do we have to change our view of the ecosystem, but we also have to change our view of our place within it. Have we as a species grown to such numbers, and has our technology grown to such power, that we can alter and manage the ecosystem on a planetary scale? This is an enormous cultural question that alters our worldviews. As a result, some see the question and subsequent answer as intellectual and spiritual hubris, but others see it as self-evident.

If we answer this question in the affirmative, the third facet challenges us to consider new and perhaps unprecedented forms of global ethics and governance to address it. Climate change is the ultimate “commons problem,” as ecologist Garrett Hardin defined it, where every individual has an incentive to emit greenhouse gases to improve her standard of living, but the costs of this activity are borne by all. Unfortunately, the distribution of costs in this global issue is asymmetrical, with vulnerable populations in poor countries bearing the larger burden. So we need to rethink our ethics to keep pace with our technological abilities. Does mowing the lawn or driving a fuel-inefficient car in Ann Arbor, Mich., have ethical implications for the people living in low-lying areas of Bangladesh? If you accept anthropogenic climate change, then the answer to this question is yes, and we must develop global institutions to reflect that recognition. This is an issue of global ethics and governance on a scale that we have never seen, affecting virtually every economic activity on the globe and requiring the most complicated and intrusive global agreement ever negotiated.

Taken together, these three facets of our existential challenge illustrate the magnitude of the cultural debate that climate change provokes. Climate change challenges us to examine previously unexamined beliefs and worldviews. It acts as a flash point (albeit a massive one) for deeper cultural and ideological conflicts that lie at the root of many of our environmental problems, and it includes differing conceptions of science, economics, religion, psychology, media, development, and governance. It is a proxy for “deeper conflicts over alternative visions of the future and competing centers of authority in society,” as University of East Anglia climatologist Mike Hulme underscores in Why We Disagree About Climate Change. And, as such, it provokes a violent debate among cultural communities on one side who perceive their values to be threatened by change, and cultural communities on the other side who perceive their values to be threatened by the status quo.

Three Ways Forward

If the public debate over climate change is no longer about greenhouse gases and climate models, but about values, worldviews, and ideology, what form will this clash of ideologies take? I see three possible forms.

The Optimistic Form is where people do not have to change their values at all. In other words, the easiest way to eliminate the commons problem of climate change is to develop technological solutions that do not require major alterations to our values, worldviews, or behavior: carbon-free renewable energy, carbon capture and sequestration technologies, geo-engineering, and others. Some see this as an unrealistic future. Others see it as the only way forward, because people become attached to their level of prosperity, feel entitled to keep it, and will not accept restraints or support government efforts to impose restraints.6 Government-led investment in alternative energy sources, therefore, becomes more acceptable than the enactment of regulations and taxes to reduce fossil fuel use.

The Pessimistic Form is where people fight to protect their values. This most dire outcome results in a logic schism, where opposing sides debate different issues, seek only information that supports their position and disconfirms the other’s, and even go so far as to demonize the other side. University of Colorado, Boulder, environmental scientist Roger Pielke, in The Honest Broker: Making Sense of Science in Policy and Politics, describes the extreme of such schisms as “abortion politics,” where the two sides are debating completely different issues and “no amount of scientific information … can reconcile the different values.” Consider, for example, the recent decision by the Heartland Institute to post a billboard in Chicago comparing those who believe in climate change with the Unabomber. In reply, climate activist groups posted billboards attacking Heartland and its financial supporters. This attack-counterattack strategy is symptomatic of a broken public discourse over climate change.

The Consensus-Based Form involves a reasoned societal debate, focused on the full scope of technical and social dimensions of the problem and the feasibility and desirability of multiple solutions. It is to this form that scientists have the most to offer, playing the role of what Pielke calls the “honest broker”—a person who can “integrate scientific knowledge with stakeholder concerns to explore alternative possible courses of action.” Here, resolution is found through a focus on the debate’s underlying elements, moving away from positions (for example, climate change is or is not happening) and toward the underlying interests and values at play. How do we get there? Research in negotiation and dispute resolution can offer techniques for moving forward.

Techniques for a Consensus-Based Discussion

In seeking a social consensus on climate change, discussion must move beyond a strict focus on the technical aspects of the science to include its cultural underpinnings. Below are eight techniques for overcoming the ideological filters that underpin the social debate about climate change.

Know your audience | Any message on climate change must be framed in a way that fits the cultural norms of the target audience. The 2011 study Climate Change in the American Mind segments the American public into six groups based on their views of climate change science. (See “Six Americas,” below.) At the two extremes are the climate change “alarmed” and “dismissive.” These groups are not likely to be open to consensus-based discussion, as they are already employing logic-schism tactics that are closed to debate or engagement. The polarity of these groups is well known: On one side, climate change is a hoax, humans have no impact on the climate, and nothing is happening; on the other, climate change is an imminent crisis that will devastate the Earth, and human activity explains all climate changes.


The challenge is to move the debate away from the loud minorities at the extremes and to engage the majority in the middle—the “concerned,” the “cautious,” the “disengaged,” and the “doubtful.” People in these groups are more open to consensus-based debate, and through direct engagement can be separated from the ideological extremes of their cultural community.

Ask the right scientific questions | For a consensus-based discussion, climate change science should be presented not as a binary yes or no question,7 but as a series of six questions. Some are scientific in nature, with associated levels of uncertainty and probability; others are matters of scientific judgment.

  • Are greenhouse gas concentrations increasing in the atmosphere? Yes. This is a scientific question, answered with rigorous data and measurements of atmospheric chemistry.
  • Does this increase lead to a general warming of the planet? Yes. This is also a scientific question; the chemical mechanics of the greenhouse effect and positive radiative forcing are well established.
  • Has climate changed over the past century? Yes. Global temperature increases have been rigorously measured through multiple techniques and strongly supported by multiple scientific analyses. In fact, as Yale University economist William Nordhaus wrote in the March 12, 2012, New York Times, “The finding that global temperatures are rising over the last century-plus is one of the most robust findings in climate science and statistics.”
  • Are humans partially responsible for this increase? The answer to this question is a matter of scientific judgment. Increases in global mean temperatures have a very strong correlation with increases in man-made greenhouse gases since the Industrial Revolution. Although science cannot confirm causation, fingerprint analyses of multiple possible causes have been conducted, and the only plausible explanation is human-induced temperature change. Until a plausible alternative hypothesis is presented, this explanation prevails for the scientific community.
  • Will the climate continue to change over the next century? Again, this question is a matter of scientific judgment. But given the answers to the previous four questions, it is reasonable to believe that continued increases in greenhouse gases will lead to continued changes in the climate.
  • What will be the environmental and social impact of such change? This is the scientific question with the greatest uncertainty. The answer comprises a bell curve of possible outcomes with varying associated probabilities, from low to extreme impact. Uncertainty here is due to limited data on the Earth’s climate system, imperfect modeling of its physical processes, and the unpredictability of human actions that can either exacerbate or moderate climate shifts. These uncertainties make predictions difficult and are an area in which much debate can take place. And yet the physical impacts of climate change are already becoming visible in ways that are consistent with scientific modeling, particularly in Greenland, the Arctic, the Antarctic, and low-lying islands.

In asking these questions, a central consideration is whether people recognize the level of scientific consensus associated with each one. In fact, studies have shown that people’s support for climate policies and action is linked to their perceptions of scientific agreement. Still, the belief that “most scientists think global warming is happening” declined from 47 percent to 39 percent among Americans between 2008 and 2011.8

Move beyond data and models | Climate skepticism is not a knowledge-deficit issue. Michigan State University sociologist Aaron McCright and Oklahoma State University sociologist Riley Dunlap have observed that increased education and self-reported understanding of climate science correlate with lower concern among conservatives and Republicans and greater concern among liberals and Democrats. Research also has found that once people have made up their minds on the science of the climate issue, providing continued scientific evidence actually makes them more resolute in resisting conclusions that are at variance with their cultural beliefs.9 One needs to recognize that reasoning is suffused with emotion and that people often use reasoning to reach a predetermined end that fits their cultural worldviews. When people hear about climate change, they may, for example, hear an implicit criticism that their lifestyle is the cause of the issue or that they are morally deficient for not recognizing it. But emotion can be a useful ally; it can create the abiding commitments needed to sustain action on the difficult issue of climate change. To do this, people must be convinced that something can be done to address it; that the challenge is not too great and its impacts are not preordained. The key to engaging people in a consensus-driven debate about climate change is to confront the emotionality of the issue and then address the deeper ideological values that, when threatened, create this emotionality.

Focus on broker frames | People interpret information by fitting it to preexisting narratives or issue categories that mesh with their worldview. Therefore information must be presented in a form that fits those templates, using carefully researched metaphors, allusions, and examples that trigger a new way of thinking about the personal relevance of climate change. To be effective, climate communicators must use the language of the cultural community they are engaging. For a business audience, for example, one must use business terminology, such as net present value, return on investment, increased consumer demand, and rising raw material costs.

More generally, one can seek possible broker frames that move away from a pessimistic appeal to fear and instead focus on optimistic appeals that trigger the emotionality of a desired future. In addressing climate change, we are asking who we strive to be as a people, and what kind of world we want to leave our children. To gain buy-in, one can stress American know-how and our capacity to innovate, focusing on activities already under way by cities, citizens, and businesses.10

This approach frames climate change mitigation as a gain rather than a loss for specific cultural groups. Research has shown that climate skepticism can stem from a motivational tendency to defend the status quo, based on the prior assumption that any change will be painful. But by encouraging people to regard pro-environmental change as patriotic and consistent with protecting the status quo, such change can be framed as a continuation of, rather than a departure from, the past.

Specific broker frames can be used that engage the interests of both sides of the debate. For example, when US Secretary of Energy Steven Chu referred in November 2010 to advances in renewable energy technology in China as the United States’ “Sputnik moment,” he was framing climate change as a common threat to US scientific and economic competitiveness. When Pope Benedict XVI linked the threat of climate change with threats to life and dignity on New Year’s Day 2010, he was painting it as an issue of religious morality. When CNA’s Military Advisory Board, a group of elite retired US military officers, called climate change a “threat multiplier” in its 2006 report, it was using a national security frame. When the Lancet Commission pronounced climate change the biggest global health threat of the 21st century in a 2009 article, it was using a quality-of-life frame. And when the Center for American Progress, a progressive Washington, D.C., think tank, connected climate change to the conservation ideals of Presidents Theodore Roosevelt and Richard Nixon, it was framing the issue as consistent with Republican values.

One broker frame that deserves particular attention is the replacement of the uncertainty or probability of climate change with the risk of climate change.11 People understand low probability, high consequence events and the need to address them. For example, they buy fire insurance for their homes even though the probability of a fire is low, because they understand that the financial consequence is too great. In the same way, climate change for some may be perceived as a low probability, high consequence event, so the prudent course of action is to obtain insurance in the form of both behavioral and technological change.
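The arithmetic behind the insurance analogy is worth making explicit. Below is a minimal sketch in Python; the house value, fire probability, and premium are invented figures, not from the article, chosen only to show why a rational buyer pays more than the expected loss when the uninsured worst case is unacceptable.

```python
# Expected-loss arithmetic behind the fire-insurance analogy.
# All figures are hypothetical, chosen only to illustrate the reasoning.

home_value = 300_000   # loss if the house burns down (the worst case)
p_fire = 0.002         # low annual probability of a fire
premium = 900          # annual insurance premium

expected_loss = p_fire * home_value   # 0.002 * 300,000 = 600

print(f"expected annual loss: ${expected_loss:,.0f}")  # small on average
print(f"uninsured worst case: ${home_value:,.0f}")     # catastrophic
print(f"premium:              ${premium:,.0f}")

# The premium exceeds the expected loss, yet buying insurance is prudent
# because the consequence of the uninsured worst case is too great: the
# same low probability, high consequence logic the article applies to
# behavioral and technological responses to climate change.
```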

Recognize the power of language and terminology | Words have multiple meanings in different communities, and terms can trigger unintended reactions in a target audience. For example, one study has shown that Republicans were less likely to think the phenomenon is real when it was referred to as “global warming” (44 percent) rather than “climate change” (60 percent), whereas Democrats were unaffected by the wording (87 percent vs. 86 percent). So language matters: The partisan divide dropped from 43 percentage points under a “global warming” frame to 26 percentage points under a “climate change” frame.12

Other terms with multiple meanings include “climate denier,” which some use to refer to those who are not open to discussion on the issue, and others see as a thinly veiled and highly insulting reference to “Holocaust denier”; “uncertainty,” which is a scientific concept to convey variance or deviation from a specific value, but is interpreted by a lay audience to mean that scientists do not know the answer; and “consensus,” which is the process by which the Intergovernmental Panel on Climate Change (IPCC) forms its position, but leads some in the public to believe that climate science is a matter of “opinion” rather than data and modeling.

Overall, the challenge becomes one of framing complex scientific issues in a language that a lay and highly politicized audience can hear. This becomes increasingly challenging when we address some inherently nonintuitive and complex aspects of climate modeling that are hard to explain, such as the importance of feedback loops, time delays, accumulations, and nonlinearities in dynamic systems.13 Unless scientists can accurately convey the nature of climate modeling, others in the social debate will alter their claims to fit their cultural or cognitive perceptions or satisfy their political interests.

Employ climate brokers | People are more likely to be open to considering evidence when a recognized member of their cultural community presents it.14 Certainly, statements by former Vice President Al Gore and Sen. James Inhofe evoke visceral responses from individuals on either side of the partisan divide. But individuals with credibility on both sides of the debate can act as what I call climate brokers. Because a majority of Republicans do not believe the science of climate change, whereas a majority of Democrats do, the most effective broker would come from the political right. Climate brokers can include representatives from business, the religious community, the entertainment industry, the military, talk show hosts, and politicians who can frame climate change in language that will engage the audience to whom they most directly connect. When people hear about the need to address climate change from their church, synagogue, mosque, or temple, for example, they will connect the issue to their moral values. When they hear it from their business leaders and investment managers, they will connect it to their economic interests. And when they hear it from their military leaders, they will connect it to their interest in a safe and secure nation.

Recognize multiple referent groups | The presentation of information can be designed in a fashion that recognizes that individuals are members of multiple referent groups. The underlying frames employed in one cultural community may be at variance with the values dominant within other communities engaged in the climate change debate. For example, although some may reject the science of climate change by perceiving the scientific review process to be corrupt as part of one cultural community, they may also recognize the legitimacy of the scientific process as members of other cultural communities (such as users of the modern health care system). Although some may see the costs of fossil fuel reductions as too great and potentially damaging to the economy as members of one community, they may also see the value of reducing dependence on foreign oil as members of another community that values a strong national defense. This frame incongruence emerged in the 2011 US Republican primary when candidate Jon Huntsman warned that Republicans risk becoming the “antiscience party” if they continue to reject the science on climate change. What Huntsman alluded to is that most Americans actually do trust the scientific process, even if they don’t fully understand it. (A 2004 National Science Foundation report found that two-thirds of Americans do not clearly understand the scientific process.)

Employ events as leverage for change | Studies have found that most Americans believe that climate change will affect geographically and temporally distant people and places. But studies also have shown that people are more likely to believe in the science when they have an experience with extreme weather phenomena. This has led climate communicators to link climate change to major events, such as Hurricane Katrina, or to more recent floods in the American Midwest and Asia, as well as to droughts in Texas and Africa, to hurricanes along the East Coast and Gulf of Mexico, and to snowstorms in Western states and New England. The cumulative body of weather evidence, reported by media outlets and linked to climate change, will increase the number of people who are concerned about the issue, see it as less uncertain, and feel more confident that we must take actions to mitigate its effects. For example, in explaining the recent increase in belief in climate change among Americans, the 2012 National Survey of American Public Opinion on Climate Change noted that “about half of Americans now point to observations of temperature changes and weather as the main reasons they believe global warming is taking place.”15

Ending Climate Science Wars

Will we see a social consensus on climate change? If beliefs about the existence of global warming are becoming more ideologically entrenched and gaps between conservatives and liberals are widening, the solution space for resolving the issue will collapse and the debate will be based on power and coercion. In such a scenario, domination by the science-based forces looks less likely than domination by the forces of skepticism, because the former have to “prove” their case while the latter merely need to cast doubt. But such a polarized outcome is not predetermined. And even if it were to take shape, it could be reversed.

Is there a reason to be hopeful? When looking for reasons to be hopeful about a social consensus on climate change, I look to public opinion changes around cigarette smoking and cancer. For years, the scientific community recognized that the preponderance of epidemiological and mechanistic data pointed to a link between the habit and the disease. And for years, the public rejected that conclusion. But through a process of political, economic, social, and legal debate over values and beliefs, a social consensus emerged. The general public now accepts that cigarettes cause cancer and governments have set policy to address this. Interestingly, two powerful forces that many see as obstacles to a comparable social consensus on climate change were overcome in the cigarette debate.

The first obstacle is the powerful lobby of industrial forces that can resist a social and political consensus. In the case of the cigarette debate, powerful economic interests mounted a campaign to obfuscate the scientific evidence and to block a social and political consensus. Tobacco companies created their own pro-tobacco science, but eventually the public health community overcame pro-tobacco scientists.

The second obstacle to convincing a skeptical public is the lack of a definitive statement by the scientific community about the future implications of climate change. The 2007 IPCC report states that “Human activities … are modifying the concentration of atmospheric constituents … that absorb or scatter radiant energy. … [M]ost of the observed warming over the last 50 years is very likely to have been due to the increase in greenhouse gas emissions.” Some point to the word “likely” to argue that scientists still don’t know and that action is unwarranted. But science is not designed to provide a definitive smoking gun. Remember that the 1964 surgeon general’s report about the dangers of smoking was equally conditional. And even today, we cannot state with scientific certainty that smoking causes lung cancer. Like the global climate, the human body is too complex a system for absolute certainty. We can explain epidemiologically why a person could get cancer from cigarette smoking and statistically how likely that person is to get cancer, but, as the surgeon general’s report explains, “statistical methods cannot establish proof of a causal relationship in an association [between cigarette smoking and lung cancer]. The causal significance of an association is a matter of judgment, which goes beyond any statement of statistical probability.” Yet the general public now accepts this causal linkage.

What will get us there? Although climate brokers are needed from all areas of society—from business, religion, the military, and politics—one group in particular needs to become more engaged: academic scientists, and particularly social scientists. Too much of the debate is dominated by the physical sciences in defining the problem and by economics in defining the solutions. Both fields focus heavily on rational and quantitative treatments of the issue and fail to capture the behavioral and cultural factors that explain why people accept or reject scientific evidence, analysis, and conclusions. But science is never socially or politically inert, and scientists have a duty to recognize its effect on society and to communicate that effect to society. Social scientists can help in this endeavor.

But the relative absence of the social sciences from the climate debate is driven by specific structural and institutional controls that channel research away from empirical relevance. Social scientists limit their involvement in such “outside” activities because the underlying norms of what is considered legitimate and valuable research, as well as the overt incentives and reward structures within the academy, lead away from such endeavors. Tenure and promotion are based primarily on the publication of articles in top-tier academic journals. That is the signal of merit and success. Effort spent on any other endeavor is decidedly discouraged.

The role of the public intellectual has become an arcane and elusive option in today’s social sciences. Moreover, it is a difficult role to play. The academic rules are not clear and the public backlash can be uncomfortable; many of my colleagues and I are regular recipients of hostile e-mail messages and web-based attacks. But the lack of academic scientists in the public debate harms society by leaving out critical voices for informing and resolving the climate debate. There are signs, however, that this model of scholarly isolation is changing. Some leaders within the field have begun to call for more engagement within the public arena as a way to invigorate the discipline and underscore its investment in the defense of civil society. As members of society, all scientists have a responsibility to bring their expertise to the decision-making process. It is time for social scientists to accept this responsibility.

Notes

1 Wouter Poortinga et al., “Uncertain Climate: An Investigation into Public Skepticism About Anthropogenic Climate Change,” Global Environmental Change, August 2011.
2 Aaron McCright and Riley Dunlap, “The Politicization of Climate Change and Polarization in the American Public’s Views of Global Warming, 2001-2010,” The Sociological Quarterly 52, 2011.
3 Clive Hamilton, “Why We Resist the Truth About Climate Change,” paper presented to the Climate Controversies: Science and Politics conference, Brussels, Oct. 28, 2010.
4 Andrew Hoffman, “Talking Past Each Other? Cultural Framing of Skeptical and Convinced Logics in the Climate Change Debate,” Organization & Environment 24(1), 2011.
5 Jon Krosnick and Bo MacInnis, “Frequent Viewers of Fox News Are Less Likely to Accept Scientists’ Views of Global Warming,” Woods Institute for the Environment, Stanford University, 2010.
6 Jeffrey Rachlinski, “The Psychology of Global Climate Change,” University of Illinois Law Review 1, 2000.
7 Max Boykoff, “The Real Swindle,” Nature Climate Change, February 2008.
8 Ding Ding et al., “Support for Climate Policy and Societal Action Are Linked to Perceptions About Scientific Agreement,” Nature Climate Change 1, 2011.
9 Matthew Feinberg and Robb Willer, “Apocalypse Soon? Dire Messages Reduce Belief in Global Warming by Contradicting Just-World Beliefs,” Psychological Science 22(1), 2011.
10 Thomas Vargish, “Why the Person Sitting Next to You Hates Limits to Growth,” Technological Forecasting and Social Change 16, 1980.
11 Nick Mabey, Jay Gulledge, Bernard Finel, and Katherine Silverthorne, Degrees of Risk: Defining a Risk Management Framework for Climate Security, Third Generation Environmentalism, 2011.
12 Jonathan Schuldt, Sara H. Konrath, and Norbert Schwarz, “‘Global Warming’ or ‘Climate Change’? Whether the Planet Is Warming Depends on Question Wording,” Public Opinion Quarterly 75(1), 2011.
13 John Sterman, “Communicating Climate Change Risks in a Skeptical World,” Climatic Change, 2011.
14 Dan Kahan, Hank Jenkins-Smith, and Donald Braman, “Cultural Cognition of Scientific Consensus,” Journal of Risk Research 14, 2010.
15 Christopher Borick and Barry Rabe, “Fall 2011 National Survey of American Public Opinion on Climate Change,” Brookings Institution, Issues in Governance Studies, Report No. 45, Feb. 2012.

Information Overload in the Era of ‘Big Data’ (Science Daily)

ScienceDaily (Aug. 20, 2012) — Botany is plagued by the same problem as the rest of science and society: our ability to generate data quickly and cheaply is surpassing our ability to access and analyze it. In this age of big data, scientists facing too much information rely on computers to search large data sets for patterns that are beyond the capability of humans to recognize — but computers can only interpret data based on the strict set of rules in their programming.

New tools called ontologies provide the rules computers need to transform information into knowledge, by attaching meaning to data, thereby making those data retrievable by computers and more understandable to human beings. Ontology, from the Greek word for the study of being or existence, traditionally falls within the purview of philosophy, but the term is now used by computer and information scientists to describe a strategy for representing knowledge in a consistent fashion. An ontology in this contemporary sense is a description of the types of entities within a given domain and the relationships among them.

A new article in this month’s American Journal of Botany by Ramona Walls (New York Botanical Garden) and colleagues describes how scientists build ontologies such as the Plant Ontology (PO) and how these tools can transform plant science by facilitating new ways of gathering and exploring data.

When data from many divergent sources, such as data about some specific plant organ, are associated or “tagged” with particular terms from a single ontology or set of interrelated ontologies, the data become easier to find, and computers can use the logical relationships in the ontologies to correctly combine the information from the different databases. Moreover, computers can also use ontologies to aggregate data associated with the different subclasses or parts of entities.

For example, suppose a researcher is searching online for all examples of gene expression in a leaf. Any botanist performing this search would include experiments that described gene expression in petioles and midribs or in a frond. However, a search engine would not know to include these terms in its search unless it was told that a frond is a type of leaf and that every petiole and every midrib is part of some leaf. It is this information that ontologies provide.
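To make the example concrete, here is a minimal sketch in Python of how ontology relations can drive query expansion. The tiny two-relation ontology and the function name are illustrative assumptions, not the actual structure of the Plant Ontology, which defines thousands of terms and several relation types.

```python
# A toy ontology: "is_a" links a subclass to its superclass
# ("a frond is a type of leaf"); "part_of" links a part to its whole
# ("every petiole is part of some leaf").
IS_A = {
    "frond": "leaf",
    "vascular leaf": "leaf",
}
PART_OF = {
    "petiole": "leaf",
    "midrib": "leaf",
}

def expand_query(term):
    """Return every term whose annotations a search for `term` should
    also match: the term itself, its subclasses, and its parts."""
    matches = {term}
    changed = True
    while changed:  # follow chains of relations to a fixed point
        changed = False
        for relation in (IS_A, PART_OF):
            for child, parent in relation.items():
                if parent in matches and child not in matches:
                    matches.add(child)
                    changed = True
    return matches

# A search for gene expression "in a leaf" now also retrieves data
# tagged with frond, petiole, or midrib (set ordering may vary):
print(expand_query("leaf"))
# e.g. {'leaf', 'frond', 'vascular leaf', 'petiole', 'midrib'}
```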

The article in the American Journal of Botany by Walls and colleagues describes what ontologies are, why they are relevant to plant science, and some of the basic principles of ontology development. It includes an overview of the ontologies that are relevant to botany, with a more detailed description of the PO and the challenges of building an ontology that covers all green plants. The article also describes four key areas of plant science that could benefit from the use of ontologies: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Although most of the examples in this article are drawn from plant science, the principles could apply to any group of organisms, and the article should be of interest to zoologists as well.

As genomic and phenomic data become available for more species, many different research groups are embarking on the annotation of their data and images with ontology terms. At the same time, cross-species queries are becoming more common, leading more researchers in plant science to turn to ontologies. Ontology developers are working with the scientists who generate data to make sure ontologies accurately reflect current science, and with database developers and publishers to find ways to make it easier for scientists to associate their data with ontologies.

Journal Reference:

R. L. Walls, B. Athreya, L. Cooper, J. Elser, M. A. Gandolfo, P. Jaiswal, C. J. Mungall, J. Preece, S. Rensing, B. Smith, D. W. Stevenson. Ontologies as integrative tools for plant science. American Journal of Botany, 2012; 99(8): 1263. DOI: 10.3732/ajb.1200222

Scientists Point Out Problems in Press Coverage of Climate Change (Fapesp)

Specialists gathered in São Paulo to debate risk management for climate extremes voice concern over the difficulties journalists face in dealing with the complexity of the subject. (Wikimedia)

August 21, 2012

By Fábio de Castro

Agência FAPESP – In the assessment of specialists gathered in São Paulo to discuss the management of risks from climate extremes and disasters, adequately managing the impacts of these events requires informing society, including public policy makers, about the findings of climate science.

Researchers are concerned, however, about the difficulties they face in communicating with society. The complexity of climate studies tends to produce distortions in journalistic coverage of the subject, and the result can be a threat to the public’s trust in science.

A avaliação foi feita por participantes do workshop “Gestão dos riscos dos extremos climáticos e desastres na América Central e na América do Sul – o que podemos aprender com o Relatório Especial do IPCC sobre extremos?”, realizado na semana passada na capital paulista.

O evento teve o objetivo de debater as conclusões do Relatório Especial sobre Gestão dos Riscos de Extremos Climáticos e Desastres (SREX, na sigla em inglês) – elaborado e recentemente publicado pelo Painel Intergovernamental sobre Mudanças Climáticas (IPCC) – e discutir opções para gerenciamento dos impactos dos extremos climáticos, especialmente nas Américas do Sul e Central.

O workshop foi realizado pela FAPESP e pelo Instituto Nacional de Pesquisas Espaciais (Inpe), em parceria com o IPCC, o Overseas Development Institute (ODI) e a Climate and Development Knowledge (CKDN), ambos do Reino Unido, e apoio da Agência de Clima e Poluição do Ministério de Relações Exteriores da Noruega.

Durante o evento, o tema da comunicação foi debatido por autores do IPCC-SREX, especialistas em extremos climáticos, gestores e líderes de instituições de prevenção de desastres.

De acordo com Vicente Barros, do Centro de Investigação do Mar e da Atmosfera da Universidade de Buenos Aires, o IPCC, do qual é membro, entrou há três anos em um processo de reestruturação que compreende uma mudança na estratégia de comunicação.

“A partir de 2009, o IPCC passou a ser atacado violentamente e não estávamos preparados para isso, porque nossa função era divulgar o conhecimento adquirido, mas não traduzi-lo para a imprensa. Temos agora um grupo de jornalistas que procura fazer essa mediação, mas não podemos diluir demais as informações e a última palavra na formulação da comunicação é sempre do comitê executivo, porque o peso político do que é expresso pelo painel é muito grande”, disse Barros.

A linguagem é um grande problema, segundo Barros. Se for muito complexa, não atinge o público. Se for muito simplificada, tende a distorcer as conclusões e disseminar visões que não correspondem à realidade.

“O IPCC trata de problemas muito complexos e admitimos que não podemos fazer uma divulgação que chegue a todos. Isso é um problema. Acredito que a comunicação deve permanecer nas mãos dos jornalistas, mas talvez seja preciso investir em iniciativas de treinamento desses profissionais”, disse.

Fábio Feldman, do Fórum Paulista de Mudanças Climáticas, manifestou preocupação com as dificuldades de comunicação dos cientistas com o público, que, segundo ele, possibilitam que os pesquisadores “céticos” – isto é, que negam a influência humana nos eventos de mudanças climáticas – ganhem cada vez mais espaço na mídia e no debate público.

“Vejo com preocupação um avanço do espaço dado aos negacionistas no debate público. A imprensa acha que é preciso usar necessariamente o princípio do contraditório, dando espaço e importância equânimes para as diferentes posições no debate”, disse.

De acordo com Feldman, os cientistas – especialmente aqueles ligados ao IPCC – deveriam ter uma atitude mais pró-ativa no sentido de se contrapor aos “céticos” no debate público.

Posições diferentes

Para Reynaldo Luiz Victoria, da Coordenação do Programa FAPESP de Pesquisa em Mudanças Climáticas Globais, é importante que a imprensa trate as diferentes posições de modo mais equitativo.

“Há casos específicos em que a imprensa trata questões de maneira pouco equitativa – e eventualmente sensacionalista –, mas acho que nós, como pesquisadores, não temos obrigação de reagir. A imprensa deveria nos procurar para fazer o contraponto e esclarecer o público”, disse Victoria à Agência FAPESP.

Victoria, no entanto, destacou a importância de que os “céticos” também sejam ouvidos. “Alguns são cientistas sérios e merecem um tratamento equitativo. Certamente que não se pode ignorá-los, mas, quando fazem afirmações passíveis de contestação, a imprensa deve procurar alguém que possa dar um contraponto. Os jornalistas precisam nos procurar e não o contrário”, disse.

De modo geral, a cobertura da imprensa sobre mudanças climáticas é satisfatória, segundo Victoria. “Os bons jornais publicam artigos corretos e há jornalistas muito sérios produzindo material de alta qualidade”, destacou.

Para Luci Hidalgo Nunes, professora do Departamento de Geografia da Universidade Estadual de Campinas (Unicamp), os negacionistas ganham espaço porque muitas vezes o discurso polêmico tem mais apelo midiático do que a complexidade do conhecimento científico.

“O cientista pode ter um discurso bem fundamentado, mas que é considerado enfadonho pelo público. Enquanto isso, um pesquisador com argumentos pouco estruturados pode fazer um discurso simplificado, portanto atraente para o público, e polêmico, o que rende manchetes”, disse à Agência FAPESP.

Apesar de a boa ciência ter, em relação ao debate público, uma desvantagem inerente à sua complexidade, Nunes acredita ser importante que a imprensa continue pluralista. A pesquisadora publicou um estudo no qual analisa a cobertura do jornal O Estado de S. Paulo sobre mudanças climáticas durante um ano. Segundo Nunes, um dos principais pontos positivos observados consistiu em dar voz às diferentes posições.

“Sou favorável a que a imprensa cumpra seu papel e dê todos os parâmetros, para que haja um debate democrático. Acho que isso está sendo bem feito e a própria imprensa está aberta para nos dar mais espaço. Mas precisamos nos manifestar para criar essas oportunidades”, disse.

Nunes também considera que a cobertura da imprensa sobre mudanças climáticas, de modo geral, tem sido satisfatória, ainda que irregular. “O tema ganha vulto em determinados momentos, mas não se mantém na pauta do noticiário de forma permanente”, disse.

Segundo ela, o assunto sobressaiu especialmente em 2007, com a publicação do primeiro relatório do IPCC, e em 2012 durante a RIO+20.

“Em 2007, a cobertura foi intensa, mas a popularização do tema também deu margem a distorções e exageros. O sensacionalismo é ruim para a ciência, porque faz o tema ganhar as manchetes rapidamente por algum tempo, mas no médio prazo o efeito é inverso: as pessoas percebem os exageros e passam a olhar com descrédito os resultados científicos de modo geral”, disse.

Public schools send 45% of students to federal universities (Valor Econômico)

JC e-mail 4559, August 10, 2012.

With regard specifically to the requirement that 50% of the places at federal universities be reserved for students who attended public high schools, the quota law approved on Tuesday (the 7th) will have a much smaller impact on the current enrollment system than the controversy it has generated suggests.

According to the study “Socioeconomic and Cultural Profile of Undergraduate Students at Brazilian Federal Universities,” a survey carried out by the National Forum of Deans of Community and Student Affairs (Fonaprace) between October and November 2010 and completed in July 2011, 45% of the roughly 900,000 students enrolled at the country’s 59 federal higher-education institutions came from public high schools.

The third edition of the study – the first was produced in 1996-1997 – shows that the highest percentages of students from public schools are in the North (71.5%) and South (50.5%) regions. The Northeast and Center-West follow, with 41.5% and 40.5% respectively. The Southeast is the region with the lowest rate: 37%.

The “Socioeconomic Profile” is a Fonaprace sample survey of the federal-university students enrolled in on-campus undergraduate programs. It adopted a 95% confidence level and a 5% sampling error per institution. The database was supplied by the Higher Education Secretariat of the Ministry of Education (Sesu-MEC) and went through a validation process with each participating university, which answered quantitative and qualitative questionnaires in an online system.
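As a rough check on what that stated precision implies, assume simple random sampling within each institution (an assumption the news item does not spell out). The standard sample-size formula gives

n = z² p(1 − p) / e² ≈ (1.96² × 0.5 × 0.5) / 0.05² ≈ 384,

so on the order of 384 respondents per university would be needed for a 95% confidence level (z ≈ 1.96) and a 5% sampling error, taking the worst case p = 0.5.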

Rosana Pereira Parente, dean of undergraduate studies at the Federal University of Amazonas (Ufam) and a Fonaprace member, observes that a state policy of affirmative action to equalize access to higher education is important for combating inequality in the country, but that a single model for different realities can be considered “a complicated strategy.” “Some particularities must be observed. Here in the North region the private basic-education market is not as strong as in the big urban centers, and we have more indigenous students than black and mixed-race ones. With the law, the affirmative actions we already have here are harmed,” Parente notes.

At Ufam, the creation of new quota policies is being discussed by a dedicated group within the university council. At present, the institution fills 50% of its places via the National High School Exam (Enem), and its continuous entrance process allocates the other 50% to high-school students, who take “mini entrance exams” from the first year of high school onward. “But we want to institute actions that benefit low-income students,” Parente adds.

One of the study’s main conclusions is that the number of black, mixed-race, indigenous and poor students has grown in recent years – groups also covered by the quota law approved this week. Because of this result, those responsible for the survey argue that it is urgent to expand investment in student-assistance policy.

“There are already studies in Brazil that try to monitor the progress of affirmative policies in education. Now, with a national law, the new model has to be accompanied by student-assistance actions to guarantee not only access but also the retention of this ‘new’ student,” says Dalila Andrade Oliveira, professor at the Faculty of Education of the Federal University of Minas Gerais (UFMG) and president of the National Association of Graduate Studies and Research in Education (Anped).

From 2008 to 2012, the MEC budget for the National Student Assistance Program (Pnaes) – which provides monthly stipends and financial aid for food, housing and course materials – grew 300% in nominal terms, to R$ 500 million. But federal university administrators say the funds are insufficient. “Pnaes would need to rise to R$ 1.5 billion to meet current needs. The new law, which infringes university autonomy, could have been accompanied by a provision guaranteeing a matching budget for universities that take in poorer students,” criticizes Gustavo Balduíno, of the leadership of the National Association of Directors of Federal Higher Education Institutions (Andifes).

On the racial dimension, the Fonaprace survey shows that in 2010 white students were the majority at federal universities: 54%. In the previous survey, in 2004, the share of whites was 59%. Black students rose from 5.9% in 2004 to 8.7% in 2010, a share that increased in every region of the country, most notably the North, which nearly doubled its share (13.4%, versus 6.8% in 2004), and the Northeast, which went from 8.6% to 12.5%.

About 45% of federal-university students belong to classes C, D and E. Class A students account for 15% of 2010 enrollments, with the highest concentration in the Center-West region (22%). Students in class B represent 41% of the total.

Computer program mimics human evolution (Fapesp)

Software developed at USP in São Carlos creates and selects programs that generate Decision Trees, tools capable of making predictions. The research won awards in the United States at the largest event in evolutionary computation (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision Trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But the best prediction requires the best Decision Tree-generating program. To reach that goal, researchers at the Institute of Mathematical and Computer Sciences (ICMC) of the University of São Paulo (USP), in São Carlos, drew inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, one that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral candidate at the ICMC’s Bioinspired Computation Laboratory (BioCom) and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bioinspired techniques – those that look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” Barros said.

According to Barros, the software developed during his doctorate can automatically create Decision Tree-generating programs. To do so, it performs random crossovers between the code of existing programs, producing “offspring.”

“These ‘offspring’ may occasionally mutate and evolve. After a while, the evolved Decision Tree-generating programs are expected to get better and better, and our algorithm selects the best of all,” Barros said.

But while natural selection in the human species takes hundreds or even thousands of years, in computation it lasts only a few hours, depending on the problem to be solved. “We set one hundred generations as the limit of the evolutionary process,” Barros said.
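For readers unfamiliar with the technique, the loop Barros describes (selection, crossover, mutation, a one-hundred-generation cap) can be sketched in a few lines of Python. The sketch below evolves toy bit-string “genomes” with a stand-in fitness function; the actual system evolves the building blocks of decision-tree induction algorithms and scores each candidate by the quality of the trees it produces.

import random

GENOME_LENGTH = 16      # hypothetical encoding of one candidate program
POPULATION_SIZE = 50
GENERATIONS = 100       # the article cites 100 generations as the limit
MUTATION_RATE = 0.05

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

def fitness(genome):
    # Stand-in for "how good are the decision trees this candidate
    # produces on validation data"; here we just count 1-bits.
    return sum(genome)

def crossover(a, b):
    point = random.randrange(1, GENOME_LENGTH)   # random crossover point
    return a[:point] + b[point:]

def mutate(genome):
    return [(1 - g) if random.random() < MUTATION_RATE else g for g in genome]

population = [random_genome() for _ in range(POPULATION_SIZE)]
for generation in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POPULATION_SIZE // 2]          # selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))  # "offspring"
                for _ in range(POPULATION_SIZE - len(parents))]
    population = parents + children                        # next generation

best = max(population, key=fitness)
print("best fitness after evolution:", fitness(best))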

Artificial intelligence

In computer science, heuristic is the name given to a system’s ability to innovate and to develop techniques for reaching a given end.

The software developed by Barros belongs to the area of hyper-heuristics, a recent topic in evolutionary computation whose objective is the automatic generation of heuristics customized for a given application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the article “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms,” which won awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the largest event in evolutionary computation in the world, held in July in Philadelphia, United States.

Besides Barros, the article’s authors are professors André Carlos Ponce de Leon Ferreira de Carvalho, the adviser of the research at the ICMC; Márcio Porto Basgalupp, of the Federal University of São Paulo (Unifesp); and Alex Freitas, of the University of Kent, in the United Kingdom, who took on the co-advising role.

The authors were invited to submit the article to the Evolutionary Computation Journal, published by the Massachusetts Institute of Technology (MIT). “The paper will still undergo review, but, since it was submitted by invitation, it has a good chance of being accepted,” Barros said.

The research, which is expected to conclude only in 2013, also gave rise to an article published by invitation in the Journal of the Brazilian Computer Society, after being chosen as the best paper at the 2011 Brazilian National Meeting on Artificial Intelligence (Encontro Nacional de Inteligência Artificial).

Another article, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, led to an invitation for publication in the journal Neurocomputing.

Cyborg America: inside the strange new world of basement body hackers (The Verge)

The Verge, 8 August 2012

Shawn Sarver took a deep breath and stared at the bottle of Listerine on the counter. “A minty fresh feeling for your mouth… cures bad breath,” he repeated to himself, as the scalpel sliced open his ring finger. His left arm was stretched out on the operating table, his sleeve rolled up past the elbow, revealing his first tattoo, the Air Force insignia he got at age 18, a few weeks after graduating from high school. Sarver was trying a technique he learned in the military to block out the pain, since it was illegal to administer anesthetic for his procedure.

“A minty fresh feeling… cures bad breath,” Sarver muttered through gritted teeth, his eyes staring off into a void.

Tim, the proprietor of Hot Rod Piercing in downtown Pittsburgh, put down the scalpel and picked up an instrument called an elevator, which he used to separate the flesh inside Sarver’s finger, creating a small empty pocket of space. Then, with practiced hands, he slid a tiny rare earth metal inside the open wound, the width of a pencil eraser and thinner than a dime. When he tried to remove his tool, however, the metal disc stuck to the tweezers. “Let’s try this again,” Tim said. “Almost done.”

The implant stayed put the second time. Tim quickly stitched the cut shut, and cleaned off the blood. “Want to try it out?” he asked Sarver, who nodded with excitement. Tim dangled the needle from a string of suture next to Sarver’s finger, closer and closer, until suddenly, it jumped through the air and stuck to his flesh, attracted by the magnetic pull of the mineral implant.

“I’m a cyborg!” Sarver cried, getting up to join his friends in the waiting room outside. Tim started prepping a new tray of clean surgical tools. Now it was my turn.

PART.01

With the advent of the smartphone, many Americans have grown used to the idea of having a computer on their person at all times. Wearable technologies like Google’s Project Glass are narrowing the boundary between us and our devices even further by attaching a computer to a person’s face and integrating the software directly into a user’s field of vision. The paradigm shift is reflected in the names of our dominant operating systems. Gone are Microsoft’s Windows into the digital world, replaced by a union of man and machine: the iPhone or Android.

For a small, growing community of technologists, none of this goes far enough. I first met Sarver at the home of his best friend, Tim Cannon, in Oakdale, a Pennsylvania suburb about 30 minutes from Pittsburgh where Cannon, a software developer, lives with his longtime girlfriend and their three dogs. The two-story house sits next to a beer dispensary and an abandoned motel, a reminder that the city’s best days are far behind it. In the last two decades, Pittsburgh has been gutted of its population, which plummeted from a high of more than 700,000 in the 1980s to less than 350,000 today. For its future, the city has pinned many of its hopes on the biomedical and robotics research being done at local universities like Carnegie Mellon. “The city was dying and so you have this element of anti-authority freaks are welcome,” said Cannon. “When you have technology and biomedical research and a pissed-off angry population that loves tattoos, this is bound to happen. Why Pittsburgh? It’s got the right amount of fuck you.”

Cannon led me down into the basement, which he and Sarver have converted into a laboratory. A long work space was covered with Arduino motherboards, soldering irons, and electrodes. Cannon had recently captured a garter snake, which eyed us from inside a plastic jar. “Ever since I was a kid, I’ve been telling people that I want to be a robot,” said Cannon. “These days, that doesn’t seem so impossible anymore.” The pair call themselves grinders — homebrew biohackers obsessed with the idea of human enhancement — who are looking for new ways to put machines into their bodies. They are joined by hundreds of aspiring biohackers who populate the movement’s online forums and a growing number, now several dozen, who have gotten the magnetic implants in real life.


Cannon looks and moves a bit like Shaggy from Scooby Doo, a languid rubberband of a man in baggy clothes and a newsboy cap. Sarver, by contrast, stands ramrod-straight, wearing a dapper three-piece suit and waxed mustache, a dandy steampunk with a high-pitched laugh. There is a distinct division of labor between the two: Cannon is the software developer and Sarver, who learned electrical engineering as a mechanic in the Air Force, does the hardware. The moniker for their working unit is Grindhouse Wetwares. Computers are hardware. Apps are software. Humans are wetware.

Cannon, like Sarver, served in the military, but the two didn’t meet until they had both left the service, introduced by a mutual friend in the Pittsburgh area. Politics brought them together. “We were both kind of libertarians, really strong anti-authority people, but we didn’t fit into the two common strains here: idiot anarchist who’s unrealistic or right-wing crazy Christian. Nobody was incorporating technology into it. So there was no political party but just a couple like-minded individuals, who were like… techno-libertarians!”

Cannon got his own neodymium magnetic implant a year before Sarver. Putting these rare earth metals into the body was pioneered by artists on the bleeding edge of piercing culture and by transhumanists interested in experimenting with a sixth sense. Steve Haworth, who specializes in extreme body modification and considers himself a “human evolution artist,” is regarded as one of the originators, and helped inspire a generation of practitioners to perform magnetic implants, including the owner of Hot Rod Piercing in Pittsburgh. (Using surgical tools like a scalpel is a grey area for piercers. Operating with these instruments, or any kind of anesthesia, could be classified as practicing medicine. Without a medical license, a piercer who does this is technically committing assault on the person getting the implant.) On its own, the implant allows a person to feel electromagnetic fields: a microwave oven in their kitchen, a subway passing beneath the ground, or high-tension power lines overhead.

While this added perception is interesting, it has little utility. But the magnet, explains Cannon, is more of a stepping stone toward bigger things. “It can be done cheaply, with minimally invasive surgery. You get used to the idea of having something alien in your body, and kinda begin to see how much more the human body could do with a little help. Sure, feeling other magnets around you is fucking cool, but the real key is, you’re giving the human body a simple, digital input.”

As an example of how that might work, Cannon showed me a small device he and Sarver created called the Bottlenose. It’s a rectangle of black metal about half the size of a pack of cigarettes that slips over your finger. Named after the echolocation used by dolphins, it sends out an electromagnetic pulse and measures the time it takes to bounce back. Cannon slips it over his finger and closes his eyes. “I can kind of sweep the room and get this picture of where things are.” He twirls around the half-empty basement, eyes closed, then stops, pointing directly at my chest. “The magnet in my finger is extremely sensitive to these waves. So the Bottlenose can tell me the shape of things around me and how far away they are.”
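The principle here is ordinary time-of-flight ranging: send a pulse, time the echo, convert the delay into distance. A minimal sketch in Python, with every hardware call stubbed out; emit_pulse, echo_received and drive_coil are hypothetical stand-ins, not Grindhouse Wetwares code, and the propagation speed is illustrative:

import time

PROPAGATION_SPEED = 343.0   # m/s, as if the pulse were ultrasonic

def emit_pulse():
    pass                     # stub: fire the transducer

def echo_received():
    return True              # stub: pretend the echo is back instantly

def drive_coil(intensity):
    print(f"coil intensity: {intensity:.2f}")   # stub: pulse the magnet

def measure_distance():
    emit_pulse()
    start = time.monotonic()
    while not echo_received():
        pass
    round_trip = time.monotonic() - start
    return PROPAGATION_SPEED * round_trip / 2   # one-way distance, metres

# Closer objects produce a stronger sensation, fading to zero at ~3 m.
distance = measure_distance()
drive_coil(max(0.0, 1.0 - distance / 3.0))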

The way Cannon sees it, biohacking is all around us. “In a way, eyeglasses are a body hack, a piece of equipment that enhances your sense, and pretty quickly becomes like a part of your body,” says Cannon. He took a pair of electrodes off the workbench and attached them to my temples. “Your brain works through electricity, so why not help to boost that?” A sharp pinch ran across my forehead as the first volts flowed into my skull. He and Sarver laughed as my face involuntarily twitched. “You’re one of us now,” Cannon says with a laugh.

HISTORY.01

In one sense, Mary Shelley’s Frankenstein, part man, part machine, animated by electricity and with superhuman abilities, might be the first dark, early vision of what human bodies would become when modern science was brought to bear. A more utopian version was put forward in 1960, a year before man first travelled into space, by the scientist and inventor Manfred Clynes. Clynes was considering the problem of how mankind would survive in its new life as an outer space dweller, and concluded that only by augmenting our physiology with drugs and machines could we thrive in extraterrestrial environs. It was Clynes and his co-author Nathan Kline, writing on this subject, who coined the term cyborg.

At its simplest, a cyborg is a being with both biological and artificial parts: metal, electrical, mechanical, or robotic. The construct is familiar to almost everyone through popular culture, perhaps most spectacularly in the recent Iron Man films. Tony Stark is surely our greatest contemporary cyborg: a billionaire businessman who designed his own mechanical heart, a dapper bachelor who can transform into a one-man fighter jet, then shed his armour as easily as a suit of clothes.

Britain is the birthplace of 21st-century biohacking, and the movement’s two foundational figures present a similar Jekyll and Hyde duality. One is Lepht Anonym, a DIY punk who was one of the earliest, and certainly the most dramatic, to throw caution to the wind and implant metal and machines into her flesh. The other is Kevin Warwick, an academic at the University of Reading’s department of cybernetics. Warwick relies on a trained staff of medical technicians when doing his implants. Lepht has been known to say that all she requires is a potato peeler and a bottle of vodka. In an article on h+, Anonym wrote:

I’m sort of inured to pain by this point. Anesthetic is illegal for people like me, so we learn to live without it; I’ve made scalpel incisions in my hands, pushed five-millimeter diameter needles through my skin, and once used a vegetable knife to carve a cavity into the tip of my index finger. I’m an idiot, but I’m an idiot working in the name of progress: I’m Lepht Anonym, scrapheap transhumanist. I work with what I can get.

Anonym’s essay, a series of YouTube videos, and a short profile in Wired established her as the face of the budding biohacking movement. It was Anonym who proved, with herself as the guinea pig, that it was possible to implant RFID chips and powerful magnets into one’s body, without the backing of an academic institution or help from a team of doctors.

 

“She is an inspiration to all of us,” said a biohacker who goes by the name of Sovereign Bleak. “To anyone who was frustrated with the human condition, who felt we had been promised more from the future, she said that it was within our grasp, and our rights, to evolve our bodies however we saw fit.” Over the last decade grinders have begun to form a loose culture, connected mostly by online forums like biohack.me, where hundreds of aspiring cyborgs congregate to swap tips about the best bio-resistant coatings to prevent the body from rejecting magnetic implants and about how to get illegal anesthetics shipped from Canada to the United States. There is another strain of biohacking that focuses on the possibilities of DIY genetics, but its work is far more theoretical than the hands-on experiments performed by grinders.

But while Anonym’s renegade approach to bettering her own flesh birthed a new generation of grinders, it seems to have had some serious long-term consequences for her own health. “I’m a wee bit frightened right now,” Anonym wrote on her blog early this year. “I’m hearing things that aren’t there. Sure I see things that aren’t real from time to time because of the stupid habits I had when I was a teenager and the permanent, very mild damage I did to myself experimenting like that, but I don’t usually hear anything and this is not a flashback.”

MEDICAL NEED VERSUS HUMAN ENHANCEMENT

Neil Harbisson was born with a condition that allows him to see only in black and white. He became interested in cybernetics, and eventually began wearing the Eyeborg, a head-mounted camera that translates colors into vibrations Harbisson can hear. The addition of the Eyeborg to his passport has led some to dub him the first cyborg officially recognized by a government. He now plans to extend and improve this cybernetic synesthesia by having the Eyeborg permanently surgically attached to his skull.

Getting a medical team to help him was no easy task. “Their position was that ‘doctors usually repair or fix humans’ and that my operation was not about fixing nor repairing myself but about creating a new sense: the perception of visual elements via bone-conducted sounds,” Harbisson told me by email. “The other main issue was that the operation would allow me to perceive outside the ability of human vision and human hearing (hearing via the bone allows you to hear a wider range of sounds, from infrasounds to ultrasounds, and some lenses can detect ultraviolets and infrareds). It took me over a year to convince them.”

In the end, the bio-ethical community still relies on promises of medical need to justify cybernetic enhancement. “I think I convinced them when I told them that this kind of operation could help ‘fix and repair’ blind people. If you use a different type of chip, a chip that translates words into sound, or distances into sound, for instance, the same electronic eye implant could be used to read or to detect obstacles which could mean the end of Braille and sticks. I guess hospitals and governments will soon start publishing their own laws about which kind of cybernetic implants they find are ethical/legal and which ones they find are not.”

PART.02


I had Lepht Anonym in the back of my mind as I stretched my arm out on the operating table at Hot Rod Piercing. The fingertip is an excellent place for a magnet because it is full of sensitive nerve tissue, fertile ground for your nascent sixth sense to pick up on the electro-magnetic fields all around us. It is also an exceptionally painful spot to have sliced open with a scalpel, especially when no painkillers are available. The experience ranked alongside breaking my arm and having my appendix removed, a level of pain that opens your mind to parts of your body which before you were not conscious of.

For the first few days after the surgery, it was difficult to separate out my newly implanted sense from the bits of pain and sensation created by the trauma of having the magnet jammed in my finger. Certain things were clear: microwave ovens gave off a steady field that was easy to perceive, like a pulsating wave of invisible water, or air heavy from heat coming off a fan. And other magnets, of course, were easy to identify. They lurked like landmines in everyday objects — my earbuds, my messenger bag — sending my finger ringing with a deep, sort of probing force field that shifted around in my flesh.

High-tension wires seemed to give off a sort of pulsating current, but it was often hard to tell, since my finger often began throbbing for no reason, as it healed from the trauma of surgery. Playing with strong, stand-alone magnets was a game of chicken. The party trick of making one leap across a table towards my finger was thrilling, but the awful squirming it caused inside my flesh made me regret it hours later. Grasping a colleague’s stylus too near the magnetic tip put a sort of freezing probe into my finger that I thought about for days afterwards.

Within a few weeks, the sensation began to fade. I noticed fewer and fewer instances of a sixth sense, beyond other magnets, which were quite obvious. I was glad that the implant didn’t interfere with my life, or prevent me from exercising, but I also grew a bit disenchanted, after all the hype and excitement the grinders I interviewed had shared about their newfound way of interacting with the world.

HISTORY.02

If Lepht Anonym is the cautionary tale, Prof. Kevin Warwick is the one bringing academic respectability to cybernetics. He was one of the first to experiment with implants, putting an RFID chip into his body back in 1998, and has also taken the techniques the farthest. In 2002, Prof. Warwick had cybernetic sensors implanted into the nerves of his arm. Unlike the grinders in Pittsburgh, he had the benefits of anesthesia and a full medical team, but he was still putting himself at great risk, as there was no research on the long-term effects of having these devices grafted onto his nervous system. “In a way that is what I like most about this,” he told me. “From an academic standpoint, it’s wide-open territory.”

I chatted with Warwick from his office at The University of Reading, stacked floor to ceiling with books and papers. He has light brown hair that falls over his forehead and an easy laugh. With his long sleeve shirt on, you would never know that his arm is full of complex machinery. The unit allows Warwick to manipulate a robot hand, a mirror of his own fingers and flesh. What’s more, the impulse could flow both ways. Warwick’s wife, Irena, had a simpler cybernetic implant done on herself. When someone grasped her hand, Prof. Warwick was able to experience the same sensation in his hand, from across the Atlantic. It was, Warwick writes, a sort of cybernetic telepathy, or empathy, in which his nerves were made to feel what she felt, via bits of data travelling over the internet.

The work was hailed by the mainstream media as a major step forward in helping amputees and victims of paralysis to regain a full range of abilities. But Prof. Warwick says that misses the point. “I quite like the fact that new medical therapies could potentially come out of this work, but what I am really interested in is not getting people back to normal; it’s enhancement of fully functioning humans to a higher level.”

It’s a sentiment that can take some getting used to. “A decade ago, if you talked about human enhancement, you upset quite a lot of people. Unless the end goal was helping the disabled, people really were not open to it.” With the advent of smartphones, says Prof. Warwick, all that has changed. “Normal folks really see the value of ubiquitous technology. In fact the social element has almost created the reverse. Now, you must be connected all the time.”

While he is an accomplished academic, Prof. Warwick has embraced biohackers and grinders as fellow travelers on the road to exploring our cybernetic future. “A lot of the time, when it comes to putting magnets into your body or RFID chips, there is more information on YouTube than in the peer-reviewed journals. There are artists and geeks pushing the boundaries, sharing information, a very renegade thing. My job is to take that, and apply some more rigorous scientific analysis.”

To that end, Prof. Warwick and one of his PhD students, Ian Harrison, are beginning a series of studies on biohackers with magnetic implants. “When it comes to sticking sensors into your nerve endings, so much is subjective,” says Harrison. “What one person feels, another may not. So we are trying to establish some baselines for future research.”

The end goal for Prof. Warwick, as it was for the team at Grindhouse Wetwares in Pittsburgh, is still the stuff of science fiction. “When it comes to communication, humans are still so far behind what computers are capable of,” Prof. Warwick explained. “Bringing about brain to brain communication is something I hope to achieve in my lifetime.”

For Warwick, this will advance not just the human body and the field of cybernetics, but allow for a more practical evaluation of the entire canon of Western thought. “I would like to ask the questions that the philosopher Ludwig Wittgenstein asked, but in practice, not in theory.” It would be another attempt to study the mind, from inside and out, as Wittgenstein proposed, but with access to objective data. “Perhaps he was bang on, or maybe we will rubbish his whole career, but either way, it’s something we should figure out.”

As the limits of space exploration become increasingly clear, a generation of scientists who might once have turned to the stars are seeking to expand humanity’s horizons much closer to home. “Jamming stuff into your body, merging machines with your nerves and brain, it’s brand new,” said Warwick. “It’s like this last, unexplored continent staring us in the face.”

On a hot day in mid-July, I went for a walk around Manhattan with Dann Berg, who had a magnet implanted in his pinky three years earlier. I told him I was a little disappointed how rarely I noticed anything with my implant. “Actually, your experience is pretty common,” he told me. “I didn’t feel much for the first 6 months, as the nerves were healing from surgery. It took a long time for me to gain this kind of ambient awareness.”

Berg worked for a while in a piercing and tattoo studio, which brought him into contact with the body modification community and its experiments with implants. At the same time, he was teaching himself to code and finding work as a front-end developer building websites. “To me, these two things, the implant and the programming, they are both about finding new ways to see and experience the world.”

Berg took me to an intersection at Broadway and Bleecker. In the middle of the crosswalk, he stopped, and began moving his hand over a metal grate. “You feel that?” he asked. “It’s a dome, right here, about a foot off the ground, that just sets my finger off. Somewhere down there, part of the subway system or the power grid is working. We’re touching something other people can’t see; they don’t know it exists. That’s amazing to me.” People passing by gave us odd stares as Berg and I stood next to each other in the street, waving our hands around inside an invisible field, like mystics groping blindly for a ghost.

CYBORGS IN SOCIETY

Last month, a Canadian professor named Steve Mann was eating at a McDonald’s with his family. Mann wears a pair of computerized glasses at all times, similar to Google’s Project Glass. One of the employees asked him to take them off. When he refused, Mann says, an employee tried to rip the glasses off, an alleged attack made more brutal because the device is permanently attached and does not come off his skull without special tools.

On biohacking websites and transhumanist forums, the event was a warning sign of the battle to come. Some dubbed it the first hate crime against cyborgs. That would imply the employees knew Mann’s device was part of him, which is still largely unclear. But it was certainly a harbinger of the friction that will emerge between people whose bodies contain powerful machines and society at large.

PART.03

After zapping my brain with a few dozen volts, the boys from Grindhouse Wetwares offered to cook me dinner. Cannon popped a tray of mashed potatoes in the microwave and showed me where he put his finger to feel the electromagnetic waves streaming off. We stepped out onto the back porch and let his three little puggles run wild. The sound of cars passing on the nearby highway and the crickets warming up for sunset relaxed everyone. I asked what they thought the potential was for biohacking to become part of the mainstream.

“That’s the thing, it’s not that much of a leap,” said Cannon. “We’ve had pacemakers since the ’70s.” Brain implants are now being used to treat Parkinson’s disease and depression. Scientists hope that brain implants might soon restore mobility to paralyzed limbs. The crucial difference is that grinders are pursuing this technology for human enhancement, without any medical need. “How is this any different than plastic surgery, which like half the fucking country gets?” asked Cannon. “Look, you know the military is already working on stuff like this, right? And it won’t be too long before the corporations start following suit.”

Sarver joined the Air Force just weeks after 9/11. “I was a dyed-in-the-wool Roman Catholic Republican. I wasn’t thinking about the military, but after 9/11, I just believed the dogma.” In place of college, he got an education in electronics repairing fighter jets and attack helicopters. He left the war a very different man. “There were no terrorists in Iraq. We were the terrorists. These were scared people, already scared of their own government.”

Yet, while he rejected the conflict in the Middle East, Sarver’s time in the military gave him a new perspective on the human body. “I’ve been in the special forces,” said Sarver. “I know what the limits of the human body are like. Once you’ve seen the capabilities of a 5000psi hydraulic system, it’s no comparison.”

The boys from Grindhouse Wetwares both sucked down Parliament menthols the whole time we talked. There was no irony for them in dreaming of the possibilities for one’s body and willfully destroying it. “For me, the end game is my brain and spinal column in a jar, and a robot body out in the world doing my bidding,” said Sarver. “I would really prefer not to have to rely on an inefficient four-valve pump that sends liquid through these fragile hoses. Fuck cheetahs. I want to punch through walls.”

Flesh and blood are easily shed in grinder circles, at least theoretically speaking. “People recoil from the idea of tampering inside the body,” said Tim. “I am lost when it comes to people’s unhealthy connections to your body. This is just a decaying lump of flesh that gets old, it’s leaking fluid all the time, it’s obscene to think this is me. I am my ideas and the sum of my experiences.” As far as the biohackers are concerned, we are the best argument against intelligent design.

Neither man has any illusions about how fringe biohacking is now. But technology marches on. “People say nobody is going to want to get surgery for this stuff,” admits Cannon. But he believes that will change. “They will or they will be left behind. They have no choice. It’s going to be weird and uncomfortable and scary. But you can do that, or you can become obsolete.”

We came back into the kitchen for dinner. As I wolfed down steak and potatoes, Cannon broke into a nervous grin. “I want to show you something. It’s not quite ready, but this is what we’re working on.” He disappeared down into the basement lab and returned with a small device the size of a cigarette lighter, a simple circuit board with a display attached. This was the HELEDD, the next step in the Grindhouse Wetwares plan to unite man and machine. “This is just a prototype, but when we get it small enough, the idea is to have this beneath my skin,” he said, holding it up against his inner forearm.

The smartphone in your pocket would act as the brain for this implant, communicating via bluetooth with the HELEDD, which would use a series of LED lights to display the time, a text message, or the user’s heart rate. “We’re looking to get sensors in there for the big three,” said Tim. “Heart rate, body temperature, and blood pressure. Because then you are looking at this incredible data. Most people don’t know the effect on a man’s heart when he finds out his wife is cheating on him.”
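In the division of labor described here, the phone does the computing and the implant only displays, so the protocol between them can be tiny. A hedged sketch of what such a message and display loop might look like, with an entirely hypothetical packet format and a print call standing in for the LED array:

import json

# Hypothetical phone-side packet: the smartphone decides what the
# implant should show and sends a small JSON message over Bluetooth.
packet = json.dumps({"mode": "vitals", "heart_rate": 72,
                     "body_temp_c": 36.8, "bp": "118/76"})

def render(text):
    # Stand-in for driving the LED array beneath the skin.
    print(f"[LED] {text}")

def handle_packet(raw):
    msg = json.loads(raw)
    if msg["mode"] == "time":
        render(msg["clock"])
    elif msg["mode"] == "text":
        render(msg["body"])
    elif msg["mode"] == "vitals":
        render(f"{msg['heart_rate']} bpm {msg['body_temp_c']}°C {msg['bp']}")

handle_packet(packet)   # [LED] 72 bpm 36.8°C 118/76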

Cannon hopes to have the operation in the next few months. A big part of what drives the duo to move so fast is the idea that there is no hierarchy established in this space. “We want to be doing this before the FDA gets involved and starts telling us what we can and cannot do. Someday this will be commercially feasible and Apple will design an implant which will sync with your phone, but that is not going to be for us. We like to open things up and break them.”

I point out that Steve Jobs may have died in large part because he was reluctant to get surgery, afraid that if doctors opened him up, they might not be able to put him back together good as new. “We’re grinders,” said Cannon. “I view it as kind of taking the pain for the people who are going to come after me. We’re paying now so that it will become socially acceptable later.”

3rdi, 2010-2011. Photographed by Wafaa Bilal. Copyright: Wafaa Bilal
Image of Prof. Kevin Warwick courtesy of Prof. Kevin Warwick
Portrait of Prof. Kevin Warwick originally shot for Time Magazine by Jim Naughten

In the Name of the Future, Rio Is Destroying Its Past (N.Y.Times)

OP-ED CONTRIBUTORS

By THERESA WILLIAMSON and MAURÍCIO HORA

Published: August 12, 2012

THE London Olympics concluded Sunday, but the battle over the next games has just begun in Rio, where protests against illegal evictions of some of the city’s poorest residents are spreading. Indeed, the Rio Olympics are poised to increase inequality in a city already famous for it.

Last month, Unesco awarded World Heritage Site status to a substantial portion of the city, an area that includes some of its hillside favelas, where more than 1.4 million of the city’s 6 million residents live. No favela can claim greater historical importance than Rio’s first — Morro da Providência — yet Olympic construction projects are threatening its future.

Providência was formed in 1897 when veterans of the bloody Canudos war in Brazil’s northeast were promised land in Rio de Janeiro, which was then the federal capital. Upon arriving, they found no such land available. After squatting in front of the Ministry of War, the soldiers were moved to a nearby hill belonging to a colonel, though they were given no title to the land. Originally named “Morro da Favela” after the spiny favela plant typical of the Canudos hills where soldiers had spent many nights, Providência grew during the early 20th century as freed slaves joined the soldiers. New European migrants came as well, as it was the only affordable way to live near work in the city’s center and port.

Overlooking the site where hundreds of thousands of African slaves first entered Brazil, Providência is part of one of the most important cultural sites in Afro-Brazilian history, where the first commercial sambas were composed, traditions like capoeira and candomblé flourished and Rio’s Quilombo Pedra do Sal was founded. Today 60 percent of its residents are Afro-Brazilian.

Over a century after its creation, Providência still bears the cultural and physical imprint of its initial residents. But now it is threatened with destruction in the name of Olympic improvements: almost a third of the community is to be razed, a move that will inevitably destabilize what’s left of it.

By mid-2013 Providência will have received 131 million reais ($65 million) in investments under a private-sector-led plan to redevelop Rio’s port area, including a cable car, funicular tram and wider roads. Previous municipal interventions to upgrade the community recognized its historical importance, but today’s projects have no such intent.

Although the city claims that investments will benefit residents, 30 percent of the community’s population has already been marked for removal and the only “public meetings” held were to warn residents of their fate. Homes are spray-painted during the day with the initials for the municipal housing secretary and an identifying number. Residents return from work to learn that their homes will be demolished, with no warning of what’s to come, or when.

A quick walk through the community reveals the appalling state of uncertainty residents are living in: at the very top of the hill, some 70 percent of homes are marked for eviction — an area supposedly set to benefit from the transportation investments being made. But the luxury cable car will transport 1,000 to 3,000 people per hour during the Olympics. It’s not residents who will benefit, but investors.

Residents of Providência are fearful. Only 36 percent of them hold documentation of their land rights, compared with 70 percent to 95 percent in other favelas. More than in other poor neighborhoods, residents are particularly unaware of their rights and terrified of losing their homes. Combine this with the city’s “divide and conquer” approach — in which residents are confronted individually to sign up for relocation, and no communitywide negotiations are permitted — and resistance is effectively squelched.

Pressure from human rights groups and the international news media has helped. But brutal evictions continue, along with new, subtler forms of removal. As part of the city’s port revitalization plan, authorities declared the “relocations” to be in the interest of residents because they live in “risky areas” where landslides might occur and because “de-densification” is required to improve quality of life.

But there is little evidence of landslide risk or dangerous overcrowding; 98 percent of Providência’s homes are made of sturdy brick and concrete and 90 percent have more than three rooms. Moreover, an important report by local engineers showed that the risk factors announced by the city were inadequately studied and inaccurate.

If Rio succeeds in disfiguring and dismantling its most historic favela, the path will be open to further destruction throughout the city’s hundreds of others. The economic, social and psychological impacts of evictions are dire: families moved into isolated units where they lose access to the enormous economic and social benefits of community cooperation, proximity to work and existing social networks — not to mention generations’ worth of investments made in their homes.

Rio is becoming a playground for the rich, and inequality breeds instability. It would be much more cost-effective to invest in urban improvements that communities help shape through a participatory democratic process. This would ultimately strengthen Rio’s economy and improve its infrastructure while also reducing inequality and empowering the city’s still marginalized Afro-Brazilian population.

Theresa Williamson, the publisher of RioOnWatch.org, founded Catalytic Communities, an advocacy group for favelas. Maurício Hora, a photographer, runs the Favelarte program in the Providência favela.

*   *   *

APRIL 2, 2012

Are the Olympics More Trouble Than They’re Worth?

Photo: Toby Melville/Reuters

Winning a bid to host the Olympics is just the beginning. As London prepares for the 2012 Games this summer, residents have plenty of doubts: Will it be too expensive? Will it disrupt life too much? In the end, will they be better off because of the Games, or just saddled with public debt and a velodrome no one knows what to do with?

What about Rio de Janeiro: Will it come out ahead, after having hosted the Pan American Games in 2007, the World Cup in 2014 and the Olympics in 2016?

READ THE DISCUSSION »

DEBATERS

Neil Jameson

The Games Help Londoners

NEIL JAMESON, LEAD ORGANIZER, LONDON CITIZENS

This is the world’s first “Living Wage Olympics,” and East London residents will reap the rewards.

Julian Cheyne

The Games Hurt Londoners

JULIAN CHEYNE, EVICTED RESIDENT, EAST LONDON

The Olympics are an expensive distraction that sets dangerous precedents, coddling the elite and trampling the poor.

Theresa Williamson

A Missed Opportunity in Rio

THERESA WILLIAMSON, FOUNDER, CATALYTIC COMMUNITIES

In preparing for the World Cup and the Olympics, Rio could make long-term investments and integrate the favelas. Instead it is aggravating its problems.

Bruno Reis

Brazil Can Come Out Ahead

BRUNO REIS, RISK ANALYST IN BRAZIL

These Games represent a golden opportunity, but will Rio de Janeiro repeat the success of Barcelona or the failure of Athens?

Andrew Zimbalist

Venues as an Asset or an Albatross

ANDREW ZIMBALIST, ECONOMIST, SMITH COLLEGE

Olympics planning takes place in a frenzied atmosphere — not optimal conditions for contemplating the future shape of an urban landscape.

Mitchell L. Moss

New York Is Lucky Not to Have the Games

MITCHELL L. MOSS, NEW YORK UNIVERSITY

London will be a morass this summer. Meanwhile, there has never been a better time to visit New York City.

Ações afirmativas e sistema de cotas nas universidades brasileiras

Mais um passo na luta pela democratização efetiva do Ensino Superior

dhescbrasil.org.br

10 de agosto de 2012

Em 07 de agosto de 2012 o Senado Federal aprovou um projeto que tramitava a cerca de  uma década no Congresso, instituindo a reserva de 50% das vagas das universidades e institutos tecnológicos federais para estudantes que cursaram o ensino médio em escola pública.

Além disso, a lei prevê que, destas vagas, metade serão destinadas a estudantes com renda familiar per capita até um salário mínimo e meio. Também prevê que em cada estado serão destinadas vagas para pretos, pardos e indígenas, respeitando o percentual destes grupos nos estados, de acordo com os dados do IBGE.

Tais medidas visam atender a demandas históricas de ativistas que lutam pelo direito à educação e também pela democratização efetiva do ensino superior no país. Como sabemos historicamente o sistema universitário brasileiro se desenvolveu de forma restrita em termos de número de vagas e também de grupos atendidos. O ensino superior foi pensado durante muito tempo como um sistema para poucos e, com frequência, para aqueles que conseguiram se preparar para competir por uma vaga num quadro altamente competitivo.

Ao longo dos anos 1990 e principalmente dos anos 2000 ampliou-se o consenso entre diferentes setores da sociedade brasileira sobre a enorme desigualdade no acesso ao ensino superior no Brasil, expresso no paradoxo conhecido de que entre os estudantes das universidades públicas predominam os estudantes que freqüentaram escolas particulares no ensino básico, sendo o inverso também verdadeiro.

Observou-se também que os jovens brasileiros que chegavam ao ensino superior eram predominantemente de classe média e de classe alta e em sua maioria brancos, deixando de fora desta possibilidade, portanto, um grande contingente de jovens pobres, pretos, pardos e indígenas.

Em face de esta exclusão educacional, entidades não governamentais e movimentos sociais se mobilizaram para oferecer oportunidades de formação complementar para os jovens pobres, pretos, pardos e indígenas aumentarem suas chances de ingresso. Universidades, prefeituras, empresas e igrejas também se engajaram nestas iniciativas, levando a resultados relevantes em termos de aprovação destes estudantes em exames de seleção.

Também órgãos governamentais passaram a desenvolver políticas para ampliar o acesso ao ensino superior de grupos historicamente excluídos, tais como a reserva de vagas em  universidades públicas, a criação do Programa Universidade para Todos (PROUNI), destinado a fornecer bolsas de estudo em instituições privadas de ensino superior e a  ampliação do investimento em universidades federais visando o aumento da oferta de cursos e vagas.

Em 2012 é possível afirmar que estas medidas produziram efeitos positivos no que diz respeito à ampliação do acesso ao ensino superior de jovens de grupos excluídos. Entretanto, ainda permanece uma distância entre o número de jovens que concluem o ensino médio em escola pública e os que conseguem ingressar em instituição pública de ensino superior. Também ainda é desproporcional o número de estudantes negros e indígenas que chegam ao ensino superior, em comparação com sua proporção na população.

A lei aprovada pelo Senado vem justamente ampliar de forma substantiva estas oportunidades, levando a um compromisso das instituições federais de ensino superior e técnico com esta expansão. A lei também traz um importante compromisso com a igualdade racial, através da formalização do compromisso de ampliação do ingresso de estudantes negros e indígenas em proporções definidas segundo sua representação na população de cada estado da federação.

Num país que, até recentemente, tinha dificuldades em aceitar a desigualdade racial presente na sociedade, a aprovação desta lei reveste-se de grande importância, pois permite que se avance na efetiva democratização de oportunidades de ingresso no ensino superior.

Cabe-nos, agora, perguntar? Todos os problemas se resolvem com esta medida? Obviamente não. Na verdade a aprovação desta lei traz desafios importantes, como a ampliação e consolidação de permanência de estudantes de menor renda no ensino superior, através de um efetivo e eficaz programa de assistência estudantil. Também traz o desafio de continuar ampliando as oportunidades para que milhões de jovens pobres, negros e indígenas possam ter acesso e completar com sucesso o ensino médio, a fim de que possam participar da seleção de ingresso ao ensino superior.

Democratization measures such as those contained in this new law are important milestones on the long road to realizing the right to education in Brazil. We hope that, once the presidency signs the law, we can inaugurate a new moment in the country’s educational policies, with broader access, more democratic opportunities to remain in higher education, and the pursuit of greater equality at all levels. The road is long, but with this law a great step will be taken.

Rosana Heringer
Rapporteur on the Human Right to Education

*   *   *

INCLUSION IN HIGHER EDUCATION: RACE OR INCOME?

João Feres Júnior*

Grupo Estratégico de Análise da Educação Superior no Brasil – FLACSO Brasil

The unanimous decision of Brazil’s Supreme Federal Court (Supremo Tribunal Federal) on April 26, 2012, declaring the constitutionality of the ethnic-racial quota system for admitting students to higher education had, among its many positive consequences, the virtue of opening the way for a deeper debate on inclusion through access to higher education. We have thus moved from a context in which the debate was predominantly normative, concerned mainly with the legality and constitutionality of ethnic-racial affirmative action, to a new context in which the concrete discussion of the mechanisms and criteria adopted by inclusion policies comes to the fore.

Beyond its moral soundness, the Supreme Court’s decision is consonant with numerous analyses based on solid statistical data, carried out from the late 1970s to the present, which show the relevance of both the class variable and the race variable in the reproduction of inequality in Brazil. This fact leads us to intuit that the use of both variables in inclusion policies is advisable. That intuition is generally correct, but we must not forget that the distance from the sociological analysis of population data to the design of public policy is great and cannot be covered without mediations: identifying target publics, adopting categories, creating rules, setting objectives, evaluating results and so on.

In addressing the question of selection criteria, a historical caveat is first in order. The media debate on affirmative action focuses almost exclusively on its ethnic-racial form. Yet the most frequent form of affirmative action adopted by Brazilian public universities today has students from public schools as its beneficiaries: 61 out of a total of 98 institutions, whereas only 40 have policies for black students (or pretos and pardos, in census terms).

Nor is that all: the process that created these inclusion policies in Brazilian higher education (today 72% of Brazilian public universities have some form of affirmative action) cannot be narrated without mentioning the leading role of the Movimento Negro and its sympathizers in pressing the demand for inclusion upon universities all over Brazil. Pressured by these sectors of organized civil society, the universities reacted, each in its own way: very rarely creating quotas solely for black students (4 cases), often creating quotas for black students and public school students (31), and in most cases creating quotas for public school students alone. There was, on the other hand, no independent movement for the inclusion of poor students in higher education. In short, were it not for the demand for the inclusion of black students, the debate on the role of the university in democratic Brazil would certainly be far less advanced.

The most important point, however, is to understand that the mediations between sociological knowledge and public policy must be governed by a pragmatist spirit that follows this method: starting from a basic agreement about the situation and the objectives, we establish mediating actions to implement a policy and then observe its results. Systematic (rather than impressionistic) observation of the results is essential so that we can adjust the mediating actions to reach our objectives, or even revise the objectives or our reading of the situation. Without this spirit it is difficult to proceed in a progressive manner on any matter involving concrete intervention in reality.

Thus, even though we know that both variables, class and race, should be the object of inclusion policies, there is no ideal plan for applying them. Should they be kept separate (quotas for black students and quotas for public school students) or combined (quotas that accept only candidates who meet both criteria)? The fact is that very few universities adopt the first option, while 36 of the 40 public universities with affirmative action for black students combine it with some class criterion, whether public schooling or income.

There is also another important question: should the class variable be operationalized through an income criterion or through public schooling? In the aggregate, universities have preferred “public school” (30 of the 40), since it is more effective than an income declaration for gauging an applicant’s social class: people with informal income could easily game the procedure. Even so, 6 universities, among them the state universities of Rio de Janeiro, pioneering examples of affirmative action in the country, adopt the income criterion. In the case of the Rio de Janeiro state universities, the programs that began in 2003 had quotas for public school students separate from quotas for “negros e pardos” (sic), but in 2005 the law was changed to superimpose an income ceiling on the racial quota.

Accounts from people who took part in the debate that led to this change indicate that the issue’s exposure in the media, strongly biased against such policies, led decision makers to try to shield themselves from the argument that affirmative action would benefit only the black middle class. Whatever the cause of the change, the method suggested above directs us to look at its consequences. Data from UENF (Universidade Estadual do Norte Fluminense Darcy Ribeiro) show that in the years the old system was in force, 2003 and 2004, respectively 40 and 60 non-white students were admitted, roughly 11% of all entrants. The superimposition of criteria that took effect the following year cut that number to 19. The average number of non-white students admitted under the new regime from 2005 to 2009 is even lower, 13, a meager 3% of all entrants.

Conclusion: a policy that was producing results was rendered practically irrelevant by the adoption of criteria that on paper seem fair, or adequate, or politically strategic. Yet the result should be the fundamental part. The example bears out our point that there are no magic recipes. If that is true, then experimentation is necessary. But one crucial element is still missing from this equation. To evaluate the results of experimentation, universities with inclusion programs must make their data public, and with very rare exceptions this has not happened. Without solid evaluations of these policies, we run the risk of remaining forever in the realm of conjecture and anecdote, and thus of failing to achieve the larger objective of these initiatives, which is to democratize access to higher education in Brazil.

Rio de Janeiro, June 2012

This text is the author’s contribution to the Grupo Estratégico de Análise da Educação Superior (GEA-ES) project, carried out by FLACSO-Brasil with support from the Ford Foundation.

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example; the recent discovery of the Higgs boson will serve. Facts were uncertain (they always are, to a degree); no values were in conflict; the stakes were not high; and immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict, no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That does not mean, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure, much of their behavior remains the same. They formulate theories; they collect data; and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science, the faith that other people are asking questions because they actually want the answer, is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: does this person really want the answer, or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict, the behavior of those doing science changes. In normal science no one would care if Higgs was a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high, the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides; and, worst of all, they perform horrendous raps on YouTube. In short, they become human, while those around them canonize them or demonize them, and their findings become icons or are branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This is perhaps the most contentious aspect of PNS; in fact, I would argue it is the defining characteristic. In all PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows. When you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by escalating their demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society changes. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance. Nobody rushes the report to keep the collider running longer than it should, and if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails it’s clear that the behavior of those doing science is not what one would call disinterested, patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters but rather to revisit one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockey Stick Illusion, the actions taken by those doing science around the “Jesus paper” are instructive. In fact, were I to rewrite The Crutape Letters I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency and letting truth come in its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: data hoarding is defended using IP and confidentiality agreements; demanding private mail is defended using values imported from the conduct of public business. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKitrick had a paper published at the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play: papers had to be accepted by a date certain. At one point Stephen Schneider suggested the creation of a new category, a novelty (“provisionally accepted”), so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken, and these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory is explained here. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required. A deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed claim: it will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline (as in the case of the Jesus paper) or the rush to be ready for Congress (as in the Watts case) is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; when it does, quality can suffer. And yet the demand for more certainty than is needed, the bad faith game of delaying action by asking questions, precludes a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is the recognition that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We see that in climategate; we see that in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider whether truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while certainty about the science has been the dominant piece of the rhetoric, there has been a second thread of rhetoric that bases action on the uncertainty about sensitivity. I would call this certainty shifting. While uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Scientists struggle with limits – and risks – of advocacy (eenews.net)

Monday, July 9, 2012

Paul Voosen, E&E reporter

Jon Krosnick has seen the frustration etched into the faces of climate scientists.

For 15 years, Krosnick has charted the rising public belief in global warming. Yet, as the field’s implications became clearer, action has remained elusive. Science seemed to hit the limits of its influence. It is a result that has prompted some researchers to cross their world’s no man’s land — from advice to activism.

As Krosnick has watched climate scientists call for government action, he began pondering a recent small dip in the public’s belief. And he wondered: Could researchers’ move into the political world be undermining their scientific message?

Jon Krosnick
Stanford’s Jon Krosnick has been studying the public’s belief in climate change for 15 years, but only recently did he decide to probe their reaction to scientists’ advocacy. Photo courtesy of Jon Krosnick.

“What if a message involves two different topics, one trustworthy and one not trustworthy?” said Krosnick, a communication and psychology professor at Stanford University. “Can the general public detect crossing that line?”

His results, not yet published, would seem to say they can.

Using a national survey, Krosnick has found that, among low-income and low-education respondents, climate scientists suffered damage to their trustworthiness and credibility when they veered from describing science into calling viewers to ask the government to halt global warming. And not only did trust in the messenger fall — even the viewers’ belief in the reality of human-caused warming dropped steeply.

It is a warning that, even as the frustration of inaction mounts and the politicization of climate science deepens, researchers must be careful in getting off the political sidelines.

“The advice that comes out of this work is that all of us, when we claim to have expertise and offer opinions on matters [in the world], need to be guarded about how far we’re willing to go,” Krosnick said. Speculation, he added, “could compromise everything.”

Krosnick’s survey is just the latest social science revelation that has reordered how natural scientists understand their role in the world. Many of these lessons have stemmed from the public’s and politicians’ reactions to climate change, which has provided a case study of how science communication works and doesn’t work. Complexity, these researchers have found, does not stop at their discipline’s verge.

For decades, most members of the natural sciences held a simple belief: the public stood lost, holding out empty mental buckets for researchers to fill with knowledge, if only the researchers could get through to them. But, it turns out, not only are those buckets already full of a mix of ideology and cultural belief, it is also incredibly fraught, and perhaps ineffective, for scientists to suggest where those contents should be tossed.

It’s been a difficult lesson for researchers.

“Many of us have been saddened that the world has done so little about it,” said Richard Somerville, a meteorologist at the Scripps Institution of Oceanography and former author of the United Nations’ authoritative report on climate change.

“A lot of physical climate scientists, myself included, have in the past not been knowledgeable about what the social sciences have been saying,” he added. “People who know a lot about the science of communication … [are] on board now. But we just don’t see that reflected in the policy process.”

While not as outspoken as NASA’s James Hansen, who has taken a high-profile moral stand alongside groups like 350.org and Greenpeace, Somerville has been a leader in bringing scientists together to call for greenhouse gas reductions. He helped organize the 2007 Bali declaration, a pointed letter from more than 200 scientists urging negotiators to limit global CO2 levels well below 450 parts per million.

Such declarations, in the end, have done little, Somerville said.

“If you look at the effect this has had on the policy process, it is very, very small,” he said.

This failed influence has spurred scientists like Somerville to partner closely with social scientists, seeking to understand why their message has failed. It is an effort that received a seal of approval this spring, when the National Academy of Sciences, the nation’s premier research body, hosted a two-day meeting on the science of science communication. Many of those sessions pivoted on public views of climate change.

It’s a discussion that’s been long overdue. When it comes to how the public learns about expert opinions, assumptions mostly rule in the sciences, said Dan Kahan, a professor of law and psychology at Yale Law School.

“Scientists are filled with conjectures that are plausible about how people make sense about information,” Kahan said, “only some fraction of which [are] correct.”

Shifting dynamic

Krosnick’s work began with a simple, hypothetical scene: NASA’s Hansen, whose scientific work on climate change is widely respected, walks into the Oval Office.

As he has since the 1980s, Hansen rattles off the incontrovertible, ever-increasing evidence of human-caused climate change. It’s a stunning litany, authoritative in scope, and one the fictional president — be it a Bush or an Obama — must judge against Hansen’s scientific credentials, backed by publications and institutions of the highest order. If Hansen stops there, one might think, the case is made.

But he doesn’t stop. Hansen continues, arguing, as a citizen, for an immediate carbon tax.

“Whoa, there!” Krosnick’s president might think. “He’s crossed into my domain, and he’s out of touch with how policy works.” And if Hansen is willing to offer opinions where he lacks expertise, the president starts to wonder: “Can I trust any of his work?”

Richard Somerville
Part of Scripps’ legendary climate team — Charles David Keeling was an early mentor — Richard Somerville helped organize the 2007 Bali declaration by climate scientists, calling for government action on CO2 emissions. Photo by Sylvia Bal Somerville.

Researchers have studied the process of persuasion for 50 years, Krosnick said. Over that time, a few vital truths have emerged, including that trust in a source matters. But looking back over past work, Krosnick found no answer to this question. The treatment was simplistic. Messengers were either trustworthy or not. No one had considered the case of two messages, one trusted and one shaky, from the same person.

The advocacy of climate scientists provided an excellent path into this shifting dynamic.

Krosnick’s team hunted down video of climate scientists first discussing the science of climate change and then, in the same interview, calling for viewers to pressure the government to act on global warming. (Out of fears of bruised feelings, Krosnick won’t disclose the specific scientists cited.) They cut the video in two edits: one showing only the science, and one showing the science and then the call to arms.

Krosnick then showed a nationally representative sample of 793 Americans one of three videos: the science-only cut, the science-plus-politics cut, or a control video about baking meatloaf (the latter being closer to politics than Krosnick might admit). The viewers were then asked a series of questions about both the scientist’s credibility and their overall beliefs on global warming.

For a cohort of 548 respondents who either had a household income under $50,000 or no more than a high school diploma, the results were stunning and statistically significant. Across the board, the move into politics undermined the science.

The viewers’ trust in the scientist dropped 16 percentage points, from 48 to 32 percent. Their belief in the scientist’s accuracy fell from 47 to 36 percent. Their overall trust in all scientists went from 60 to 52 percent. Their belief that government should “do a lot” to stop warming fell from 62 to 49 percent. And their belief that humans have caused climate change fell 14 percentage points, from 81 to 67 percent.

Krosnick is quick to note the study’s caveats. First, educated or wealthy viewers had no significant reaction to the political call and seemed able to parse the difference between science and a personal political view. The underlying reasons for the drop are far from clear, as well — it could simply be a function of climate change’s politicization. And far more testing needs to be done to see whether this applies in other contexts.

With further evidence, though, the implications could be widespread, Krosnick said.

“Is it the case that the principle might apply broadly?” he asked. “Absolutely.”

‘Fraught with misadventure’

Krosnick’s study is likely rigorous and useful — he is known for his careful methods — but it still carries with it a simple, possibly misleading frame, several scientists said.

Most of all, it remains hooked to a premise that words float straight from the scientist’s lips to the public’s ears. The idea that people learn from scientists at all or that they are simply misunderstanding scientific conclusions is not how reality works, Yale’s Kahan said.

“The thing that goes into the ear is fraught with misadventure,” he said.

Kahan has been at the forefront of charting how the empty-bucket theory of science communication — called the deficit model — fails. People interpret new information within the context of their own cultural beliefs, peers and politics. They use their reasoning to pick the evidence that supports their views, rather than the other way around. Indeed, recent work by Kahan found that higher-educated respondents were more likely to be polarized than their less-educated peers.

Krosnick’s study will surely spur new investigations, Kahan said, though he resisted definite remarks until he could see the final work. Even if the study’s conditions aren’t fully realistic, a simple model can have “plenty of implications for all kinds of ways [in] which people become exposed to science,” he said.

The survey sits well with other research in the field and carries an implication about what role scientists should play in scientific debates, added Matthew Nisbet, a communication professor at American University.

“As soon as you start talking about a policy option, you’re presenting information that is potentially threatening to people’s values or identity,” he said. The public, he added, doesn’t “view scientists and scientific information in a vacuum.”

The deficit model has remained an enduring frame for scientists, many of whom are just becoming aware of social science work on the problem. Kahan compares it to the stages of grief. The first stage was that the truth just needs to be broadcast to change minds. The second, still influential in the scientific world, is that if the message is just simplified and the right images used, then the deficit will be filled.

“That too, I think, is a stage of misperception about how this works,” Kahan said.

Take the hand-wringing about science education that accompanied a recent poll finding that 46 percent of Americans believed in a creationist origin for humans. It’s a result that speaks to belief, not an understanding of evolution. Many of those surveyed who believed in evolution would still fail to explain natural selection, mutation or genetic variance, Kahan said, just as they don’t have to understand relativity to use their GPS.

Much of science doesn’t run up against the public’s belief systems and is accepted with little fuss. It’s not as if Louis Pasteur had to sell pasteurization by using slick images of children getting sick; for nearly all of society, it was simply a useful tool. People want to defer to the experts, as long as they don’t have to concede their beliefs on the way.

“People know what’s known without having a comprehension of why that’s the truth,” Kahan said.

There remains a danger in the emerging consensus that all scientific knowledge is filtered by the motivated reasoning of political and cultural ideology, Nisbet added. Not all people can be sorted by two, or even four, variables.

“In the new ideological deficit model, we tend to assume that failures in communication are caused by conservative media and conservative psychology,” he said. “The danger in this model is that we define the public in exclusively binary terms, as liberals versus conservatives, deniers versus believers.”

‘Crossing that line’

So why do climate scientists, more than most fields, cross the line into advocacy?

Most of all, it’s because their scientific work tells them the problem is so pressing, and time dependent, given the centuries-long life span of CO2 emissions, Somerville said.

“You get to the point where the emissions are large enough that you’ve run out of options,” he said. “You can no longer limit [it]. … We may be at that point already.”

There may also be less friction for scientists to suggest communal solutions to warming because, as Nisbet’s work has found, scientists tend to skew more liberal than the general population, with more than 50 percent of one U.S. science society self-identifying as “liberal.” Given this outlook, they are more likely to accept efforts like cap and trade, a policy that, in implying a “cap” on activity, rubbed conservatives the wrong way.

Dan Kahan
A prolific law professor and psychologist at Yale, Dan Kahan has been charting how the public comes to, and understands, science. Photo courtesy of Dan Kahan.

“Not a lot of scientists would question if this is an effective policy,” Nisbet said.

It is not that scientists are unaware that they are moving into policy prescription, either. Most would intuitively know the line between their work and its political implications.

“I think many are aware when they’re crossing that line,” said Roger Pielke Jr., an environmental studies professor at the University of Colorado, Boulder, “but they’re not aware of the consequences [of] doing so.”

This willingness to cross into advocacy could also stem from the fact that it is the next logical skirmish. The battle for public opinion on the reality of human-driven climate change is already over, Pielke said, “and it’s been won … by the people calling for action.”

While there are slight fluctuations in public belief, in general a large majority of Americans side with what scientists say about the existence and causes of climate change. It’s not unanimous, he said, but it’s larger than the numbers who supported actions like the Montreal Protocol, the bank bailout or the Iraq War.

What has shifted has been its politicization: as more Republicans have begun to disbelieve global warming, Democrats have rallied to reinforce the science. And none of it is about the actual science, of course, a fact Scripps’ Somerville now understands. The rejection is a code, expressing fear of the policies that could follow if the science is accepted.

Doubters of warming don’t just hear the science. A policy is attached to it in their minds.

“Here’s a fact,” Pielke said. “And you have to change your entire lifestyle.”

For all the focus on how scientists talk to the public — whether Hansen has helped or hurt his cause — Yale’s Kahan ultimately thinks the discussion will mean very little. Ask most of the public who Hansen is, and they’ll mention something about the Muppets. It can be hard to accept, for scientists and journalists, but their efforts at communication are often of little consequence, he said.

“They’re not the primary source of information,” Kahan said.

‘A credible voice’

Like many of his peers, Somerville has suffered for his acts of advocacy.

“We all get hate email,” he said. “I’ve given congressional testimony and been denounced as an arrogant elitist hiding behind a discredited organization. Every time I’m on national news, I get a spike in ugly email. … I’ve received death threats.”

There are also pressures within the scientific community. As an elder statesman, Somerville does not have to worry about his career. But he tells young scientists to keep their heads down, working on technical papers. There is peer pressure to stay out of politics, a tension felt even by Somerville’s friend, the late Stephen Schneider, also at Stanford, who was long one of the country’s premier speakers on climate science.

He was publicly lauded, but many in the climate science community grumbled, Somerville said, that Schneider should “stop being a motormouth and start publishing technical papers.”

But there is a reason tradition has sustained the distinction between advising policymakers and picking solutions, one Krosnick’s work seems to ratify, said Michael Mann, a climatologist at Pennsylvania State University and a longtime target of climate contrarians.

“It is thoroughly appropriate, as a scientist, to discuss how our scientific understanding informs matters of policy, but … we should stop short of trying to prescribe policy,” Mann said. “This distinction is, in my view, absolutely critical.”

Somerville still supports the right of scientists to speak out as concerned citizens, as he has done, and as his friend, NASA’s Hansen, has done more stridently, protesting projects like the Keystone XL pipeline. As long as great care is taken to separate the facts from the political opinion, scientists should speak their minds.

“I don’t think being a scientist deprives you of the right to have a viewpoint,” he said.

Somerville often returns to a quote from the late Sherwood Rowland, a Nobel laureate from the University of California, Irvine, who discovered the threat chlorofluorocarbons posed to ozone: “What’s the use of having developed a science well enough to make predictions if, in the end, all we’re willing to do is stand around and wait for them to come true?”

Somerville asked Rowland several times whether the same held for global warming.

“Yes, absolutely,” he replied.

It’s an argument that Krosnick has heard from his own friends in climate science. But often this fine distinction gets lost in translation, as advocacy groups present the scientist’s personal message as the message of “science.” It’s alluring to offer advice — Krosnick feels it himself when reporters call — but restraint may need to rule.

“In order to preserve a credible voice in public dialogue,” Krosnick said, “it might be that scientists such as myself need to restrain ourselves from speaking as public citizens.”

Broader efforts of communication, beyond scientists, could still mobilize the public, Nisbet said. Leave aside the third of the population who are in denial or alarmed about climate change, he said, and figure out how to make it relevant to the ambivalent middle.

“We have yet to really do that on climate change,” he said.

Somerville is continuing his efforts to improve communication from scientists. Another Bali declaration is unlikely, though. What he’d really like to do is get trusted messengers from different moral realms beyond science — leaders like the Dalai Lama — to speak repeatedly on climate change.

It’s all Somerville can do. It would be too painful to accept the other option, that climate change is like racism, war or poverty — problems the world has never abolished.

“[It] may well be that it is a problem that is too difficult for humanity to solve,” he said.

Irony Seen Through the Eye of MRI (Science Daily)

ScienceDaily (Aug. 3, 2012) — In the cognitive sciences, the capacity to interpret the intentions of others is called “Theory of Mind” (ToM). This faculty is involved in the understanding of language, in particular by bridging the gap between the meaning of the words that make up a statement and the meaning of the statement as a whole.

In recent years, researchers have identified the neural network dedicated to ToM, but no one had yet demonstrated that this set of neurons is specifically activated by the process of understanding of an utterance. This has now been accomplished: a team from L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) has shown that the activation of the ToM neural network increases when an individual is reacting to ironic statements.

Published in Neuroimage, these findings represent an important breakthrough in the study of Theory of Mind and linguistics, shedding light on the mechanisms involved in interpersonal communication.

In our communications with others, we are constantly thinking beyond the basic meaning of words. For example, if asked, “Do you have the time?” one would not simply reply, “Yes.” The gap between what is said and what it means is the focus of a branch of linguistics called pragmatics. In this science, “Theory of Mind” (ToM) gives listeners the capacity to fill this gap. In order to decipher the meaning and intentions hidden behind what is said, even in the most casual conversation, ToM relies on a variety of verbal and non-verbal elements: the words used, their context, intonation, “body language,” etc.

Within the past 10 years, researchers in cognitive neuroscience have identified a neural network dedicated to ToM that includes specific areas of the brain: the right and left temporal parietal junctions, the medial prefrontal cortex and the precuneus. To identify this network, the researchers relied primarily on non-verbal tasks based on the observation of others’ behavior.[1] Today, researchers at L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) have established, for the first time, the link between this neural network and the processing of implicit meanings.

To identify this link, the team focused their attention on irony. An ironic statement usually means the opposite of what is said. In order to detect irony in a statement, the mechanisms of ToM must be brought into play. In their experiment, the researchers prepared 20 short narratives in two versions, one literal and one ironic. Each story contained a key sentence that, depending on the version, yielded an ironic or literal meaning. For example, in one of the stories an opera singer exclaims after a premiere, “Tonight we gave a superb performance.” Depending on whether the performance was in fact very bad or very good, the statement is or is not ironic.

The team then carried out functional magnetic resonance imaging (fMRI) analyses on 20 participants who were asked to read 18 of the stories, chosen at random, in either their ironic or literal version. The participants were not aware that the test concerned the perception of irony. The researchers had predicted that the participants’ ToM neural networks would show increased activity in reaction to the ironic sentences, and that was precisely what they observed: as each key sentence was read, the network activity was greater when the statement was ironic. This shows that this network is directly involved in the processes of understanding irony, and, more generally, in the comprehension of language.

Next, the L2C2 researchers hope to expand their research on the ToM network in order to determine, for example, whether test participants would be able to perceive irony if this network were artificially inactivated.

Note:

[1] For example, Grèzes, Frith & Passingham (J. Neuroscience, 2004) showed a series of short (3.5 second) films in which actors came into a room and lifted boxes. Some of the actors were instructed to act as though the boxes were heavier (or lighter) than they actually were. Having thus set up deceptive situations, the experimenters asked the participants to determine if they had or had not been deceived by the actors in the films. The films containing feigned actions elicited increased activity in the rTPJ (right temporal parietal junction) compared with those containing unfeigned actions.

Journal Reference:

Nicola Spotorno, Eric Koun, Jérôme Prado, Jean-Baptiste Van Der Henst, Ira A. Noveck. Neural evidence that utterance-processing entails mentalizing: The case of irony. NeuroImage, 2012; 63 (1): 25. DOI: 10.1016/j.neuroimage.2012.06.046

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
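
Purely as an illustration of the kind of shape comparison described above, here is a minimal Python sketch under stated assumptions: the series below are synthetic placeholders, not Berkeley Earth data, and the linear scale-and-offset model is the simplest possible stand-in for the project’s actual statistical methods.

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(42)
    years = np.arange(1753, 2012)

    # Synthetic stand-ins for the historical records (illustrative only).
    co2 = 280 + 120 * np.exp((years - 2011) / 80.0)                 # smooth rise to ~400 ppm
    solar = 1361 + 0.5 * np.sin(2 * np.pi * (years - 1753) / 11.0)  # 11-year cycle proxy
    population = np.exp((years - 1753) / 60.0)                      # generic rising curve

    # Fake "observed" land temperature anomaly, generated here from the CO2
    # stand-in plus noise, so the right answer is known by construction.
    temperature = 0.01 * (co2 - 280) + rng.normal(0.0, 0.15, years.size)

    def model(x, a, b):
        # Simplest shape comparison: scale and offset one series to match
        # another, then inspect what is left over.
        return a * x + b

    for name, series in [("CO2", co2), ("solar", solar), ("population", population)]:
        params, _ = curve_fit(model, series, temperature)
        rms = np.sqrt(np.mean((temperature - model(series, *params)) ** 2))
        print(f"{name:10s} RMS residual: {rms:.3f}")

On this synthetic data the CO2 stand-in wins by construction; with real records, a residual comparison of this general sort is what would let one say that a candidate explanation does or does not match the shape of the warming.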

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.

The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis are now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Science and culture: what do they have in common? (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The question was the theme of the round table “Divulgação da Ciência e da Cultura” (Communication of Science and Culture), held at the 64th Annual Meeting of the Sociedade Brasileira para o Progresso da Ciência (SBPC), which ends today (27) in São Luís.

For Ildeu de Castro Moreira, director of Popularization and Diffusion of Science and Technology at the Ministry of Science, Technology and Innovation (MCTI) and an SBPC board member, the debate over the relationship between science and art matters greatly because the two are fundamental facets of human culture. “What science, art and culture have in common is the creativity inherent in human beings,” he said, explaining that art and science are human, social activities grounded in creativity and curiosity.

A physicist and science communicator, Ildeu spoke of the “scientific imaginary present in the minds of artists,” and explained that science, too, has aesthetic concerns and bears similarities to art. For him, there is beauty in scientific theories. “Mathematical equations and physical formulas are beautiful. They may seem dull in the classroom, but with the help of an artist’s eye it is possible to show that beauty. We need to learn to see the beauty of science, just as we have to learn to see much of contemporary art,” he said.

For Ildeu, the connections between science and art are important for making science communication reach the public more easily. In his talk he showed artistic works that speak of science, with examples drawn from poetry, music, samba school themes, popular sayings and cordel literature.

Children’s audiences – In her presentation at the round table, Luisa Medeiros Massarani, a journalist and head of Fiocruz’s Museu da Vida in Rio de Janeiro, spoke about science communication initiatives aimed at children. “Experience has shown great receptiveness among children, greater than among adults and adolescents, mainly because of children’s curiosity; they are regarded as ‘natural scientists’,” she explained.

Luisa spoke about the growth of science museums in Brazil, which now number around 200, although they remain concentrated in a few regions. “Museums have incredible appeal for children and are also important for the communicator, who sees the child’s reaction on the spot,” she noted. Although children make up a large share of museum audiences, Luisa argued that spaces need to be designed specifically for them, from smaller furniture to suitable interactive activities.

She holds that children should be treated as important social actors in the science communication process. “Communicating science to children is not about talking at them one-sidedly; the child must be an important actor and protagonist in the process,” she explained, adding that the experience of a science fair or a museum visit stays in a child’s memory and can influence his or her development, as well as provoking and awakening an interest in science.

The head of the Museu da Vida cited exhibitions, books and publications aimed at children, and stressed the importance of evaluating these experiences with the children afterwards, to know which path to follow.

Ildeu took the opportunity to suggest that artists take part more actively in SBPC meetings, not only in parallel events such as SBPC Cultural but as members of panels and debates alongside the scientists. The idea is to take advantage of the meeting’s audience, which reaches 15,000 to 20,000 people, to speak about this relationship.

(Jornal da Ciência)

Anthropologists’ and sociologists’ reading of the future of the Amazon (Jornal da Ciência)

JC e-mail 4549, July 27, 2012.

The weakening of multilateral international cooperation agencies is beginning to threaten conservation policies for the Legal Amazon. The claim comes from the president of the Programa Nova Cartografia Social, Alfredo Wagner de Almeida, who gave a lecture yesterday (26) at the 64th Annual Meeting of the Sociedade Brasileira para o Progresso da Ciência (SBPC), held at the Universidade Federal do Maranhão (UFMA) in São Luís.

Under the theme “Traditional peoples and communities affected by military projects,” the anthropologist warned of moves by seven states seeking to shrink the Legal Amazon through bills now before the legislature. Among them are Mato Grosso, which intends to withdraw its territory from the Legal Amazon, and likewise Rondônia, which wants to strip that designation from its lands in the region. Other states, such as Maranhão and Tocantins, want to remove the designation from all of their areas classified as Legal Amazon.

The region covers roughly 5,217,423 km², equivalent to about 61% of Brazilian territory. It was created to define the geographic boundaries of the political region eligible for tax incentives intended to promote regional development.

“This is a first attempt to shrink the Legal Amazon, since these states no longer enjoy the benefits granted by the multilateral international agencies,” said Almeida, who is also an SBPC board member and a professor at the Universidade do Estado do Amazonas (UEA).

According to the researcher, international bodies had until then been sources of funding for Amazon protection programmes, such as the Projeto Integrado de Proteção às Populações e Terras Indígenas da Amazônia Legal (PPTAL), devoted to the demarcation of indigenous lands and funded mainly by the German government, and the PPG7 (Pilot Programme for the Protection of Brazil’s Tropical Forests). It was these policies that paved the way for the creation of the Ministry of the Environment. “Without the support of the multilateral agencies, policies for the Amazon have shrunk,” he said, without citing figures.

According to the anthropologist, for the states that want to leave the Legal Amazon the decision means freeing up more land that they consider productive, at the expense of forest conservation.

The anthropologist’s statements draw on the dossier “Amazônia: sociedade, fronteiras e políticas,” produced by Edna Maria Ramos de Castro, a sociologist at the Núcleo de Altos Estudos Amazônicos of the Universidade Federal do Pará (UFPA) and an SBPC director, who moderated the lecture. The full document was recently published in Bahia’s Caderno CRH.

Indigenous lands – In the assessment of the dossier’s author, these states’ legal manoeuvres threaten indigenous lands – whose peoples are protagonists of biodiversity conservation and depend on nature to survive. “They are legal instruments, clearly set out in the Constitution, but this practice can lead society to an impasse,” she said. Edna cited the controversial Belo Monte hydroelectric project, which has become an icon of a process of resistance within Brazilian society.

Paradigm shift – The anthropologist offered a reading of Brazil’s current political-administrative model. He sees a shift from a policy “of protection” to an “idea of protectionism.” “The distinction between protection and protectionism reveals, first of all, the weakening of the international multilateral agencies,” he said. In his view, protectionism arises outside the sphere of protection.

In Alfredo Wagner’s view, the signs of change reflect above all the disagreements at the World Trade Organization (WTO) meeting in Geneva in December 2011. On that occasion there were signs of rupture in international agreements – until then framed in terms of a common market. One example is the shelving of the so-called Doha Round, owing to disagreement among the parties over agricultural subsidies granted by developed countries.

Expansion of the military sphere and infrastructure – The anthropologist recalls that at the height of the multilateral bodies the security sector, that is, the military, was not promoted, because it had no place in a single-market policy. He observes a change from 2009 onwards, however, when the model shifted and problems with the military began to appear, as projects for militarized borders were revived. “From then on, a chapter of conflicts begins.”

Withdrawal from international funds and regulatory bodies – According to him, what stands out most in the “idea of protectionism” is the identification of strategic natural resources, such as agricultural commodities and minerals, which – under the argument of sustainable development – can be harnessed to drive large infrastructure works.

“Everything comes to be interpreted as national interest. The idea of the bloc loses force, which may explain the tensions within Mercosur itself, with Venezuela brought into the bloc at a moment of crisis. These national interests come to be articulated in a disciplined way without passing through the multilateral entities,” the anthropologist argues.

According to him, the Brazilian state’s current actions bypass the multilateral entities. One reflection of this is its distancing from the International Monetary Fund (IMF) and from two foreign legal norms. One of these is the international human rights system of the OAS (Organization of American States). He recalls that Brazil stopped investing “in that court” from the moment the Belo Monte dam was condemned by the body. “Brazil takes up a unilateral position, similar to that of the Americans in the Gulf War,” the anthropologist observes. “The idea of protectionism comes through very strongly.”

Alfredo Wagner also sees signs of distancing from ILO Convention 169, which requires prior consultation of communities harmed by large infrastructure works, for example. According to him, Brazil has been condemned for six violations involving military projects. One concerns the construction of the Centro de Lançamento de Alcântara (CLA) on quilombola communities’ land in Maranhão, without environmental licensing and without consultation of the “affected” communities.

He also warns of four worrying measures under way that provide for the emergency construction of hydroelectric dams. One example is Provisional Measure 558 of January 18, 2012, which provides for shrinking protected areas and forest conservation units under the argument of development. According to him, Ibama approved in just five days a draft terms of reference from Eletronorte for the construction of a dam at São Luiz do Tapajós. In practice, what was approved was the work plan submitted to assess the works. “With the emergency pace of these projects, rights seem to be placed in suspension.”

Constitutional challenges – The measure was challenged by the Office of the Prosecutor General through an ADIN (direct action of unconstitutionality). The Federal Public Prosecutor’s Office held that conservation units in dam areas are essential to minimizing the projects’ environmental impacts, and argued that any discussion of reducing these forest areas should take place in the National Congress rather than by provisional measure. “Brazil today lives under the empire of provisional measures, which prevent broad discussion by society. That gives an idea of authoritarian capitalism,” the anthropologist said.

Privatization of land in the Amazon – He also warns of the privatization of public lands in the Amazon under the “euphemism” of land-tenure regularization, via the Terra Legal programme created by Law 11,952 of July 2009. Submitted by the Presidency, the measure provides for privatizing 70 million hectares of public land, a considerable amount relative to the 850 million hectares that make up Brazil, according to the anthropologist. Alfredo Wagner warns of the speed with which the measure allows large holdings to obtain land titles, to the detriment of smallholders.

The measure was initially challenged by the Public Prosecutor’s Office through an ADIN on the grounds that it establishes “unjustifiable privileges” in favour of land-grabbers who benefited from public lands in the past and fostered land concentration. “This measure is as cruel as the 1969 Sarney Land Law,” the anthropologist said.

Judicialization of the state – Seeking to calm a packed audience of students, researchers, scientists and others – estimated at around 140 people – who feared a return of the military dictatorship, the anthropologist said of the current model: “It is not the same as the military dictatorship,” attributing it instead to a “judicialization of the state” and to “something strange.”

To close, the anthropologist borrowed a phrase sociologists use to explain a crisis: “The old has not yet died and the new has not yet been born. But a transformation is under way.”

(Viviane Monteiro – Jornal da Ciência)

Stop bullying the ‘soft’ sciences (L.A.Times)

OP-ED

The social sciences are just that — sciences.

By Timothy D. Wilson

July 12, 2012

A student at the UC Irvine archive doing research for her sociology dissertation. (Los Angeles Times / July 9, 2009)

Once, during a meeting at my university, a biologist mentioned that he was the only faculty member present from a science department. When I corrected him, noting that I was from the Department of Psychology, he waved his hand dismissively, as if I were a Little Leaguer telling a member of the New York Yankees that I too played baseball.

There has long been snobbery in the sciences, with the “hard” ones (physics, chemistry, biology) considering themselves to be more legitimate than the “soft” ones (psychology, sociology). It is thus no surprise that many members of the general public feel the same way. But of late, skepticism about the rigors of social science has reached absurd heights.

The U.S. House of Representatives recently voted to eliminate funding for political science research through the National Science Foundation. In the wake of that action, an opinion writer for the Washington Post suggested that the House didn’t go far enough. The NSF should not fund any research in the social sciences, wrote Charles Lane, because “unlike hypotheses in the hard sciences, hypotheses about society usually can’t be proven or disproven by experimentation.”

Lane’s comments echoed ones by Gary Gutting in the Opinionator blog of the New York Times. “While the physical sciences produce many detailed and precise predictions,” wrote Gutting, “the social sciences do not. The reason is that such predictions almost always require randomized controlled experiments, which are seldom possible when people are involved.”

This is news to me and the many other social scientists who have spent their careers doing carefully controlled experiments on human behavior, inside and outside the laboratory. What makes the criticism so galling is that those who voice it, or members of their families, have undoubtedly benefited from research in the disciplines they dismiss.

Most of us know someone who has suffered from depression and sought psychotherapy. He or she probably benefited from therapies such as cognitive behavioral therapy that have been shown to work in randomized clinical trials.

Problems such as child abuse and teenage pregnancy take a huge toll on society. Interventions developed by research psychologists, tested with the experimental method, have been found to lower the incidence of child abuse and reduce the rate of teenage pregnancies.

Ever hear of stereotype threat? It is the double jeopardy that people face when they are at risk of confirming a negative stereotype of their group. When African American students take a difficult test, for example, they are concerned not only about how well they will do but also about the possibility that performing poorly will reflect badly on their entire group. This added worry has been shown time and again, in carefully controlled experiments, to lower academic performance. But fortunately, experiments have also shown promising ways to reduce this threat. One intervention, for example, conducted in a middle school, reduced the achievement gap by 40%.

If you know someone who was unlucky enough to be arrested for a crime he didn’t commit, he may have benefited from social psychological experiments that have resulted in fairer lineups and interrogations, making it less likely that innocent people are convicted.

An often-overlooked advantage of the experimental method is that it can demonstrate what doesn’t work. Consider three popular programs that research psychologists have debunked: Critical Incident Stress Debriefing, used to prevent post-traumatic stress disorders in first responders and others who have witnessed horrific events; the D.A.R.E. anti-drug program, used in many schools throughout America; and Scared Straight programs designed to prevent at-risk teens from engaging in criminal behavior.

All three of these programs have been shown, with well-designed experimental studies, to be ineffective or, in some cases, to make matters worse. And as a result, the programs have become less popular or have changed their methods. By discovering what doesn’t work, social scientists have saved the public billions of dollars.

To be fair to the critics, social scientists have not always taken advantage of the experimental method as much as they could. Too often, for example, educational programs have been implemented widely without being adequately tested. But increasingly, educational researchers are employing better methodologies. For example, in a recent study, researchers randomly assigned teachers to a program called My Teaching Partner, which is designed to improve teaching skills, or to a control group. Students taught by the teachers who participated in the program did significantly better on achievement tests than did students taught by teachers in the control group.

Are the social sciences perfect? Of course not. Human behavior is complex, and it is not possible to conduct experiments to test all aspects of what people do or why. There are entire disciplines devoted to the experimental study of human behavior, however, in tightly controlled, ethically acceptable ways. Many people benefit from the results, including those who, in their ignorance, believe that science is limited to the study of molecules.

Timothy D. Wilson is a professor of psychology at the University of Virginia and the author of “Redirect: The Surprising New Science of Psychological Change.”

Local Weather Patterns Affect Beliefs About Global Warming (Science Daily)

People living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming. (Credit: © Rafael Ben-Ari / Fotolia)

ScienceDaily (July 25, 2012) — Local weather patterns temporarily influence people’s beliefs about evidence for global warming, according to research by political scientists at New York University and Temple University. Their study, which appears in the Journal of Politics, found that those living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming.

“Global climate change is one of the most important public policy challenges of our time, but it is a complex issue with which Americans have little direct experience,” wrote the study’s co-authors, Patrick Egan of New York University and Megan Mullin of Temple University. “As they try to make sense of this difficult issue, many people use fluctuations in local temperature to reassess their beliefs about the existence of global warming.”

Their study examined five national surveys of American adults sponsored by the Pew Research Center: June, July, and August 2006, January 2007, and April 2008. In each survey, respondents were asked the following question: “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” On average over the five surveys, 73 percent of respondents agreed that Earth is getting warmer.

Egan and Mullin wondered about variation in attitudes among the survey’s respondents, and hypothesized that local temperatures could influence perceptions. To measure the potential impact of temperature on individuals’ opinions, they looked at zip codes from respondents in the Pew surveys and matched weather data to each person surveyed at the time of each poll. They used local weather data to determine if the temperature in the location of each respondent was significantly higher or lower than normal for that area at that time of year.

Their results showed that an abnormal shift in local temperature is associated with a significant shift in beliefs about evidence for global warming. Specifically, for every three degrees Fahrenheit that local temperatures in the past week have risen above normal, Americans become one percentage point more likely to agree that there is “solid evidence” that Earth is getting warmer. The researchers found cooler-than-normal temperatures have similar effects on attitudes — but in the opposite direction.
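
Taken at face value, the reported effect is a simple linear relation, sketched below in Python. The function name and the use of the surveys’ 73% average as a default baseline are our own framing of the quoted figures, not the authors’ statistical model.

```python
# Back-of-the-envelope encoding of the reported effect size: one percentage
# point of extra agreement per 3 degrees F of local temperature anomaly over
# the past week, with the symmetric effect for cooler-than-normal weather.
def predicted_agreement(anomaly_f: float, baseline_pct: float = 73.0) -> float:
    """Predicted share agreeing there is solid evidence of warming."""
    return baseline_pct + anomaly_f / 3.0

print(predicted_agreement(9.0))   # a week 9F above normal -> 76.0
print(predicted_agreement(-6.0))  # a week 6F below normal -> 71.0
```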

The study took into account other variables that may explain the results — such as existing political attitudes and geography — and found the results still held.

The researchers also wondered if heat waves — or prolonged higher-than-normal temperatures — intensified this effect. To find out, they looked at respondents living in areas that experienced at least seven days of temperatures 10° or more above normal in the three weeks prior to interview, and compared their views with those of people who experienced the same number of hot days but did not experience a heat wave.
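
That operational definition can be written out as a short function. The sketch below is our own reading of the stated rule (at least seven days 10° or more above normal within the previous three weeks), not the authors’ actual data pipeline.

```python
def heat_wave_exposed(daily_anomalies_f: list[float]) -> bool:
    """daily_anomalies_f: departures from normal, one per day, ending at the interview."""
    last_three_weeks = daily_anomalies_f[-21:]           # three weeks prior to interview
    hot_days = sum(1 for a in last_three_weeks if a >= 10.0)
    return hot_days >= 7                                 # at least seven such days

print(heat_wave_exposed([12.0] * 8 + [0.0] * 13))   # True: eight days 12F above normal
print(heat_wave_exposed([12.0] * 3 + [2.0] * 18))   # False: only three such days
```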

Their estimates showed that the effect of a heat wave on opinion is even greater, increasing the share of Americans believing in global warming by 5.0 to 5.9 percentage points.

However, Egan and Mullin found the effects of temperature changes to be short-lived — even in the wake of heat waves. Americans who had been interviewed after 12 or more days had elapsed since a heat wave were estimated to have attitudes that were no different than those who had not been exposed to a heat wave.

“Under typical circumstances, the effects of temperature fluctuations on opinion are swiftly wiped out by new weather patterns,” they wrote. “More sustained periods of unusual weather cause attitudes to change both to a greater extent and for a longer period of time. However, even these effects eventually decay, leaving no long-term impact of weather on public opinion.”

The findings make an important contribution to the political science research on the relationship between personal experience and opinion on a larger issue, which has long been studied with varying results.

“On issues such as crime, the economy, education, health care, public infrastructure, and taxation, large shares of the public are exposed to experiences that could logically be linked to attitude formation,” the researchers wrote. “But findings from research examining how these experiences affect opinion have been mixed. Although direct experience — whether it be as a victim of crime, a worker who has lost a job or health insurance, or a parent with children in public schools — can influence attitudes, the impact of these experiences tends to be weak or nonexistent after accounting for typical predictors such as party identification and liberal-conservative ideology.”

“Our research suggests that personal experience has substantial effects on political attitudes,” Egan and Mullin concluded. “Rich discoveries await those who can explore these questions in ways that permit clean identification of these effects.”

Egan is an assistant professor in the Wilf Family Department of Politics at NYU and Mullin is an associate professor in the Department of Political Science at Temple University.

What is a carbon price and why do we need one? (The Guardian)

This Q&A is part of the Guardian’s Ultimate climate change FAQ

Grantham Research Institute and guardian.co.uk, Monday 16 July 2012 10.38 BST

A pro-carbon tax rally in Canberra, Australia, October 2011. Photograph: Alan Porritt/AFP/Getty Images

A carbon price is a cost applied to carbon pollution to encourage polluters to reduce the amount of greenhouse gas they emit into the atmosphere. Economists widely agree that introducing a carbon price is the single most effective way for countries to reduce their emissions.

Climate change is considered a market failure by economists, because it imposes huge costs and risks on future generations who will suffer the consequences of climate change, without these costs and risks normally being reflected in market prices. To overcome this market failure, they argue, we need to internalise the costs of future environmental damage by putting a price on the thing that causes it – namely carbon emissions.

A carbon price not only has the effect of encouraging lower-carbon behaviour (eg using a bike rather than driving a car), but also raises money that can be used in part to finance a clean-up of “dirty” activities (eg investment in research into fuel cells to help cars pollute less). With a carbon price in place, the costs of stopping climate change are distributed across generations rather than being borne overwhelmingly by future generations.

There are two main ways to establish a carbon price. First, a government can levy a carbon tax on the distribution, sale or use of fossil fuels, based on their carbon content. This has the effect of increasing the cost of those fuels and the goods or services created with them, encouraging business and people to switch to greener production and consumption. Typically the government will decide how to use the revenue, though in one version – the so-called fee-and-dividend model – the tax revenues are distributed in their entirety directly back to the population.

The second approach is a quota system called cap-and-trade. In this model, the total allowable emissions in a country or region are set in advance (“capped”). Permits to pollute are created for the allowable emissions budget and either allocated or auctioned to companies. The companies can trade permits between one another, introducing a market for pollution that should ensure that the carbon savings are made as cheaply as possible.
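
A toy market makes the “as cheaply as possible” claim concrete: each firm abates until its marginal abatement cost equals the permit price, so the cheapest reductions happen first. The firms, cost figures and cap below are invented for illustration.

```python
def clearing_price(baselines, cost_slopes, cap, lo=0.0, hi=1000.0):
    """Permit price at which total emissions just meet the cap.

    Firm i has a linear marginal abatement cost (slope cost_slopes[i]), so at
    a given price it abates price / cost_slopes[i] tonnes and emits the rest.
    Bisect on price until the market clears."""
    def emissions(price):
        return sum(max(b - price / s, 0.0) for b, s in zip(baselines, cost_slopes))
    for _ in range(100):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if emissions(mid) > cap else (lo, mid)
    return (lo + hi) / 2

# Three firms emitting 100 tonnes each, capped at 240 tonnes in total.
price = clearing_price([100, 100, 100], [1.0, 2.0, 4.0], cap=240)
print(f"permit price ~ {price:.1f} per tonne")   # ~34.3; the cheapest abater cuts most
```

At the clearing price the firm with the lowest abatement cost does the most abating, which is exactly the efficiency property that motivates letting permits trade.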

To serve its purpose, the carbon price set by a tax or cap-and-trade scheme must be sufficiently high to encourage polluters to change behaviour and reduce pollution in accordance with national targets. For example, the UK has a target to reduce carbon emissions by 80% by 2050, compared with 1990 levels, with various intermediate targets along the way. The government’s independent advisers, the Committee on Climate Change, estimate that a carbon price of £30 per tonne of carbon dioxide in 2020 and £70 in 2030 would be required to meet these goals.

Currently, many large UK companies pay a price for the carbon they emit through the EU’s emissions trading scheme. However, the price of carbon through the scheme is considered by many economists to be too low to help the UK to meet its targets, so the Treasury plans to make all companies covered by the scheme pay a minimum of £16 per tonne of carbon emitted from April 2013.

Ideally, there should be a uniform carbon price across the world, reflecting the fact that a tonne of carbon dioxide does the same amount of damage over time wherever it is emitted. Uniform pricing would also remove the risk that polluting businesses flee to so-called “pollution havens” – countries where a lack of environmental regulation enables them to continue to pollute unrestrained. At the moment, carbon pricing is far from uniform but a growing number of countries and regions have, or plan to have, carbon pricing schemes in place, whether through cap-and-trade or carbon taxes. These include the European Union, Australia, South Korea, South Africa, parts of China and California.

• This article was written by Alex Bowen of the Grantham Research Institute on Climate Change and the Environment at LSE in collaboration with the Guardian

European Commission backs calls for open access to scientific research (The Guardian)

Move follows announcement by UK government that it wants all taxpayer-funded research to be free to view by 2014

Reuters/guardian.co.uk, Tuesday 17 July 2012 14.41 BST

Neelie Kroes, European Commission vice-president for digital agenda, said: ‘Taxpayers should not have to pay twice for scientific research.’ Photograph: Georges Gobet/AFP/Getty Images

The European Commission, which controls one of the world’s largest science budgets, has backed calls for free access to publicly funded research in a move that could force a major change in the business model for publishers such as Reed Elsevier.

“Taxpayers should not have to pay twice for scientific research and they need seamless access to raw data,” said Neelie Kroes, European Commission vice-president for digital agenda.

The EC said on Tuesday that open access will be a “general principle” applied to grants awarded through the €80bn Horizon 2020 programme for research and innovation.

From 2014 all articles produced with funding from Horizon 2020 will have to be accessible and the goal is for 60% of European publicly funded research to be available by 2016.

The news follows the announcement by the British government that it wants all taxpayer-funded research to be free to view by 2014. David Willetts, the universities and science minister, told the Guardian: “If the taxpayer has paid for this research to happen, that work shouldn’t be put behind a paywall before a British citizen can read it.”

The most prestigious academic journals, such as Nature, Science and Cell, earn the bulk of their revenues through subscriptions from readers.

They have lucrative deals with university libraries, worth about £150m to £200m a year in the UK, to give access to the same scientists who produce and review, usually without payment, the research they publish.

Open-access journals, such as the Public Library of Science, are often internet-based and charge researchers a fee for publication, allowing free access for anyone after publication.

The open-access market has been growing rapidly over the past decade but still only accounts for about 3% of the £5.1bn global market for scholarly journals.

The subscription model has come under attack from some scientists, who argue that publishing companies are making fat profits on the back of taxpayer-funded research.

Elsevier publishes more than 2,000 journals with a staff of about 7,000. It made a profit last year of £768m on revenues of £2.1bn, giving a margin of about 37%.

Publishers argue that quality does not come cheap and their subscription charges reflect the need to maintain large editorial departments and databases of published research.

Máire Geoghegan-Quinn, European commissioner for research, innovation and science, swept this argument aside. “We must give taxpayers more bang for their buck,” she said in a statement. “Open access to scientific papers and data is an important means of achieving this.”

The commission’s move follows recent news that the European medicines regulator will open its data vaults to allow independent researchers to scrutinise results from drug companies’ trials.

“The EU’s decision to adopt a similar policy to that of the UK will mean that the transition time from subscription-based to open-access publishing will be substantially reduced,” Professor Adam Tickell, who was involved in a recent UK government-commissioned report on the issue, told Reuters.

Tickell, of the University of Birmingham, predicted a rapid and substantial reduction in the cost of subscriptions, adding: “With the support of the EU, UK government and major charities, such as the Wellcome Trust, open access to research findings will soon be a reality.”

A Century Of Weather Control (POP SCI)

Posted 7.19.12 at 6:20 pm – http://www.popsci.com


Keeping Pilots Updated, November 1930

It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spent annually for all of its work.”

About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype”–an early fax machine, basically, that let a typed message be reproduced. Pilots were then radioed with the information.

From the article “Weather Man Makes the Air Safe.”


Battling Hail, July 1947

We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)

The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.

From the article “The War Against Hail.”


Hunting for a Tornado “Cure,” March 1958

1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”

To try to stop it, researchers wanted to learn more. Meteorologists asked for $5 million more a year from Congress to be able to study tornadoes whirling through the Midwest’s Tornado Alley, then, hopefully, learn what they needed to do to stop them.

From the article “What We’re Learning About Tornadoes.”


Spotting Clouds With Nimbus, November 1963

Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was discovered by satellite two days before anything else spotted it, leaving space engineers “justifiably proud.” The next satellite in line was the Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.

Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.

From the article “The Weather Eye That Never Blinks.”


Saving Money Globally With Forecasts, November 1970

Optimism for weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters predicted–how they “saved countless lives through early hurricane warnings”–and now even saying they’d save your vacation.

What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.

From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”


Extreme Weather Alerts on the Radio, July 1979

Those weather alerts that come on your television during a storm–or at least one radio version of those–were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But at this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.

From the article “Weather-Alert Radios–They Could Save Your Life.”


Stopping “Bolts From the Blue,” May 1990

Here Popular Science let loose a whopper for anyone with a fear of extreme weather: lightning kills a lot more people every year than you think, and sometimes a lightning bolt will come and hit you even when there’s not a storm. So-called “bolts from the blue” were a part of the story on better predicting lightning, a phenomenon more manic than most types of weather. Improved sensors played a major part in better preparing people before a storm.

From the article “Predicting Deadly Lightning.”


Infrared Views of Weather, August 1983

Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”

From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”


Modernizing the National Weather Service, August 1997

A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.

From the article “Weather’s New Outlook.”


Modeling Weather With Computers, September 2001

Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.

Researchers Produce First Complete Computer Model of an Organism (Science Daily)

ScienceDaily (July 21, 2012) — In a breakthrough effort for computational biology, the world’s first complete computer model of an organism has been completed, Stanford researchers reported last week in the journal Cell.

The Covert Lab incorporated more than 1,900 experimentally observed parameters into their model of the tiny parasite Mycoplasma genitalium. (Credit: Illustration by Erik Jacobsen / Covert Lab)

A team led by Markus Covert, assistant professor of bioengineering, used data from more than 900 scientific papers to account for every molecular interaction that takes place in the life cycle of Mycoplasma genitalium, the world’s smallest free-living bacterium.

By encompassing the entirety of an organism in silico, the paper fulfills a longstanding goal for the field. Not only does the model allow researchers to address questions that aren’t practical to examine otherwise, it represents a stepping-stone toward the use of computer-aided design in bioengineering and medicine.

“This achievement demonstrates a transforming approach to answering questions about fundamental biological processes,” said James M. Anderson, director of the National Institutes of Health Division of Program Coordination, Planning and Strategic Initiatives. “Comprehensive computer models of entire cells have the potential to advance our understanding of cellular function and, ultimately, to inform new approaches for the diagnosis and treatment of disease.”

The research was partially funded by an NIH Director’s Pioneer Award from the National Institutes of Health Common Fund.

From information to understanding

Biology over the past two decades has been marked by the rise of high-throughput studies producing enormous troves of cellular information. A lack of experimental data is no longer the primary limiting factor for researchers. Instead, it’s how to make sense of what they already know.

Most biological experiments, however, still take a reductionist approach to this vast array of data: knocking out a single gene and seeing what happens.

“Many of the issues we’re interested in aren’t single-gene problems,” said Covert. “They’re the complex result of hundreds or thousands of genes interacting.”

This situation has resulted in a yawning gap between information and understanding that can only be addressed by “bringing all of that data into one place and seeing how it fits together,” according to Stanford bioengineering graduate student and co-first author Jayodita Sanghvi.

Integrative computational models clarify data sets whose sheer size would otherwise place them outside human ken.

“You don’t really understand how something works until you can reproduce it yourself,” Sanghvi said.

Small is beautiful

Mycoplasma genitalium is a humble parasitic bacterium known mainly for showing up uninvited in human urogenital and respiratory tracts. But the pathogen also has the distinction of containing the smallest genome of any free-living organism — only 525 genes, as opposed to the 4,288 of E. coli, a more traditional laboratory bacterium.

Despite the difficulty of working with this sexually transmitted parasite, the minimalism of its genome has made it the focus of several recent bioengineering efforts. Notably, these include the J. Craig Venter Institute’s 2008 synthesis of the first artificial chromosome.

“The goal hasn’t only been to understand M. genitalium better,” said co-first author and Stanford biophysics graduate student Jonathan Karr. “It’s to understand biology generally.”

Even at this small scale, the quantity of data that the Stanford researchers incorporated into the virtual cell’s code was enormous. The final model made use of more than 1,900 experimentally determined parameters.

To integrate these disparate data points into a unified machine, the researchers modeled individual biological processes as 28 separate “modules,” each governed by its own algorithm. These modules then communicated with each other after every time step, making for a unified whole that closely matched M. genitalium’s real-world behavior.
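
The modular architecture lends itself to a simple schematic. In the Python sketch below, a few invented stand-in modules each update a shared cell state once per time step and then see one another’s changes; the real model’s 28 modules, their algorithms and their parameters are of course far richer.

```python
from typing import Callable

State = dict[str, float]   # a crude stand-in for the model's shared cell state

def metabolism(s: State) -> None:
    s["atp"] += 5.0                      # produce energy

def transcription(s: State) -> None:
    if s["atp"] >= 1.0:                  # spend energy to make RNA
        s["atp"] -= 1.0
        s["rna"] += 1.0

def replication(s: State) -> None:
    s["dna"] += 0.001 * s["atp"]         # replication speed tracks resources

MODULES: list[Callable[[State], None]] = [metabolism, transcription, replication]

state: State = {"atp": 10.0, "rna": 0.0, "dna": 1.0}
for _ in range(100):                     # one iteration per simulated time step
    for module in MODULES:               # each module runs its own algorithm,
        module(state)                    # then all see the updated shared state

print(state)
```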

Probing the silicon cell

The purely computational cell opens up procedures that would be difficult to perform in an actual organism, as well as opportunities to reexamine experimental data.

In the paper, the model is used to demonstrate a number of these approaches, including detailed investigations of DNA-binding protein dynamics and the identification of new gene functions.

The program also allowed the researchers to address aspects of cell behavior that emerge from vast numbers of interacting factors.

The researchers had noticed, for instance, that the length of individual stages in the cell cycle varied from cell to cell, while the length of the overall cycle was much more consistent. Consulting the model, the researchers hypothesized that the overall cell cycle’s lack of variation was the result of a built-in negative feedback mechanism.

Cells that took longer to begin DNA replication had time to amass a large pool of free nucleotides. The actual replication step, which uses these nucleotides to form new DNA strands, then passed relatively quickly. Cells that went through the initial step quicker, on the other hand, had no nucleotide surplus. Replication ended up slowing to the rate of nucleotide production.
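
That feedback is easy to caricature in code. In the toy simulation below every rate and size is invented: cells that start replication late have amassed a nucleotide pool and replicate quickly, while early starters are throttled to the production rate, so the spread of total cycle lengths is much narrower than the spread of initiation delays.

```python
import random

def cell_cycle_length(genome: float = 1000.0, production: float = 10.0) -> float:
    """Total cycle time for one simulated cell (all units arbitrary)."""
    initiation_delay = random.uniform(10, 100)   # varies widely cell to cell
    pool = production * initiation_delay         # nucleotides amassed meanwhile
    fast_part = min(pool, genome)                # replicated quickly from the pool
    slow_part = genome - fast_part               # the rest waits on fresh production
    return initiation_delay + fast_part / 50.0 + slow_part / production

lengths = [cell_cycle_length() for _ in range(1000)]
print(f"spread of cycle lengths: {max(lengths) - min(lengths):.0f} time units")
# Initiation delays span ~90 time units, but the negative feedback compresses
# the spread of the overall cycle to roughly a fifth of that.
```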

These kinds of findings remain hypotheses until they’re confirmed by real-world experiments, but they promise to accelerate the process of scientific inquiry.

“If you use a model to guide your experiments, you’re going to discover things faster. We’ve shown that time and time again,” said Covert.

Bio-CAD

Much of the model’s future promise lies in more applied fields.

CAD — computer-aided design — has revolutionized fields from aeronautics to civil engineering by drastically reducing the trial-and-error involved in design. But our incomplete understanding of even the simplest biological systems has meant that CAD hasn’t yet found a place in bioengineering.

Computational models like that of M. genitalium could bring rational design to biology — allowing not only for computer-guided experimental regimes, but also for the wholesale creation of new microorganisms.

Once similar models have been devised for more experimentally tractable organisms, Karr envisions bacteria or yeast specifically designed to mass-produce pharmaceuticals.

Bio-CAD could also lead to enticing medical advances — especially in the field of personalized medicine. But these applications are a long way off, the researchers said.

“This is potentially the new Human Genome Project,” Karr said. “It’s going to take a really large community effort to get close to a human model.”

Stanford’s Department of Bioengineering is jointly operated by the School of Engineering and the School of Medicine.

Global CO2 Emissions Continued to Increase in 2011, With Per Capita Emissions in China Reaching European Levels (Science Daily)

ScienceDaily (July 19, 2012) — Global emissions of carbon dioxide (CO2) — the main cause of global warming — increased by 3% last year, reaching an all-time high of 34 billion tonnes in 2011. In China, the world’s most populous country, average emissions of CO2 increased by 9% to 7.2 tonnes per capita. China is now within the range of 6 to 19 tonnes per capita emissions of the major industrialised countries. In the European Union, CO2 emissions dropped by 3% to 7.5 tonnes per capita. The United States remains one of the largest emitters of CO2, with 17.3 tonnes per capita, despite a decline due to the recession in 2008-2009, high oil prices and an increased share of natural gas.

These are the main findings of the annual report ‘Trends in global CO2 emissions’, released July 19 by the European Commission’s Joint Research Centre (JRC) and the Netherlands Environmental Assessment Agency (PBL).

Based on recent results from the Emissions Database for Global Atmospheric Research (EDGAR) and latest statistics on energy use and relevant activities such as gas flaring and cement production, the report shows that global CO2 emissions continued to grow in 2011, despite reductions in OECD countries. Weak economic conditions, a mild winter, and energy savings stimulated by high oil prices led to a decrease of 3% in CO2 emissions in the European Union and of 2% in both the United States and Japan. Emissions from OECD countries now account for only one third of global CO2 emissions — the same share as that of China and India combined, where emissions increased by 9% and 6% respectively in 2011. Economic growth in China led to significant increases in fossil fuel consumption driven by construction and infrastructure expansion. The growth in cement and steel production caused China’s domestic coal consumption to increase by 9.7%.

The 3% increase in global CO2 emissions in 2011 is above the past decade’s average annual increase of 2.7%, with a decrease in 2008 and a surge of 5% in 2010. The top emitters contributing to the 34 billion tonnes of CO2 emitted globally in 2011 are: China (29%), the United States (16%), the European Union (11%), India (6%), the Russian Federation (5%) and Japan (4%).

Cumulative CO2 emissions call for action

An estimated cumulative global total of 420 billion tonnes of CO2 were emitted between 2000 and 2011 due to human activities, including deforestation. Scientific literature suggests that limiting the rise in average global temperature to 2°C above pre-industrial levels — the target internationally adopted in UN climate negotiations — is possible only if cumulative CO2 emissions in the period 2000-2050 do not exceed 1,000 to 1,500 billion tonnes. If the current global trend of increasing CO2 emissions continues, cumulative emissions will surpass this limit within the next two decades.
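
The “next two decades” claim follows from simple compounding. A rough check in Python, using the report’s round numbers against the lower 1,000-billion-tonne bound:

```python
# 420 Gt CO2 already emitted since 2000; 34 Gt emitted in 2011; assume the
# recent ~3% annual growth trend simply continues.
cumulative, annual, year = 420.0, 34.0, 2011
while cumulative < 1000.0:        # lower bound of the 2000-2050 budget
    year += 1
    annual *= 1.03
    cumulative += annual
print(year)   # mid-2020s: the budget is exhausted within the next two decades
```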

Fortunately, this trend is being mitigated by the expansion of renewable energy supplies, especially solar and wind energy and biofuels. The global share of these so-called modern renewables, which exclude hydropower, is growing at an accelerated speed and quadrupled from 1992 to 2011. This potentially represents about 0.8 billion tonnes of CO2 emissions avoided as a result of using renewable energy supplies in 2011, which is close to Germany’s total CO2 emissions in 2011.

“Trends in global CO2 emissions” report: http://edgar.jrc.ec.europa.eu/CO2REPORT2012.pdf