Category archive: science

‘Could Somebody Please Debunk This?’: Writing About Science When Even the Scientists Are Nervous (New York Times)

Milk has become a symbol for white supremacists who repurpose genetic research, because of a genetic trait known to be more common in white adults than others: the ability to digest lactose. Credit: Colum O'Dwyer/EyeEm

By Amy Harmon

Oct. 18, 2018

Times Insider delivers behind-the-scenes insights into how news, features and opinion come together at The New York Times.

N. is a black high school student in Winston-Salem, N.C., who does not appear in my article on Thursday’s front page about how human geneticists have been slow to respond to the invocation of their research by white supremacists. (Note: N.’s full name has been removed to minimize online harassment.)

But the story of how he struggled last spring to find sources to refute the claims of white classmates that people of European descent had evolved to be intellectually superior to Africans is the reason I persevered in the assignment, even when I felt as if my head were going to explode.

N. had vowed to take up the subject for a persuasive speech assignment in his Rhetoric class. Googling for information that would help him, however, yielded a slew of blogs and videos arguing the other side. “There’s only one scientific response for every hundred videos or so,” he told me when we spoke on the phone.

“Could somebody please debunk this blog post, if it can be debunked?” he finally posted on the Reddit forum r/badscience. “It’s convincing me of things I really don’t want to be convinced of.”

I was introduced to N. by Kevin Bird, a white graduate student at Michigan State University who had answered N.’s Reddit query, and others that had been flooding that forum about claims of racial differences that invoke the jargon and scientific papers of modern genetic research.

I had misgivings about simply reporting on the rise of a kind of repackaged scientific racism, which I had been tracking as a national correspondent who writes about science. Under the coded term “race realism,” it implied, falsely, that science had found a genetic basis for racial differences in traits like intelligence and behavior. Why draw attention to it?

But a series of Twitter posts from Mr. Bird late last year crystallized a question that had been on my mind. Unlike in the case of climate change, vaccines or other areas of science where scientists routinely seek to correct public misconceptions, those who study how the world’s major population groups vary genetically were largely absent from these forums. Nor was there an obvious place for someone like N. to turn for basic, up-to-date facts on human genetic diversity.

“Right now the propaganda being generated from misrepresented population genetic studies is far outpacing the modest attempts of scientists to publicly engage with the topic,” Mr. Bird had tweeted. “Why,” he asked in another tweet, “are scientists dropping the ball?”

In the course of investigating that question, I spent many hours digesting scientific papers on genetics and interviewing their authors. Some of them, I learned, subscribed to a common ethos among scientists that their job is to provide data and let society decide what to do with it. Others felt it was not productive to engage with what they regarded as a radical fringe.

It was more than a radical fringe at stake, I would tell them. Lots of nonscientists were just confused. It wasn’t just N. Mr. Bird had fielded queries from a graduate student in applied physics at Harvard and an information technology consultant in Michigan whose Twitter profile reads “anti-fascist, anti-bigot.” I talked to an Army veteran attending community college in Florida and a professional video gamer who felt ill-equipped to refute science-themed racist propaganda that they encountered online. It had come up in a source’s book group in Boston; its members wanted to invite a guest scientist to tutor them but couldn’t figure out who.

But another reason some scientists avoid engaging on this topic, I came to understand, was that they do not have definitive answers about whether there are average differences in biological traits across populations. And they have increasingly powerful tools to try to detect how natural selection may have acted differently on the genes that contribute to assorted traits in various populations.

What’s more, some believe substantial differences will be found. Others think it may not be feasible to ever entirely disentangle an immutable genetic contribution to a behavior from its specific cultural and environmental influences. Yet all of them agree that there is no evidence that any differences which may be found will line up with the prejudices of white supremacists.

As I struggled to write my article, I began, sort of, to feel their pain. With each sentence, I was striving not to give credence to racist ideas, not to misrepresent the science that exists and not to overrepresent how much science actually does exist — while trying also to write in a way that a nonscientist, like N., could understand.

It was hard. It did almost make my head explode. I tested the patience of a very patient editor. The end result, I knew, would not be perfect. But every time I was ready to give up, I thought about N. Here was a kid making a good-faith effort to learn, and the existing resources were failing him. If I could help, however incompletely — even if just to try to explain the absence of information — I felt that was a responsibility I had to meet.

A few weeks ago, as I was getting the story ready to go, I asked N. for an update. “I’ve read a lot more papers since then,” he wrote. (He aced his presentation.) “Many of my arguments are stronger, some have been discarded. I’ve also become much more aware of this stuff around me. In some ways, it’s regrettable, but in other ways, it’s satisfying knowing so much.”

Can Biology Class Reduce Racism? (New York Times)

By Amy Harmon

Dec. 7, 2019; updated 11:43 a.m. ET

COLORADO SPRINGS — Biology textbooks used in American high schools do not go near the sensitive question of whether genetics can explain why African-Americans are overrepresented as football players and why a disproportionate number of American scientists are white or Asian.

But in a study starting this month, a group of biology teachers from across the country will address it head-on. They are testing the idea that the science classroom may be the best place to provide a buffer against the unfounded genetic rationales for human difference that often become the basis for racial intolerance.

At a recent training in Colorado, the dozen teachers who had volunteered to participate in the experiment acknowledged the challenges of inserting the combustible topic of race and ancestry into straightforward lessons on the 19th-century pea-breeding experiments of Gregor Mendel and the basic function of the strands of DNA coiled in every cell.

The new approach represents a major deviation from the usual school genetics fare, which devotes little time to the extent of genetic differences across human populations, or how traits in every species are shaped by a complex mix of genes and environment.

It also challenges a prevailing belief among science educators that questions about race are best left to their counterparts in social studies.

Today’s racial categories arose long before the field of genetics and have been used to justify all manner of discriminatory policies. Race, a social concept bound up in culture and family, is not a topic of study in modern human population genetics, which typically uses concepts like “ancestry” or “population” to describe geographic genetic groupings.

But that has not stopped many Americans from believing that genes cause racial groups to have distinct skills, traits and abilities. And among some biology teachers, there has been a growing sense that avoiding any direct mention of race in their genetics curriculum may be backfiring.

“I know it’s threatening,” said Brian Donovan, a science education researcher at the nonprofit BSCS Science Learning who is leading the study. “The thing to remember is that kids are already making sense of race and biology, but with no guidance.”

Human population geneticists have long emphasized that racial disparities found in society do not in themselves indicate corresponding genetic differences. A recent paper by leading researchers in the field invokes statistical models to argue that health disparities between black and white Americans are more readily explained by environmental effects, such as racism, than by the DNA they inherited from their ancestors.

Yet there is a rising concern that genetic misconceptions are playing into divisive American attitudes about race.

In a 2018 survey of 721 students from affluent, majority-white high schools, Dr. Donovan found that one in five agreed with statements like “Members of one racial group are more ambitious than members of another racial group because of genetics.”

A similar percentage of white American adults attribute the black-white income gap to genetic differences, according to an estimate by a team of sociologists published this fall. Though rarely acknowledged in debates over affirmative action or polling responses, “belief in genetic causes of racial inequality remains widespread in the United States,” wrote Ann Morning, of New York University, and her colleagues.

For his part, Dr. Donovan has argued that grade-school biology classes may offer the only opportunity to dispel unfounded genetic explanations for racial inequality on a mass scale. Middle schools and high schools are the first, and perhaps the only, place that most Americans are taught about genetics.

The new curriculum acknowledges there are minor genetic differences between geographic populations loosely correlated to today’s racial categories. But the unit also conveys what geneticists have reiterated: People inherit their environment and culture with their genes, and it is a daunting task to disentangle them. A key part of the curriculum, Dr. Donovan said, is teaching students to “understand the limits of our knowledge.”

In the pilot study that helped Dr. Donovan secure a research grant from the National Science Foundation, students in eight classrooms exposed to a rudimentary version of the curriculum were less likely than others to endorse statements suggesting that racial groups have defining qualities that are determined by genes. The new study will measure the curriculum’s effect on such attitudes by asking students to fill out surveys before and after the unit.
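The arithmetic behind that pre/post design is straightforward. Below is a minimal sketch in Python with invented scores on a 1-to-5 agreement scale; the data and scale are hypothetical, for illustration only, not taken from Dr. Donovan’s study:

    # Hypothetical pre/post attitude scores (1 = strongly disagree, 5 = strongly agree)
    # for the same eight students on an item like "racial groups have defining
    # qualities determined by genes." The effect is the average within-student change.
    pre  = [4, 5, 3, 4, 2, 5, 4, 3]   # before the unit
    post = [3, 4, 2, 4, 2, 3, 3, 2]   # the same students, after the unit

    changes = [b - a for a, b in zip(pre, post)]
    mean_change = sum(changes) / len(changes)
    print(f"mean change: {mean_change:+.2f} points")  # negative = less endorsement of the claim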

The training exercise, which a reporter attended on the condition that names would be withheld to avoid jeopardizing the study, showed what it might take to offer students, as one Colorado teacher put it, “something better than ‘don’t worry about it, we’re 99.9 percent the same.’”

For the trainees, from five states and seven school districts, much of the opening morning was devoted to brainstorming how to check in with students, especially black students, who seem defensive or scared, sullen or silent, and how to recognize the unit’s fraught nature.

“Something like ‘These ideas are dangerous,’ and ‘How do we have a safe conversation about unsafe ideas?’” one teacher said. “But I would have to practice it so I don’t choke up like I am now.”

Before breaking for lunch, Dr. Donovan, a former middle school science teacher who studied under the Stanford population geneticist Noah Rosenberg while pursuing a science education Ph.D., had a message for them: “If you back out at the end of this,” he said, “I’ll understand.”

The lessons are structured around two fictional teenagers, Robin and Taylor, who both understand that the differences between the DNA in any two people make up about one-tenth of 1 percent of their genome. But they disagree about how those differences intersect with race.

Taylor thinks that there are genetic differences between people but that those differences are not associated with race.  

Robin thinks that the genetic differences within a racial group are small and that most genetic differences exist between people of different races.

The truth is that neither has a completely accurate view.

As human populations spread around the globe, with people living in relative isolation for millenniums, some differences emerged. But the genetic variation between groups in, say, Africa and Europe is much smaller than the variation within each group.
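To put rough numbers on both claims, here is a minimal illustrative sketch in Python. The genome length is the commonly cited approximate figure; the allele frequencies are hypothetical, chosen only to show how population geneticists apportion variation within and between groups using Wright’s F_ST:

    # What "one-tenth of 1 percent different" means in absolute terms.
    GENOME_SIZE_BP = 3.2e9   # approximate length of the human genome, in base pairs
    DIFF_FRACTION = 0.001    # ~0.1 percent of sites differ between two people
    print(f"{GENOME_SIZE_BP * DIFF_FRACTION:,.0f} differing base pairs")  # ~3,200,000

    def fst(p_a, p_b):
        """Wright's F_ST for one biallelic site and two equal-sized populations:
        the share of total heterozygosity due to between-group differences."""
        h_within = (2 * p_a * (1 - p_a) + 2 * p_b * (1 - p_b)) / 2
        p_bar = (p_a + p_b) / 2
        h_total = 2 * p_bar * (1 - p_bar)
        return (h_total - h_within) / h_total

    # Hypothetical allele frequencies at one site in two populations:
    print(f"F_ST = {fst(0.5, 0.6):.3f}")  # ~0.010: ~1 percent of variation lies between groups

Even with visibly different allele frequencies, most of the variation in this toy example sits within each group rather than between them, which is the pattern the lessons describe.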

Taylor, who had downplayed the significance of race, eventually had to admit there were some proportionally small differences between population groups. And Robin had to acknowledge having vastly overemphasized the amount of DNA differences between races.

But the two fictional teenagers still clashed over the opening question. Robin believed that there are genes for athletic or intellectual abilities, and that they are the best explanation for racial disparities in the National Football League and in the worlds of math and science. Taylor said genes had nothing to do with it.

Again, neither was completely right.

In their typical classes, the teachers said, they highlight traits driven by single genes — the texture of peas, or a disease like cystic fibrosis. It is an effective way to convey both how traits are transmitted from one generation to another, and how alterations in DNA can produce striking consequences.

But such traits are relatively rare. In Dr. Donovan’s curriculum, students are taught that thousands of variations in DNA influence a more common trait like height or IQ. Only a small fraction of the trait differences between individuals in the same ancestry group has been linked to particular genes. Unknown factors and the social and physical environment — including health, nutrition, opportunity and deliberate practice — also influence trait development. And students are given data about how racism has produced profoundly different environments for black and white Americans.

For Robin, the lessons said, grasping the complexity of it all made it impossible to argue that there was a gene, or even a few genes, specifically for athletics or intelligence, or that the cumulative effect of many genes could make a definitive difference.

And yet, on whiteboards, teachers listed comments and questions they anticipated from real students, including one that recurred in various forms.

“Isn’t this just a liberal agenda?”

Dr. Donovan told teachers that the curriculum also counters the viewpoint represented by Taylor — that ability is affected only by “how you’re raised, the opportunities you have, the choices you make and the effort you put in.” Recent studies, they are told, show that genetic variants play some role in shaping differences between individuals of the same population group.

Teachers participating in the training said that student beliefs about racial genetic differences at their schools surface in offhand pronouncements about who can dance and who is smart. They also lurk, some suggested, behind the expressions of intolerance that have recently marked many American schools. And what students learn about human genetic variation, teachers said, can lead to misguided conclusions: “They know DNA causes differences in skin color,” said a teacher from Washington State, “and they make the logical jump that DNA causes ‘race.’”

Class time in which to dispel confusion is limited. “It’s always like ‘O.K., but now we’re going to start the lesson on peas,’” said a Kansas teacher. Pent-up curiosity, said one from Indiana, routinely arises in year-end surveys: “I’m wondering if you know any resources where I could learn more about the genetics behind race,” one of her juniors wrote last spring.

Science teachers have had no shortage of reasons in recent decades to cede conversations on race to the humanities.

There was, for one thing, the need to repudiate the first half of the 20th century, during which science textbooks were replete with racial stereotypes and uncritical references to eugenics.

And 21st-century geneticists looking for clues to human evolution and medicine in the DNA of people from around the world took pains to note that they were not studying “race.”

“We basically decided, no, race is still a social construction, it’s not a biological thing,” Ken Miller, an author of the widely used Prentice Hall biology textbook, told the science magazine Undark of the decision to omit mention of race.

And not everyone is eager to reinsert it. Several school districts have rejected Dr. Donovan’s application to participate in the study, even when teachers have expressed interest.

“I am denying the research request based on the sensitive nature of the research,” the research supervisor for one Colorado district wrote in an email.

But Jaclyn Reeves-Pepin, executive director of the National Association of Biology Teachers, said efforts to avoid lending scientific credibility to unfounded perceptions of genetic difference may themselves be sowing confusion.

“If I was a student asking about race and my teacher said, ‘Race is a social construct, we’re not going to talk about it in science class,’ well — that’s not an explanation of what students are observing in their world,” Ms. Reeves-Pepin said. In advance of the group’s annual meeting this fall, a session featuring Dr. Donovan’s curriculum received the highest score from a review panel of biology teachers of all 200 submissions, she said.

As in any experiment, the subjects will need to be informed of the risks and benefits before they consent to participate.

The benefits, a group of Midwestern 12th graders who will begin the unit this month were told, include “a research-based curriculum designed to teach complex genetics.” For the risks, the students were warned that they may feel some discomfort in science class.

Hearts Beat as One in a Daring Ritual (N.Y. Times)


SPAIN: Fire-walkers carry family members or friends as they cross the coals. Credit: Dimitris Xygalatas


By Pam Belluck

Every June 23, at midnight, the villagers of San Pedro Manrique, Spain, celebrate the summer solstice by crossing a 23-foot-long carpet of oak embers that have burned for hours before sizzling down to a glowing red. The event is full of pageantry and symbolism: processions with religious statues, trumpets sounding before each fire-walk, and three virgins (or, these days, three women who are unmarried).
So when scientists wanted to measure the physiological effects of fire-walking to see if there were biological underpinnings of communal rituals, they encountered a few hurdles.
“We talked about measuring blood pressure, cortisol levels, pain tolerance,” said Ivana Konvalinka, a bioengineering doctoral student at Aarhus University in Denmark who helped lead the team. “We even talked about oxytocin,” a hormone involved in pleasure.
But with such readings difficult to obtain, they settled on heart rate, strapping monitors on fire-walkers and spectators to see whether the rates of spectators increased like those of people actually walking barefoot on hot coals.
Still, even persuading people to wear heart monitors was no easy feat. Before arriving, the research team of anthropologists, psychologists and religion experts had received permission from San Pedro Manrique’s mayor, but later he demurred, Ms. Konvalinka said.
“He said to us, if we are able to recruit people, then fine,” she said, “but he didn’t approve, and he told people not to participate.”
Some people dropped out or refused, including the people the fire-walkers carry on their backs, a group researchers considered monitoring. But others approached researchers at the last minute. Ultimately, they monitored 12 fire-walkers, 9 spectators related to fire-walkers, and 17 unrelated spectators who were just visiting. The mayor also required monitors to be concealed so they were invisible to the crowd, which filled the town’s special fire-walking amphitheater, built for 3,000 spectators, five times the number of villagers.
The researchers wanted to investigate what draws people to communal rituals like fire-walking.
“There’s the idea about rituals that they enhance group cohesion, but what creates this group?” Ms. Konvalinka said. “We figured there was some kind of autonomic nervous system measure that could capture the emotional effects of the ritual.”
The results surprised them. The heart rates of relatives and friends of the fire-walkers followed an almost identical pattern to the fire-walkers’ rates, spiking and dropping almost in synchrony. The heart rates of visiting spectators did not. The relatives’ rates synchronized throughout the event, which lasted 30 minutes, with 28 fire-walkers each making five-second walks. So relatives or friends’ heart rates matched a fire-walker’s rate before, during and after his walk. Even people related to other fire-walkers showed similar patterns.
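For readers curious what “synchrony” means operationally, one simple measure is the correlation between two aligned heart-rate series. The sketch below, in Python, uses invented beats-per-minute values and is not necessarily the statistic the Aarhus team computed:

    import math

    def pearson(x, y):
        """Pearson correlation between two equal-length series."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical samples around one five-second walk (beats per minute):
    walker   = [92, 110, 135, 150, 140, 120, 100]
    relative = [88, 105, 130, 148, 138, 118, 98]   # spikes and drops with the walker
    visitor  = [75, 74, 76, 74, 75, 76, 75]        # stays roughly flat

    print(f"walker vs. relative: r = {pearson(walker, relative):.2f}")  # close to 1
    print(f"walker vs. visitor:  r = {pearson(walker, visitor):.2f}")   # near 0

A correlation near 1 for relatives and near 0 for unrelated visitors is the qualitative pattern the study reported.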
Experts not involved in the study said despite the small number of participants, the results were intriguing. They build on research showing heart rates of fans of team sports surge when their teams score, and on studies demonstrating that people rocking in rocking chairs or tapping their fingers eventually synchronize their movements.
“It’s one study, but it’s a great study,” said Michael Richardson, an assistant professor of psychology at the University of Cincinnati. “It shows that being connected to someone is not just in the mind. There are these fundamental physiological behavioral moments that are occurring continuously with other people that we’re not aware of. There is a solid grounding of laboratory research which is completely consistent with their findings. It’s always hard to do these studies in the real world. This is the first study that has kind of done it on a big scale in a natural situation.”
Richard Sosis, an associate professor of anthropology at the University of Connecticut, said the study was “quite exciting,” contradicting the “assumption that rituals produce cohesion and solidarity only if there are shared movements, shared vocalizations or shared rhythms,” activities like singing, dancing or marching together. With fire-walking, spectators simply watched, without sharing activity or rhythm with the walkers. And different types of spectators had different results, with villagers in sync but out-of-towners not.
Dr. Sosis, co-editor of a new journal, Religion, Brain and Behavior, said there could be parallels with more common rituals, like weddings, baptisms or bar mitzvahs. He cited an experiment in which Paul Zak, a neuroeconomist, attended a wedding and measured oxytocin levels of the bride, groom and some relatives and friends, finding that several experienced surges in oxytocin as if bonding with the couple.
David Willey, a physicist at the University of Pittsburgh at Johnstown, fire-walks himself and has reasoned that fire-walking does not normally burn the feet because the embers do not transmit enough heat in their brief contact with the skin. Heart-rate synchronization makes sense, he said, based on his fire-walking parties, where “there is very much a group feeling.”
Researchers might find similar heart-rate synchronization in other high-arousal rituals like “bending rebar with your throat, walking on broken glass, bungee jumping,” he said. “They can come to my backyard if they want.”
Ms. Konvalinka said the team plans another fire-walking study, this time in Mauritius. But they may also return to San Pedro Manrique. “At the end,” she said, “I think the mayor was O.K. with us being there.”

On Birth Certificates, Climate Risk and an Inconvenient Mind (N.Y. Times, Dot Earth Blog)

April 28, 2011, 9:23 AM
By ANDREW C. REVKIN

As Donald Trump tries to milk a last bit of publicity out of the failed “birther” challenge to President Obama, it’s worth reading a fresh take by an Australian psychologist on the deep roots of denial in people with fundamentalist passions of whatever stripe. Here’s an excerpt:

[I]deology trumps facts.
And it doesn’t matter what the ideology is, whether socialism, any brand of fundamentalist religion, or free-market extremism. The psychological literature shows quite consistently that a threat to one’s worldview is more than likely met by a dismissal of facts, however strong the evidence. Indeed, the stronger the evidence, the greater the threat — and hence the greater the denial.
In its own bizarre way, then, the rising noise level of climate denial provides further evidence that global warming resulting from human CO2 emissions is indeed a fact, however inconvenient it may be. Read the rest.
The piece, published today on the Australian news blog The Drum, is by Stephan Lewandowsky of the School of Psychology at the University of Western Australia.
Of course, just being aware that ideology can deeply skew how people filter facts and respond to risks begs the question of how to make progress in the face of the wide societal divisions this pattern creates.
It’s easy to forget that there’s been plenty of climate denial to go around. It took a decade for those seeking a rising price on carbon dioxide emissions as a means to transform American and global energy norms to realize that a price sufficient to drive the change was a political impossibility.
As a new paper in the Proceedings of the National Academy of Sciences found, even when greenhouse-gas emissions caps were put in place, trade with unregulated countries simply shifted the brunt of the emissions elsewhere.
When he was Britain’s prime minister, Tony Blair put it this way in 2005: “The blunt truth about the politics of climate change is that no country will want to sacrifice its economy in order to meet this challenge.”
My choice, of course, is to attack the two-pronged energy challenge the world faces with a sustained energy quest, nudged and nurtured from the top but mainly fostered from the ground up.
And I’m aware I still suffer from a hint of “scientism,” even “rational optimism,” in expecting that this argument can catch on, but so be it.
10:11 a.m. | Updated For much more on the behavioral factors that shape the human struggle over climate policy, I encourage you to explore “Living in Denial: Climate Change, Emotions, and Everyday Life,” a new book by Kari Marie Norgaard, a sociologist who has just moved from Whitman College to the University of Oregon.
Robert Brulle of Drexel University brought the book to my attention several months ago, and I invited him to do a Dot Earth “Book Report,” to kick off a discussion of Norgaard’s insights, which emerge from years of research she conducted on climate attitudes in a rural community in western Norway. (I’d first heard of Norgaard’s research while reporting my 2007 article on behavior and climate risk.)
(I also encourage you to read the review in the journal Nature Climate Change by Mike Hulme, a professor of climate at the University of East Anglia and the author of “Why We Disagree about Climate Change.”)
Here’s Brulle’s reaction to Norgaard’s book:
As a sociologist and longtime student of human responses to environmental problems, I’ve seen reams of analysis come and go on why we get some things right and some very wrong. A new book by Kari Norgaard has done the best job yet of cutting to the core on our seeming inability to grasp and meaningfully respond to human-driven climate change.
As the science of climate change has become stronger and more dire, media coverage, public opinion, and government action regarding the issue have declined. At the same time, climate denial positions have become increasingly accepted, despite a lack of scientific evidence. Even among the public that accepts the science of global climate change, the dire circumstances we now face are consistently downplayed, and the logical implication of the scientific analysis, that swift and aggressive measures to combat climate change must be enacted, is not followed through either intellectually or politically.
Instead, at best, a series of half measures have been proposed, which, though they may be comforting, are essentially symbolic and allow the status quo to continue unchanged; they will not adequately address the issue of global climate change. Attempts to address climate change have thus encountered significant cultural, political, and economic barriers that have not been overcome. While there have been several attempts to explain the lack of meaningful action regarding climate change, these models have not developed into an integrated and empirically supported approach. Additionally, many of these models are based in an individualistic perspective, and thus engage in a form of psychological reductionism. Finally, none of these models can coherently explain the interrelated phenomena regarding climate change that are occurring at the individual, small group, institutional, and societal levels.
To move beyond the limitations of these approaches, Dr. Norgaard develops a sociological model that views the response to global climate change as a social process. One of the fundamental insights of sociology is that individuals are part of a larger structure of cultural and social interactions. Through socialization processes, we construct certain ways of life and understandings of the world that guide our everyday interactions. Individuals become the carriers of the orientations and practices that constitute our social order. A disjuncture between our taken-for-granted way of living and the new behaviors necessitated by climate change is experienced at the individual level as an identity threat, at the institutional level as a challenge to social cohesion, and at the societal level as a legitimation threat. When this occurs, powerful processes work at the psychological, institutional, and societal levels to maintain current orientations and ensure social stability. Taken together, these social processes create cultural and social stability. They also create, from the view of climate change, a form of social inertia that inhibits rapid social change.
From this sociological perspective, Dr. Norgaard takes on the apparent paradox of climate change and public awareness: as our knowledge about the nature and seriousness of climate change has increased, our political and social engagement with the issue has declined. Why? Dr. Norgaard’s answer (crudely put) is that our personality structures and social norms are so thoroughly enmeshed with a growth economy based on fossil fuels that any consideration of the need to change our way of life to deal with climate change evokes powerful emotions of anxiety and a desire to avoid the issue. This avoidance behavior is socially reinforced by collective group norms, as well as by the messages we receive from the mass media and the political elite. She develops this thesis through an impressive array of sociological theory, including the sociology of emotions, cultural sociology, and political economy. Additionally, she draws on specific theoretical approaches regarding the social denial of catastrophic risk, skillfully repurposing the literature on nuclear war and collective denial for the issue of climate change. This is a unique and insightful use of that literature, and her theoretical contribution is substantial and original. She then illustrates the process through a thick qualitative analysis based on participant observation in Norway, showing how collective denial of climate change takes place in everyday conversation. This provides powerful ground-truth evidence for her theoretical framework.
This is an extremely important intellectual contribution. Research on climate change and culture has been primarily focused on individual attitudinal change. This work brings a sociological perspective to our understanding of individual and collective responses to climate change information, and opens up a new research area. It also has important practical implications. Most climate change communication efforts are based on conveying information to individuals. The assumption is that individuals will take in this information and then act rationally in their own interests. Dr. Norgaard’s analysis charts a different course. As she demonstrates, it is not a lack of information that inhibits action on climate change. Rather, the knowledge brings about unpleasant emotions and anxiety. Individuals and communities seek to restore a sense of equilibrium and stability, and thus engage in a form of denial in which, although the basic facts of climate change are acknowledged, the logical conclusions and actions that follow from the information are minimized and not acted upon. This perspective calls for a much different approach to climate change communications, and defines a new agenda for the field.

[Note: people interested in this line of argument should follow the work done by researchers at the Center for Research on Environmental Decisions (CRED), at Columbia University, @ http://cred.columbia.edu.] 

Climategate: What Really Happened? (Mother Jones)


The Science of Why We Don’t Believe Science (Mother Jones)


Illustration: Jonathon Rosen
How our brains fool us on climate, creationism, and the vaccine-autism link.

— By Chris Mooney
Mon Apr. 18, 2011 3:00 AM PDT

“A MAN WITH A CONVICTION is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.” So wrote the celebrated Stanford University psychologist Leon Festinger, in a passage that might have been referring to climate change denial—the persistent rejection, on the part of so many Americans today, of what we know about global warming and its human causes. But it was too early for that—this was the 1950s—and Festinger was actually describing a famous case study in psychology.

Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, “Sananda,” who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.

Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin’s followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.

Festinger and his team were with the cult when the prophecy failed. First, the “boys upstairs” (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?

At first, the group struggled for an explanation. But then rationalization set in. A new message arrived, announcing that they’d all been spared at the last minute. Festinger summarized the extraterrestrials’ new pronouncement: “The little group, sitting all night long, had spread so much light that God had saved the world from destruction.” Their willingness to believe in the prophecy had saved Earth from the prophecy!

From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. “Their sense of urgency was enormous,” wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.

In the annals of denial, it doesn’t get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin’s space cult might lie on the far end of the spectrum of human self-delusion, there’s plenty to go around. And since Festinger’s day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called “motivated reasoning” helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, “death panels,” the birthplace and religion of the president, and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.

The theory of motivated reasoning builds on a key insight of modern neuroscience: Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

“We apply fight-or-flight reflexes not only to predators, but to data itself.”

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers. Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

That’s a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don’t want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn’t too emotionally invested to accept it, anyway. That’s not to suggest that we aren’t also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It’s just that we have other important goals besides accuracy—including identity affirmation and protecting one’s sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.

Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the “idols of the mind.” Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.

“Scientific evidence is highly susceptible to misinterpretation. Giving ideologues scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.”

Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that’s relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.

Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment, pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more “convincing.”

Since then, similar results have been found for how people respond to “evidence” about affirmative action, gun control, the accuracy of gay stereotypes, and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.

And it’s not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people’s deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider “scientific consensus” to lie on contested issues.

In Kahan’s research, individuals are classified, based on their cultural values, as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: “The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.” A subject was then presented with the résumé of a fake expert “depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.” The subject was then shown a book excerpt by that “expert,” in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist’s position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert.” Yet 88 percent of egalitarian communitarians accepted the same scientist’s expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study, hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)

“Head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.”

In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man’s freedom to possess a gun to defend his family) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can’t handle their guns. The study subjects weren’t “anti-science”—not in their own minds, anyway. It’s just that “science” was whatever they wanted it to be. “We’ve come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,” says Kahan.

And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.

Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually “ban” embryonic stem-cell research. Liberals weren’t particularly amenable to persuasion, either, but no backfire effect was observed.)

Another study gives some inkling of what may be going through people’s minds when they resist persuasion. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge the notion that Saddam Hussein and Al Qaeda were secretly collaborating among those most likely to believe it—Republican partisans from highly GOP-friendly counties. So the researchers set up a study in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had “said the 9/11 attacks were orchestrated between Saddam and Al Qaeda.”

“One study showed that not even Bush’s own words could change the minds of Bush voters who believed there was an Iraq-Al Qaeda link.”

As it turned out, not even Bush’s own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resisting the correction in a variety of ways, either by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those? 

Respondent: Well, I bet they say that the Commission didn’t have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the “Ground Zero mosque.” Using information from the political myth-busting site FactCheck.org, a team at Ohio State presented subjects with a detailed rebuttal to the claim that “Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer.” Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that’s difficult to answer—is how “irrational” all this is. On the one hand, it doesn’t make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. “It is quite possible to say, ‘I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'” explains Stanford social psychologist Jon Krosnick. Indeed, there’s a sense in which science denial could be considered keenly “rational.” In certain conservative communities, explains Yale’s Kahan, “People who say, ‘I think there’s something to climate change,’ that’s going to mark them out as a certain kind of person, and their life is going to go less well.”

This may help explain a curious pattern Nyhan and his colleagues found when they tried to test the fallacy that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president’s religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using “social desirability” to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or “narrowcast” and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan’s Arthur Lupia, are “not well-adapted to our information age.”

“A predictor of whether you accept the science of global warming? Whether you’re a Republican or a Democrat.”

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it’s an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you’re a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn’t budge Republican views. On the contrary: In a 2008 Pew survey, for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college educated Republicans. In other words, a higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn’t increase one’s concern about it. What’s going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. “People who have a dislike of some policy—for example, abortion—if they’re unsophisticated they can just reject it out of hand,” says Lodge. “But if they’re sophisticated, they can go one step further and start coming up with counterarguments.” These individuals are just as emotionally driven and biased as the rest of us, but they’re able to generate more and better reasons to explain why they’re right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one’s ideology.

Climategate had a substantial impact on public opinion, according to Anthony Leiserowitz, director of the Yale Project on Climate Change Communication. It contributed to an overall drop in public concern about climate change and a significant loss of trust in scientists. But—as we should expect by now—these declines were concentrated among particular groups of Americans: Republicans, conservatives, and those with “individualistic” values. Liberals and those with “egalitarian” values didn’t lose much trust in climate science or scientists at all. “In some ways, Climategate was like a Rorschach test,” Leiserowitz says, “with different groups interpreting ambiguous facts in very different ways.”

“Is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism.”

So is there a case study of science denial that largely occupies the political left? Yes: the claim that childhood vaccines are causing an epidemic of autism. Its most famous proponents are an environmentalist (Robert F. Kennedy Jr.) and numerous Hollywood celebrities (most notably Jenny McCarthy and Jim Carrey). The Huffington Post gives a very large megaphone to denialists. And Seth Mnookin, author of the new book The Panic Virus, notes that if you want to find vaccine deniers, all you need to do is go hang out at Whole Foods.

Vaccine denial has all the hallmarks of a belief system that’s not amenable to refutation. Over the past decade, the assertion that childhood vaccines are driving autism rates has been undermined by multiple epidemiological studies—as well as the simple fact that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.

Yet the true believers persist—critiquing each new study that challenges their views, and even rallying to the defense of vaccine-autism researcher Andrew Wakefield, after his 1998 Lancet paper—which originated the current vaccine scare—was retracted and he subsequently lost his license to practice medicine. But then, why should we be surprised? Vaccine deniers created their own partisan media, such as the website Age of Autism, that instantly blast out critiques and counterarguments whenever any new development casts further doubt on anti-vaccine views.

It all raises the question: Do left and right differ in any meaningful way when it comes to biases in processing information, or are we all equally susceptible?

There are some clear differences. Science denial today is considerably more prominent on the political right—once you survey climate and related environmental issues, anti-evolutionism, attacks on reproductive health science by the Christian right, and stem-cell and biomedical matters. More tellingly, anti-vaccine positions are virtually nonexistent among Democratic officeholders today—whereas anti-climate-science views are becoming monolithic among Republican elected officials.

Some researchers have suggested that there are psychological differences between the left and the right that might impact responses to new information—that conservatives are more rigid and authoritarian, and liberals more tolerant of ambiguity. Psychologist John Jost of New York University has further argued that conservatives are “system justifiers”: They engage in motivated reasoning to defend the status quo.

This is a contested area, however, because as soon as one tries to psychoanalyze inherent political differences, a battery of counterarguments emerges: What about dogmatic and militant communists? What about how the parties have differed through history? After all, the most canonical case of ideologically driven science denial is probably the rejection of genetics in the Soviet Union, where researchers disagreeing with the anti-Mendelian scientist (and Stalin stooge) Trofim Lysenko were executed, and genetics itself was denounced as a “bourgeois” science and officially banned.

The upshot: All we can currently bank on is the fact that we all have blinders in some situations. The question then becomes: What can be done to counteract human nature itself?


Given the power of our prior beliefs to skew how we respond to new information, one thing is becoming clear: If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.

This theory is gaining traction in part because of Kahan’s work at Yale. In one study, he and his colleagues packaged the basic science of climate change into fake newspaper articles bearing two very different headlines—”Scientific Panel Recommends Anti-Pollution Solution to Global Warming” and “Scientific Panel Recommends Nuclear Solution to Global Warming”—and then tested how citizens with different values responded. Sure enough, the latter framing made hierarchical individualists much more open to accepting the fact that humans are causing global warming. Kahan infers that the effect occurred because the science had been written into an alternative narrative that appealed to their pro-industry worldview.

You can follow the logic to its conclusion: Conservatives are more likely to embrace climate science if it comes to them via a business or religious leader, who can set the issue in the context of different values than those from which environmentalists or scientists often argue. Doing so is, effectively, to signal a détente in what Kahan has called a “culture war of fact.” In other words, paradoxically, you don’t lead with the facts in order to convince. You lead with the values—so as to give the facts a fighting chance.

[Original link with access to mentioned studies here.]

>Event Promotes Citizen Cyberscience in Brazil (JC/Gestão C&T)

JC e-mail 4240, April 18, 2011

The event is aimed at anyone interested in taking part in scientific projects on a wide range of topics, such as climate change and particle physics, as well as at developers, programmers and scientists.

In May, several Brazilian states will host Brasil@home, an initiative to promote Citizen Cyberscience, that is, public participation in scientific projects over the internet, in Brazil and Latin America. It is an introduction to the concepts and practice of volunteer computing, distributed intelligence and volunteer remote sensing.

At the meeting, lead scientists from the world's main Citizen Cyberscience projects will give talks and help foster new projects in Brazil. The event is aimed at anyone interested in taking part in scientific projects on topics as varied as climate change, particle physics and the digitization of historical documents, as well as at developers, programmers and scientists.

Speakers include David Anderson, of the University of California, Berkeley, creator of Seti@home, the first volunteer computing project; Philip Brohan, of the UK Meteorological Office, a scientist on the Old Weather project, in which citizens help digitize climate data recorded in old ships' logbooks; and Francois Grey, director of the Citizen Cyberscience Centre; among others.

Brasil@home will be held on May 2 in Brasília (DF); from May 3 to 5 in Rio de Janeiro (RJ); and on May 6 in São Paulo (SP). The full program and more information are at http://www.citizencyberscience.net/brasilathome.
(Gestão C&T)

>The Neural Web of Schizophrenia (FAPESP)

4/14/2011

Revista Pesquisa FAPESP – American researchers have taken an important step toward identifying the biological causes of schizophrenia, a group of severe mental disorders that affect about 60 million people worldwide (around 1.8 million in Brazil) and are characterized by emotional detachment from reality, disordered thinking, false beliefs (delusions) and visual or auditory hallucinations.

Some of these signs resemble those shown by 23-year-old Wellington Menezes de Oliveira, who in early April killed 12 children at a school in the Realengo neighborhood of Rio de Janeiro before taking his own life.

The team, led by neuroscientist Fred Gage of the Salk Institute for Biological Studies, in California, managed to turn skin cells from people with schizophrenia into more immature, versatile cells. Called induced pluripotent stem cells (iPS), these cells were then converted into neurons, one of the cell types of brain tissue. The study was published this Thursday on the website of the journal Nature.

This forced change of function generated what the researchers believe to be faithful copies, at least from a genetic standpoint, of the brain cells of people with schizophrenia, which, for obvious ethical reasons, could previously be examined only after death.

Because they are genetically identical to the brain cells of people who developed schizophrenia, these lab-made neurons are important for understanding the illness, which has a strong genetic component, because they allow researchers to set aside the influence of environmental factors, such as medication use or the social context in which people live.

"It is not known how much the environment contributes to the disease. But by growing these neurons in the laboratory, we can remove the environment from the equation and begin to focus on the biological problems," said Kristen Brennand, a researcher in Gage's group and first author of the paper.

According to Gage, it is the first time an experimental model of a complex mental illness has been created from the cells of living human beings.

"This model not only gives us the opportunity to look at living neurons from schizophrenia patients and from healthy people, it should also allow us to better understand the mechanisms of the disease and to evaluate drugs that may reverse it," said the scientist, who some years ago demonstrated that the adult brain continues to produce neurons.

After converting skin cells into neurons in the laboratory, Brennand ran tests to check whether they in fact behaved like natural neurons and could transmit information from one cell to another. The brain cells obtained from skin cells (fibroblasts) did indeed work as neurons. "In many ways, the 'schizophrenic' neurons are indistinguishable from healthy ones," she said.

But there are differences. The researcher noticed that the new neurons from people with schizophrenia had fewer branches than those from healthy people. These branches matter because they allow one brain cell to communicate with another, and they are generally found in smaller numbers in studies using animal models of the disease and in analyses of neurons extracted after the death of schizophrenia patients.

Gene activity in the neurons of people with schizophrenia also differed from that seen in people without the disease. The study's authors found that the activation level of 596 genes differed between the two groups: 271 genes were more active, and 325 less expressed, in people with schizophrenia than in people without the condition.

In a subsequent stage, Brennand placed the fibroblast-derived neurons in five different solutions, each containing one of the five drugs most commonly used to treat schizophrenia: the antipsychotics clozapine, loxapine, olanzapine, risperidone and thioridazine.

Of the five, only loxapine reversed the effect of the abnormal gene activation and allowed more branches to grow on the neurons. These results, however, do not indicate that the other four compounds are ineffective. "Optimizing concentration and administration time may increase the effects of the other antipsychotic medications," the researchers wrote.

"These drugs are doing more than we thought they were capable of doing. For the first time we have a model that lets us study how antipsychotics act on living neurons that are genetically identical to a patient's," said the researcher. This matters because it makes it possible to compare signs of the disease's clinical course with the drugs' pharmacological effects.

"For a long time mental illnesses were seen as a social or environmental problem, and people thought patients could overcome them if they tried hard enough. We are showing that some real biological dysfunctions in neurons are independent of the environment," Gage said.

The paper "Modelling schizophrenia using human induced pluripotent stem cells" (doi:10.1038/nature09915), by Fred Gage and others, can be read by Nature subscribers at http://www.nature.com.

>The Pain of Rejection (Fapesp)

Science Communication
3/29/2011

The study indicates that the feeling of rejection after the end of a romantic relationship and the physical pain of an injury activate the same brain regions (image: reproduction)

Agência FAPESP – The pain of rejection is not just a figure of speech but something as real as physical pain. According to a new study, intense experiences of social rejection activate the same brain areas that respond to painful sensory experiences.

"The results give new meaning to the idea that social rejection 'hurts'," said Ethan Kross, of the University of Michigan, who led the research.

The study's results will be published this week on the website, and soon in the print edition, of the journal Proceedings of the National Academy of Sciences.

"At first glance, spilling a cup of hot coffee on yourself and thinking about a person with whom you recently experienced an unexpected breakup seem to cause different kinds of pain, but our study shows they are more similar than previously thought," Kross said.

Earlier studies had indicated that the same brain regions support the emotionally distressing feelings that accompany the experience of both physical pain and social rejection.

The new research shows a neural overlap between these two types of experience: a shared set of brain areas that becomes active when a person has painful sensations, whether physical or not. Kross and colleagues identified these regions as the somatosensory cortex and the dorsal posterior insula.

The study involved 40 volunteers who had gone through the unexpected end of a romantic relationship within the previous six months and who said they felt rejected because of it.

Each participant completed two tasks, one related to the feeling of rejection and the other to responses to physical pain, while their brains were scanned by functional magnetic resonance imaging.

"We found that strongly induced feelings of social rejection activate the same brain regions involved in the sensation of physical pain, areas that are rarely activated in neuroimaging studies of emotion," Kross said.

The paper "Social rejection shares somatosensory representations with physical pain" (doi:10.1073/pnas.1102693108), by Ethan Kross and others, will soon be available to PNAS subscribers at http://www.pnas.org/cgi/doi/10.1073/pnas.1102693108.

>A New Hurricane in Brazil (JC, O Globo)

Inmet, the Navy and Inpe disagree over storm Arani, which is hitting the coast.

JC e-mail 4218, March 16, 2011.

A weather phenomenon that has been bringing intense rain from northern Rio de Janeiro state to southern Bahia has divided the country's main meteorological agencies. The National Institute of Meteorology (Inmet) calls Arani, as it has been named, a hurricane. In a special alert, it highlighted winds of up to 120 km/h over the Atlantic Ocean.

That diagnosis, however, is shared neither by the Brazilian Navy, which defines the same phenomenon as a subtropical storm (one severity level below), nor by the National Institute for Space Research (Inpe), which says it is a tropical depression (another step down on the danger scale).

The phenomenon moves away from the Brazilian coast

Arani ("furious weather" in Tupi) formed from the combination of warm water and warm air in an area of strong instability near the coast of Espírito Santo. This system produced a cyclonic circulation of winds, as well as large volumes of rain in that state. The danger has been limited because the formation is over the open sea and, over the next two days, should head southeast, moving even farther from the Brazilian coast.

According to Inmet, Arani gained strength as it moved away from the coast, taking on the characteristics of a hybrid hurricane. This is a different kind of formation from those that typically devastate the Caribbean and the North Atlantic: rather than an independent system feeding on warm ocean waters, it is associated with a cyclone that originated from a cold front.

The hurricane is 110 kilometers off the Brazilian coast and poses a threat only to vessels and to aircraft flying over the region of Cabo de São Tomé, on the Rio coast, which lies on its path to the open ocean. In the coming days Arani should reach international waters, and monitoring will fall to South Africa.

Inmet classified the phenomenon with the help of American hurricane-monitoring agencies. According to meteorologist Morgana Almeida, of the institute's team, there is no risk that the phenomenon's current course will reverse and bring damage to the mainland. The institute alerted Navy authorities, who took steps to keep traffic out of the area hit by the strong winds.

But the Navy's own Meteorological Service classifies Arani differently. It recorded gusts of at most 80 km/h. There is heavy precipitation over the open sea, but the waves it produces, 3 to 4 meters high, are the same size as those generated by a cold front.

"Formations like this are not common, but they can occur in summer," notes meteorologist Caroline Vidal Ferreira da Guia, of Inpe. "Arani is strong enough to cause disruption for the population but, according to our measurements, it does not amount to a hurricane."
(O Globo)

>Never Say ‘Diagonal of the Covariance Matrix’: 6 Things Scientists Can Learn from Science Journalists

Talk: Never Say ‘Diagonal of the Covariance Matrix’: 6 Things Scientists Can Learn from Science Journalists
By Maggie Koerth-Baker, science editor, BoingBoing.net

Can Scientists Learn from Science Journalists?
By Andrew C. Revkin
The New York Times, February 26, 2011, 2:22 PM

Maggie Koerth-Baker, science editor of BoingBoing.net, gave a really good talk at the University of Wisconsin aiming to encourage scientists to communicate effectively with other human beings. A starting point: listening. Another: Start a blog.

Here’s a summary of the main points that I got from David Isenberg, who alerted me to the lecture:

  • Show, don’t tell.
  • Don’t just talk, ask.
  • Lay people know more (and less) than you think.
  • Not everything is news.
  • Be critical of your own work.
  • Mistakes last, but pedantry kills.
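
The jargon in the talk’s title is itself a good example of the first point. “Diagonal of the covariance matrix” is statistician-speak for something an audience would grasp instantly as “how much each variable varies on its own.” A minimal NumPy illustration, with synthetic data invented purely for this aside, makes the translation concrete:

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: 3 variables (rows), 500 observations (columns),
    # with standard deviations of roughly 1.0, 2.0 and 0.5
    data = rng.normal(size=(3, 500)) * np.array([[1.0], [2.0], [0.5]])

    cov = np.cov(data)                    # 3x3 covariance matrix
    print(np.diag(cov))                   # the "diagonal": each variable's variance
    print(np.var(data, axis=1, ddof=1))   # the same numbers, said plainly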

There are deep divisions between the cultures and norms of science and journalism.

One example: For scientists, peer review occurs before publication; for journalists, afterward.

Another: All lines in a newspaper story or broadcast, in theory at least, have to stand on their own as accurate; in a research paper, the inaccuracies produced by the compression in an abstract are seen as normal and acceptable by many scientists, with the nuance conveyed in the body of a paper.

In a recent conversation I had with Gavin Schmidt, a climate scientist and communicator, it was clear we had utterly different norms for interpreting summaries of a research paper.

Some of the differences were touched on in my recent coverage of new analysis attributing some changes in extreme precipitation in the Northern Hemisphere to human-driven global warming.

I would add that scientists (and science journalists) would do well to review the talk given by Thomas Lessl of the University of Georgia at the annual conference of the American Association for the Advancement of Science, on the limited role of science, even if communicated clearly, in shaping policy and human choices.

There’s a link and excerpt in my recent post “Do Fights Over Climate Communication Reflect the End of ‘Scientism’?”

The take-home thought:

As scientists and science journalists spar over who’s failing in climate communication, an outsider says they’re missing the point

>Ancient Catastrophic Drought Leads to Question: How Severe Can Climate Change Become? (NSF)

Press Release 11-039

Extreme megadrought in Afro-Asian region likely had consequences for Paleolithic cultures

A boat on Lake Tanganyika today; the lake’s ancient surface water level fell dramatically.
Credit: Curt Stager.

February 24, 2011
How severe can climate change become in a warming world?

Worse than anything we’ve seen in written history, according to results of a study appearing this week in the journal Science.

An international team of scientists led by Curt Stager of Paul Smith’s College, New York, has compiled four dozen paleoclimate records from sediment cores in Lake Tanganyika and other locations in Africa.

The records show that one of the most widespread and intense droughts of the last 50,000 years or more struck Africa and Southern Asia 17,000 to 16,000 years ago.

Between 18,000 and 15,000 years ago, large amounts of ice and meltwater entered the North Atlantic Ocean, causing regional cooling but also major drought in the tropics, says Paul Filmer, program director in the National Science Foundation’s (NSF) Division of Earth Sciences, which funded the research along with NSF’s Division of Atmospheric and Geospace Sciences and its Division of Ocean Sciences.

“The height of this time period coincided with one of the most extreme megadroughts of the last 50,000 years in the Afro-Asian monsoon region with potentially serious consequences for the Paleolithic humans that lived there at the time,” says Filmer.

The “H1 megadrought,” as it’s known, was one of the most severe climate trials ever faced by anatomically modern humans.

Africa’s Lake Victoria, now the world’s largest tropical lake, dried out, as did Lake Tana in Ethiopia, and Lake Van in Turkey.

The Nile, Congo and other major rivers shriveled, and Asian summer monsoons weakened or failed from China to the Mediterranean, meaning the monsoon season carried little or no rainwater.

What caused the megadrought remains a mystery, but its timing suggests a link to Heinrich Event 1 (or “H1”), a massive surge of icebergs and meltwater into the North Atlantic at the close of the last ice age.

Previous studies had implicated southward drift of the tropical rain belt as a localized cause, but the broad geographic coverage in this study paints a more nuanced picture.

“If southward drift were the only cause,” says Stager, lead author of the Science paper, “we’d have found evidence of wetting farther south. But the megadrought hit equatorial and southeastern Africa as well, so the rain belt didn’t just move–it also weakened.”

Climate models have yet to simulate the full scope of the event.

The lack of a complete explanation opens the question of whether an extreme megadrought could strike again as the world warms and de-ices further.

“There’s much less ice left to collapse into the North Atlantic now,” Stager says, “so I’d be surprised if it could all happen again–at least on such a huge scale.”

Given what such a catastrophic megadrought could do to today’s most densely populated regions of the globe, Stager hopes he’s right.

Stager also holds an adjunct position at the Climate Change Institute, University of Maine, Orono.

Co-authors of the paper are David Ryves of Loughborough University in the United Kingdom; Brian Chase of the Institut des Sciences de l’Evolution de Montpellier in France and the Department of Archaeology, University of Bergen, Norway; and Francesco Pausata of the Geophysical Institute, University of Bergen, Norway.

-NSF-

>Robots Doing Science (FAPESP)

Special reports

2/25/2011
By Mônica Pileggi

Automated systems developed by researchers in the United Kingdom may be key to creating more effective drugs at lower cost (handout image).

Agência FAPESP – Creating machines capable of making new discoveries is moving out of the realm of science fiction. One of today's leading examples is in the United Kingdom, where the team of professor Ross King, of the Department of Computer Science at the University of Wales, has been working for more than a decade on the development of Adam and Eve.

The automated duo's goal is to shorten laboratory assays in the development of new drugs. In addition, Eve, the second-generation model, makes it possible to find drugs whose chemical compounds are more effective in treating a given disease, and to do so faster and more cheaply.

This feat is possible thanks to the robot's ability to select, from among the thousands stored in its library, the compounds that will have the greatest effect in assays against a particular disease. And Eve can test more than one at a time. "Afterward, the human researcher analyzes the results," King told Agência FAPESP.

"But even with all these capabilities, it is important to stress that Eve does not yet have artificial intelligence," added the professor, who took part this Thursday (2/24) in the Workshop on Synthetic Biology and Robotics, in São Paulo. The event, organized by FAPESP and the British Consulate in São Paulo, is part of the Brazil–UK Partnership in Science and Innovation.

"Today, the robot tests the chemical compounds available in its library, but it does not identify patterns. Starting next week we will work on making it understand the work it performs," he revealed.

In this final development phase, the goal is to make Eve capable of identifying new patterns (combinations of molecules) that could help in the development of more effective drugs, and then of testing them.

Large-scale testing

Though unfinished, the robot scientist has already shown what it can do. By running large-scale experiments, Eve dramatically narrowed the range of drug candidates that agronomist Elizabeth Bilsland, of the University of Cambridge, would need to test in her research on the parasites Schistosoma, Plasmodium vivax and P. falciparum, and Trypanosoma cruzi and T. brucei, as well as Leishmania.

"Each parasite develops under different conditions. And to create new drugs, new methods must be tested. Eve tested more than 15,000 chemical compounds from its library to find those capable of inhibiting the parasites' enzymes without damaging human genes," Bilsland said.

According to the researcher, based on the assays for the diseases caused by the parasites listed, the robot wove a network of hypotheses until it arrived at a drug with the potential to treat all of them at once, except leishmaniasis. "It is what we might call a miracle drug," she noted.

But the drug is still far from reaching the market, since the hypothesis generated by the robot needs to be validated. That phase of the work will involve collaboration with scientists at Unicamp and Unesp.

Given how long a new drug takes to reach the market, Bilsland highlighted her ongoing research with medicines already available and approved by the United States Food and Drug Administration.

"Some of them are approved and indicated for certain diseases but also have potential for treating others. We tested these drugs in the system we created and found about five that also attack Trypanosoma enzymes, and others that hit the enzymes of Plasmodium vivax," she explained.

The aim of this work is to repurpose existing drugs, already approved for human use, that are also effective against other diseases.

"During a visit to a hospital in Campinas, I saw a case in which a medicine prescribed for heart problems was used to treat Chagas disease, with good results," Bilsland said.

>Yale Project on Knowledge of Climate Change Across Global Warming’s Six Americas

From Anthony Leiserowitz, Yale Project on Climate Change Communication

“Today we are pleased to announce the release of a new report entitled “Knowledge of Climate Change Across Global Warming’s Six Americas.” This report draws from a national study we conducted last year on what Americans understand about how the climate system works, and about the causes, impacts, and potential solutions to global warming; the report is available here.

Overall, we found that knowledge about climate change varies widely across the Six Americas – 49 percent of the Alarmed received a passing grade (A, B, or C), compared to 33 percent of the Concerned, 16 percent of the Cautious, 17 percent of the Doubtful, 4 percent of the Dismissive, and 5 percent of the Disengaged. In general, the Alarmed and the Concerned better understand how the climate system works and the causes, consequences, and solutions to climate change than the Disengaged, the Doubtful and the Dismissive. For example:

· 87% of the Alarmed and 76% of the Concerned understand that global warming is caused mostly by human activities compared to 37% of the Disengaged, 6% of the Doubtful and 3% of the Dismissive;
· 86% of the Alarmed and 71% of the Concerned understand that emissions from cars and trucks contribute substantially to global warming compared to 18% of the Disengaged, 16% of the Doubtful and 10% of the Dismissive;
· 89% of the Alarmed and 64% of the Concerned understand that a transition to renewable energy sources is an important solution compared to 12% of the Disengaged, 13% of the Doubtful and 7% of the Dismissive.

However, this study also found that occasionally the Doubtful and Dismissive have as good or a better understanding than the Alarmed or Concerned. For example:

· 79% of the Dismissive and 74% of the Doubtful correctly understand that the greenhouse effect refers to gases in the atmosphere that trap heat, compared to 66% of the Alarmed and 64% of the Concerned;
· The Dismissive are less likely to incorrectly say that “the greenhouse effect” refers to the Earth’s protective ozone layer than all other groups, including the Alarmed (13% vs. 24% respectively);
· 50% of the Dismissive and 57% of the Doubtful understand that carbon dioxide traps heat from the Earth’s surface, compared to 59% of the Alarmed, and 45% of the Concerned.

This study also identified numerous gaps between expert and public knowledge about climate change. For example, only:

· 13% of the Alarmed know how much carbon dioxide there is in the atmosphere today (approximately 390 parts per million) compared to 5% of the Concerned, 9% of the Cautious, 4% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 52% of the Alarmed have heard of coral bleaching, vs. 24% of the Concerned, 23% of the Cautious, 5% of the Disengaged, 21% of the Doubtful and 24% of the Dismissive;
· 46% of the Alarmed have heard of ocean acidification, vs. 22% of the Concerned, 25% of the Cautious, 6% of the Disengaged, 23% of the Doubtful and 16% of the Dismissive.

This study also found important misconceptions leading many to misunderstand the causes and therefore the solutions to climate change. For example, many Americans confuse climate change and the hole in the ozone layer. Such misconceptions were particularly apparent for the Alarmed and Concerned segments:

· 63% of the Alarmed and 49% of the Concerned believe that the hole in the ozone layer is a significant contributor to global warming compared to 32% of the Cautious, 12% of the Disengaged, 6% of the Doubtful and 7% of the Dismissive;
· 49% of the Alarmed and 36% of the Concerned believe that aerosol spray cans are a significant contributor to global warming compared to 20% of the Cautious, 9% of the Disengaged, 7% of the Doubtful and 5% of the Dismissive;
· 39% of the Alarmed and 23% of the Concerned believe that banning aerosol spray cans would reduce global warming compared to 13% of the Cautious, 3% of the Disengaged, 4% of the Doubtful and 1% of the Dismissive.

Concerned, Cautious and Disengaged Americans also recognize their own limited understanding of the issue. Fewer than 1 in 10 say they are “very well informed” about climate change, and 75 percent or more say they would like to know more. The Alarmed also say they need more information (76%), while the Dismissive say they do not need any more information about global warming (73%).

Overall, these and other results within this report demonstrate that most Americans both need and desire more information about climate change. While information alone is not sufficient to engage the public in the issue, it is often a necessary precursor of effective action.”

>Brazilian Climate Model Will Show the Climate Through Brazil's Eyes (Fapesp, JC)

JC e-mail 4200, February 15, 2011

In the global climate models featured in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Meanwhile, phenomena such as biomass burning, which can intensify the greenhouse effect and change the characteristics of rainfall and clouds in a given region, are not characterized, because they are not considered relevant by the countries that built the numerical models in use.

That is why, and to support global research on climate change and assess the impact of human activities on it, Brazilian scientists are developing the Brazilian Global Climate System Model (MBSCG).

The effort brings together scientists from the National Institute of Science and Technology for Climate Change (INCT-MC), the FAPESP Research Program on Global Climate Change and the Brazilian Network for Research on Global Climate Change (Rede Clima).

Brazilian model

Expected to be completed in 2013, the Brazilian climate model should allow climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

"Many of these international models do not meet our needs. We have many climate problems associated with anthropogenic actions, such as biomass burning and deforestation, which are not represented and which will now be included in the model we are developing in Brazil," explains Gilvan Sampaio de Oliveira, a researcher at Inpe and one of the MBSCG's coordinators.

According to him, the Brazilian model will incorporate hydrological, biological and physical-chemical processes and interactions relevant to the regional and global climate system.

It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

"With this model, we will have the capacity and autonomy to generate reliable future scenarios, so that the country can prepare to face extreme climate events," Sampaio said.

Climate impacts on agriculture

The first version of the Brazilian model, with indications of what may happen to Brazil's climate over the next 50 years, should be ready by the end of 2011. To that end, the researchers are installing, and in February will begin to run, a preliminary version of the model on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (Cptec) in Cachoeira Paulista (SP), with computational modules that analyze the climate phenomena occurring in the atmosphere, the ocean and the land surface.

The computational modules will gradually be integrated with other components of the model, which will assess the climate impacts of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry.

Conversely, another component will show the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.

"In the future, we may be able to estimate the productivity of sugarcane and soybeans, for example, in the face of rising greenhouse gas concentrations in the atmosphere," Sampaio said.

Contribution to the IPCC

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the IPCC's next report, AR5, due out in 2014. But the model the Intergovernmental Panel will use to run the AR5 simulations, HadGEM2, will have Brazilian input.

Through a cooperation between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers added to the international model computational modules that assess the impact on the global climate of smoke plumes from biomass burning and of forest fire, which until then had not been taken into account in climate projections.

The model was accordingly renamed HadGEM2-ES/Inpe. "We will run simulations that include these components we introduced into the model," Sampaio said.

Land use and meteorology

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a high-spatial-resolution land-use module and a meteorological module. In the same year, the first simulations of high-resolution regional models will also be run, toward a climate model for South America with a resolution of 1 to 10 km. "Until now, it took us months or even years to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale," Sampaio said.

(Site da Inovação Tecnológica, with information from Agência Fapesp)

>The Anti-Schizophrenia Compound (O Globo, JC)

JC e-mail 4196, February 9, 2011.

A substance from plants and fruits joins stem cells to treat mental illness

A substance found in plants and in fruits such as passion fruit, orange and lemon may hold the key to treating mental disorders such as schizophrenia, which affects 1% of the world's population.

The first step toward this therapy is a study published by eight Brazilian researchers in the upcoming issue of the journal Stem Cells and Development, one of the world's leading publications in the stem cell field.

The team, coordinated by the Institute of Biomedical Sciences at UFRJ, is the first to study the effect of flavonoids, a compound abundantly found in nature, on embryonic or reprogrammed stem cells. The group was not aiming to fight psychiatric diseases, but already sees this as a possible application of its work.

A series of studies had already identified the antioxidant effects of flavonoids. This is a beneficial property, which reduces the risk of various diseases and even slows aging.

"It is a compound that has been studied for a long time. Its hormonal, anti-hemorrhagic and anticancer actions have already been described," notes Stevens Rehen, director of the National Laboratory of Embryonic Stem Cells. "Flavonoids are also present in processed foods, such as tea and wine. Even so, no one had tested their effect on stem cell metabolism."

Rehen therefore decided to include the substance in his research, and he did so in two ways. The first was with embryonic stem cells, which have the potential to turn into many cell types. To that end, they are differentiated in the laboratory so that they properly populate the tissue where they are needed.

The second way, and the most important one for him, is with reprogrammed cells.

"We take skin cells from adult individuals, reprogram this material and turn it into neurons," Rehen explains.

This is where the flavonoid comes in. The compound, extracted from the catingueira, a plant typical of the semi-arid Northeast, prevented cell death and nearly tripled the number of neurons generated by the cells.

"We can say that the flavonoid makes the stem cell more likely to turn into a neuron," Rehen says. "For that conversion to happen, the cell needs access to a substance called retinoic acid. Reaching it requires a receptor. And the flavonoid increases precisely the number of those receptors. Since the reprogrammed cell comes from the patient's own skin, we will be able to create individualized medicine."

A substance that, like the flavonoid, is an antioxidant and favors the formation of neurons could be used to boost memory in an already formed brain. It would also reduce the likelihood of any disorder that impairs the brain's development. These uses, however, still depend on further studies. But the team's priority lies elsewhere: testing the natural compound against mental illness. Some conditions, such as schizophrenia, destroy the pathway that links retinoic acid to cells. The flavonoid could repair that link, facilitating the production of neurons and thereby fighting the disorder's characteristic symptoms.

To do so, however, the material at the heart of this treatment must be better understood. Rehen's team is preparing to study the antioxidant effects of the flavonoid on reprogrammed neurons from patients with mental disorders.

"The development of schizophrenia is characterized by dysfunctions in the antioxidant system," the researcher notes. "So by studying the properties of this compound found in fruits and plants, we are learning how we might use it to fight mental disorders."

The study was conducted in mice, using embryonic and reprogrammed cells from that animal. Over the past six months, however, the researchers have begun preparing a similar investigation in humans.
(Renato Grandelle)
(O Globo, 2/9)

>Can We Trust Climate Models? Increasingly, the Answer is ‘Yes’


18 JAN 2011: ANALYSIS

Yale Environment 360

Forecasting what the Earth’s climate might look like a century from now has long presented a huge challenge to climate scientists. But better understanding of the climate system, improved observations of the current climate, and rapidly improving computing power are slowly leading to more reliable methods.

By Michael D. Lemonick

A chart appears on page 45 of the 2007 Synthesis Report of the Intergovernmental Panel on Climate Change (IPCC), laying out projections for what global temperature and sea level should look like by the end of this century. Both are projected to rise, which will come as no surprise to anyone who's been paying even the slightest attention to the headlines over the past decade or so. In both cases, however, the projections span a wide range of possibilities. The temperature, for example, is likely to rise anywhere from 1.8°C to 6.4°C (3.2°F to 11.5°F), while sea level could increase by as little as 7 inches or by as much as 23, or anywhere in between.

It all sounds appallingly vague, and the fact that it’s all based on computer models probably doesn’t reassure the general public all that much. For many people, “model” is just another way of saying “not the real world.” In fairness, the wide range of possibilities in part reflects uncertainty about human behavior: The chart lays out different possible scenarios based on how much CO2 and other greenhouse gases humans might emit over the coming century. Whether the world adopts strict emissions controls or decides to ignore the climate problem entirely will make a huge difference to how much warming is likely to happen.

But even when you factor out the vagaries of politics and economics, and assume future emissions are known perfectly, the projections from climate models still cover a range of temperatures, sea levels, and other manifestations of climate change. And while there’s just one climate, there’s more than one way to simulate it. The IPCC’s numbers come from averaging nearly two dozen individual models produced by institutions including the National Center for Atmospheric Research (NCAR), the Geophysical Fluid Dynamics Laboratory (GFDL), the U.K.’s Met Office, and more. All of these models have features in common, but they’re constructed differently — and all of them leave some potentially important climate processes out entirely. So the question remains: How much can we really trust climate models to tell us about the future?

The answer, says Keith Dixon, a modeler at GFDL, is that it all depends on the questions you're asking. "If you want to know 'is climate change something that should be on my radar screen?'" he says, "then you end up with some very solid results. The climate is warming, and we can say why. Looking to the 21st century, all reasonable projections of what humans will be doing suggest that not only will the climate continue to warm, you have a good chance of it accelerating. Those are global-scale issues, and they're very solid."

The reason they're solid is that, right from the emergence of the first crude versions back in the 1960s, models have been at their heart a series of equations that describe airflow, radiation and energy balance as the Sun warms the Earth and the Earth sends some of that warmth back out into space. "It literally comes down to mathematics," says Peter Gleckler, a research scientist with the Program for Climate Model Diagnosis and Intercomparison at Livermore National Laboratory, and the basic equations are identical from one model to another. "Global climate models," he says, echoing Dixon, "are designed to deal with large-scale flow of the atmosphere, and they do very well with that."

The problem is that warming causes all sorts of changes — in the amount of ice in the Arctic, in the kind of vegetation on land, in ocean currents, in permafrost and cloud cover and more — that in turn can either cause more warming, or cool things off. To model the climate accurately, you have to account for all of these factors. Unfortunately, says James Hurrell, who led the NCAR’s most recent effort to upgrade its own climate model, you can’t. “Sometimes you don’t include processes simply because you don’t understand them well enough,” he says. “Sometimes it’s because they haven’t even been discovered yet.”

A good example of the former, says Dixon, is the global carbon cycle — the complex interchange of carbon between oceans, atmosphere, and biosphere. Since atmospheric carbon dioxide is driving climate change, it’s obviously important, but until about 15 years ago, it was too poorly understood to be included in the models. “Now,” says Dixon, “we’re including it — we’re simulating life, not just physics.” Equations representing ocean dynamics and sea ice also have been added to climate models as scientists have understood these crucial processes better.

Other important phenomena, such as changes in clouds, are still too complex to model accurately. “We can’t simulate individual cumulus clouds,” says Dixon, because they’re much smaller than the 200-kilometer grid boxes that make up climate models’ representation of the world. The same applies to aerosols — tiny particles, including natural dust and manmade soot — that float around in the atmosphere and can cool or warm the planet, depending on their size and composition.

But there's no one right way to model these small-scale phenomena. "We don't have the observations and don't have the theory," says Gleckler. The best they can do on this point is to simulate the net effect of all the clouds or aerosols in a grid box, a process known as "parameterization." Different modeling centers go about it in different ways, which, unsurprisingly, leads to varying results. "It's not a science for which everything is known, by definition," says Gleckler. "Many groups around the world are pursuing their own research pathways to develop improved models." If the past is any guide, modelers will be able to abandon parameterizations one by one, replacing them with mathematical representations of real physical processes.
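
To make "parameterization" concrete, here is a toy sketch in the spirit of one classic scheme, a Sundqvist-type relative-humidity cloud-fraction formula. The shape of the calculation is real, but the threshold value and the numbers below are illustrative, not any particular model's:

    import numpy as np

    def cloud_fraction(rh, rh_crit=0.75):
        """Toy Sundqvist-style cloud-cover parameterization.

        A ~200 km grid box cannot resolve individual cumulus clouds,
        so schemes like this estimate the fraction of the box covered
        by cloud from the box-mean relative humidity: zero below a
        critical humidity, rising toward 1 as the box nears saturation.
        """
        rh = np.clip(rh, 0.0, 1.0)
        frac = 1.0 - np.sqrt((1.0 - rh) / (1.0 - rh_crit))
        return np.where(rh > rh_crit, np.clip(frac, 0.0, 1.0), 0.0)

    # Box-mean relative humidity for four hypothetical grid boxes
    print(cloud_fraction(np.array([0.50, 0.80, 0.95, 1.00])))
    # -> approximately [0.  0.106  0.553  1.]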

Sometimes, modelers don’t understand a process well enough to include it at all, even if they know it could be important. One example is a caveat that appears on that 2007 IPCC chart. The projected range of sea-level rise, it warns, explicitly excludes “future rapid dynamical changes in ice flow.” In other words, if land-based ice in Greenland and Antarctica starts moving more quickly toward the sea than it has in the past — something glaciologists knew was possible, but hadn’t yet been documented — these estimates would be incorrect. And sure enough, satellites have now detected such movements. “The last generation of NCAR models,” says Hurrell, “had no ice sheet dynamics at all. The model we just released last summer does, but the representation is relatively crude. In a year or two, we’ll have a more sophisticated update.”

Sophistication only counts, however, if the models end up doing a reasonable job of representing the real world. It’s not especially useful to wait until 2100 to find out, so modelers do the next best thing: They perform “hindcasts,” which are the inverse of forecasts. “We start the models from the middle of the 1800s,” says Dixon, “and let them run through the present.” If a model reproduces the overall characteristics of the real-world climate record reasonably well, that’s a good sign.

What the models don’t try to do is to match the timing of short-term climate variations we’ve experienced. A model might produce a Dust Bowl like that of the 1930s, but in the model it might happen in the 1950s. It should produce the ups and downs of El Niño and La Niña currents in the Pacific with about the right frequency and intensity, but not necessarily at the same times as they happen in the real Pacific. Models should show slowdowns and accelerations in the overall warming trend, the result of natural fluctuations, at about the rate they happen in the real climate. But they won’t necessarily show the specific flattening of global warming we’ve observed during the past decade — a temporary slowdown that had skeptics declaring the end of climate change.
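
A small synthetic sketch of that evaluation logic: the two series below share the same trend and the same amplitude of internal variability, so their climate statistics agree even though their year-to-year wiggles are unsynchronized and essentially uncorrelated. The numbers are invented for illustration, not drawn from any model or observational record.

    import numpy as np

    rng = np.random.default_rng(7)

    years = np.arange(150)
    trend = 0.008 * years  # shared long-term warming, degrees per year

    # "Observed" and "modeled" climates: same trend, same variability
    # amplitude, but independent (unsynchronized) fluctuations.
    observed = trend + 0.15 * rng.standard_normal(years.size)
    modeled = trend + 0.15 * rng.standard_normal(years.size)

    # Year-by-year timing agreement is poor...
    r = np.corrcoef(observed - trend, modeled - trend)[0, 1]
    print(f"correlation of year-to-year anomalies: {r:+.2f}")

    # ...but the climate statistics match.
    print("trends:", np.polyfit(years, observed, 1)[0], np.polyfit(years, modeled, 1)[0])
    print("variability:", (observed - trend).std(), (modeled - trend).std())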

It’s also important to realize that climate represents what modelers call a boundary condition. Blizzards in the Sahara are outside the boundaries of our current climate, and so are stands of palm trees in Greenland next year. But within those boundaries, things can bounce around a great deal from year to year or decade to decade. What modelers aim to produce is a virtual climate that resembles the real one in a statistical sense, with El Niños, say, appearing about as often as they do in reality, or hundred-year storms coming once every hundred years or so.

This is one essential difference between weather forecasting and climate projection. Both use computer models, and in some cases, even the very same models. But weather forecasts start out with the observed state of the atmosphere and oceans at this very moment, then project it forward. It's not useful for our day-to-day lives to know that September has this average high or that average low; we want to know what the actual temperature will be tomorrow, and the day after, and next week. Because the atmosphere is chaotic, anything less than perfect knowledge of today's conditions (which is impossible, given that observations are always imperfect) will make the forecast useless after about two weeks.

Since climate projections go out not days or weeks, but decades, modelers don’t even try to make specific forecasts. Instead, they look for changes in averages — in boundary conditions. They want to know if Septembers in 2050 will be generally warmer than Septembers in 2010, or whether extreme weather events — droughts, torrential rains, floods — will become more or less frequent. Indeed, that’s the definition of climate: the average conditions in a particular place.

“Because models are put together by different scientists using different codes, each one has its strengths and weaknesses,” says Dixon. “Sometimes one [modeling] group ends up with too much or too little sea ice but does very well with El Niño and precipitation in the continental U.S., for example,” while another nails the ice but falls down on sea-level rise. When you average many models together, however, the errors tend to cancel.
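
A minimal, self-contained sketch shows why averaging helps; the numbers are synthetic, not real model output. Each fake "model" sees the same underlying warming trend but adds its own systematic bias and year-to-year noise, and the ensemble mean lands closer to the truth than a typical single model does. (In practice the cancellation is less complete, because real models share some errors.)

    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic "truth": 1 degree of warming over 100 years
    years = np.arange(100)
    truth = 0.01 * years

    # Two dozen fake models, each with its own systematic bias and noise
    n_models = 24
    bias = rng.normal(0.0, 0.3, size=(n_models, 1))            # per-model offset
    noise = rng.normal(0.0, 0.2, size=(n_models, years.size))  # yearly scatter
    models = truth + bias + noise

    ensemble_mean = models.mean(axis=0)

    def rmse(series):
        return np.sqrt(np.mean((series - truth) ** 2))

    print(f"typical single-model RMSE: {np.mean([rmse(m) for m in models]):.3f}")
    print(f"ensemble-mean RMSE:        {rmse(ensemble_mean):.3f}")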

Even when models reproduce the past reasonably well, however, it doesn’t guarantee that they’re equally reliable at projecting the future. That’s in part because some changes in climate are non-linear, which is to say that a small nudge can produce an unexpectedly large result. Again, ice sheets are a good example: If you look at melting alone, it’s pretty straightforward to calculate how much extra water will enter the sea for every degree of temperature rise. But because meltwater can percolate down to lubricate the undersides of glaciers, and because warmer oceans can lift the ends of glaciers up off the sea floor and remove a natural brake, the ice itself can end up getting dumped into the sea, unmelted. A relatively small temperature rise can thus lead to an unexpectedly large increase in sea level. That particular non-linearity was already suspected, if not fully understood, but there could be others lurking in the climate system.

Beyond that, says Dixon, if three-fourths of the models project that the Sahel (the area just south of the Sahara) will get wetter, for example, and a fourth says it will dry out, “there’s a tendency to go with the majority. But we can’t rule out without a whole lot of investigation whether the minority is doing something right. Maybe they have a better representation of rainfall patterns.” Even so, he says, if you have the vast majority coming up with similar results, and you go back to the underlying theory, and it makes physical sense, that tends to give you more confidence they’re right. The best confidence-builder of all, of course, is when a trend projected by models shows up in observations — warmer springs and earlier snowmelt in the Western U.S., for example, which not only makes physical sense in a warming world, but which is clearly happening.


And the models are constantly being improved. Climate scientists are already using modified versions to try to predict the actual timing of El Niños and La Niñas over the next few years. They're just beginning to wrestle with periods of 10, 20 and even 30 years in the future, the so-called decadal time span where both changing boundary conditions and natural variations within the boundaries have an influence on climate. "We've had a modest amount of skill with El Niños," says Hurrell, "where 15-20 years ago we weren't so skillful. That's where we are with decadal predictions right now. It's going to improve significantly."

After two decades of evaluating climate models, Gleckler doesn’t want to downplay the shortcomings that remain in existing models. “But we have better observations as of late,” he says, “more people starting to focus on these things, and better funding. I think we have better prospects for making some real progress from now on.” 


>The Climate Through Brazil's Eyes (Fapesp)

Special reports

2/1/2011

By Elton Alisson

Agência FAPESP – In the global climate models featured in the most recent report of the Intergovernmental Panel on Climate Change (IPCC), released in 2007, the Pantanal and the Cerrado are portrayed as if they were African savannas.

Meanwhile, phenomena such as biomass burning, which can intensify the greenhouse effect and change the characteristics of rainfall and clouds in a given region, are not characterized, because they are not considered relevant by the countries that built the numerical models in use.

To have a model capable of generating climate change scenarios from a Brazilian perspective, researchers from several institutions, members of the FAPESP Research Program on Global Climate Change, the Brazilian Network for Research on Global Climate Change (Rede Clima) and the National Institute of Science and Technology for Climate Change (INCT-MC), are developing the Brazilian Global Climate System Model (MBSCG).

Expected to be completed in 2013, the MBSCG should allow Brazilian climatologists to study climate change with a model that represents processes important to Brazil but treated as secondary in foreign climate models.

"Many of these international models do not meet our needs. We have many climate problems associated with anthropogenic actions, such as biomass burning and deforestation, which are not represented and which will now be included in the model we are developing in Brazil," said Gilvan Sampaio de Oliveira, a researcher at the Earth System Science Center (CCST) of the National Institute for Space Research (Inpe) and one of the researchers coordinating the construction of the MBSCG.

According to him, the Brazilian model will incorporate hydrological, biological and physical-chemical processes and interactions relevant to the regional and global climate system. It will thus be able to generate scenarios, at resolutions of 10 to 50 kilometers, of regional and global environmental changes that may occur in the coming decades, in order to anticipate their possible impacts on sectors such as agriculture and energy.

"With this model, we will have the capacity and autonomy to generate reliable future scenarios, so that the country can prepare to face extreme climate events," Sampaio told Agência FAPESP.

The first version of the Brazilian model, with indications of what may happen to Brazil's climate over the next 50 years, should be ready by the end of 2011.

To that end, the researchers are installing, and in February will begin to run, a preliminary version of the model on the Tupã supercomputer at the Center for Weather Forecasting and Climate Studies (CPTEC) in Cachoeira Paulista (SP), with computational modules that analyze the climate phenomena occurring in the atmosphere, the ocean and the land surface.

The computational modules will gradually be integrated with other components of the model, which will assess the climate impacts of vegetation, the terrestrial carbon cycle, sea ice and atmospheric chemistry. Conversely, another component will show the influence of climate change on crops such as sugarcane, soybeans, corn and coffee.

"In the future, we may be able to estimate the productivity of sugarcane and soybeans, for example, in the face of rising greenhouse gas concentrations in the atmosphere," Sampaio said.

IPCC class

According to the scientist, since the final version of the MBSCG will only be ready in 2013, the Brazilian climate model will not be used in the IPCC's next report, AR5, due out in 2014. But the model the Intergovernmental Panel will use to run the AR5 simulations, HadGEM2, will have Brazilian input.

Through a cooperation between the Hadley Centre, in the United Kingdom, and Inpe, Brazilian researchers added to the international model computational modules that assess the impact on the global climate of smoke plumes from biomass burning and of forest fire, which until then had not been taken into account in climate projections.

The model was accordingly renamed HadGEM2-ES/Inpe. "We will run simulations that include these components we introduced into the model," Sampaio said.

In 2013, when the final version of the Brazilian Global Climate System Model is completed, the system will gain a high-spatial-resolution land-use module and a meteorological module. In the same year, the first simulations of high-resolution regional models will also be run, toward a climate model for South America with a resolution of 1 to 10 km.

"Until now, it took us months or even years to generate regional scenarios. With the new supercomputing system, regional climate modeling efforts will move to another scale," Sampaio said.

Read the report on the Brazilian climate model published by Pesquisa FAPESP magazine.

>European study will improve models for predicting the climate (FSP, JC)

JC e-mail 4187, January 27, 2011

“It is important to note that the study does not undermine the finding that Greenland is, in any case, losing more ice than it accumulates”

Marcelo Leite is a journalist. Article published in Folha de SP:

The article on Greenland's glaciers in the current issue of Nature is a good example of the complexities involved in climate predictions.

The study is also an example of how hard it is to present incremental research results to the public.

At first glance, the study weakens the idea that global warming is accelerating the contribution of Greenland's ice to the rise of sea levels.

The reasoning was plausible: with a warmer atmosphere, more melting occurs at the surface. The extra liquid becomes available to seep through crevices to the base of the glacier and lubricate its slide.

Now, thanks to the British and Belgian group, it is known that the phenomenon involves a threshold effect.

Up to a certain point of warming, surface melting causes acceleration. Beyond that point, the water starts to drain more efficiently, without lubricating the base of the glacier. Fewer giant blocks break off.

That does not mean the effect on the Greenland ice sheet is nil. Some temperature increase does in fact accelerate its breakup.

What the study calls into question (until new research confirms or refutes it) is the hypothesis that a continuous rise in temperature will produce an ever-growing, linear loss of ice mass.
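As a toy illustration of such a threshold response, basal sliding can be modeled as speeding up with surface melt until a drainage threshold is crossed, after which efficient channels carry the water away. The functional form and numbers below are invented for illustration; they are not taken from the Nature paper.

    # Toy threshold model of glacier sliding versus surface melt; the
    # functional form and numbers are invented, not taken from the study.
    def sliding_speedup(melt, threshold=1.0):
        """Relative basal-sliding speed-up as a function of surface melt.
        Below the threshold, meltwater lubricates the bed; above it,
        efficient drainage removes the water and the speed-up decays."""
        if melt <= threshold:
            return melt / threshold
        return threshold / melt

    for m in (0.25, 0.5, 1.0, 2.0, 4.0):
        print(f"melt = {m:.2f}: speed-up factor {sliding_speedup(m):.2f}")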

With this knowledge, the computer models used to predict the climate's behavior will be improved. But it is important to note that the study does not undermine the finding that Greenland is, in any case, losing more ice than it accumulates.
(Folha de SP, Jan. 27)

>The essence of physical reality (JC, FSP)

JC e-mail 4184, January 24, 2011

“We do not see what happens at the essence of physical reality. We have only our experiments, and they give us an incomplete picture of what is going on”

Marcelo Gleiser is a professor of theoretical physics at Dartmouth College, in Hanover (USA). Article published in Folha de SP:

We live in a quantum world. It may not be obvious, but beneath our experience of the real (continuous and orderly) there is another reality, one that obeys very different rules. The question, then, is how to connect the two; that is, how to start from things that are not even “things” (in the sense of having no spatial extension, like a chair or a car) and arrive at chairs and cars.

I often use the image of the “beach seen from afar” to illustrate the transition from quantum reality to our daily life: from a distance, the beach looks continuous. Up close, we see its discontinuity, the granularity of the sand. The image works until we pick up a grain of sand. We do not see its quantum essence, because each grain is made of trillions of billions of atoms. At those numbers, a grain is an “ordinary,” or “classical,” object.

Hence we do not see what happens at the essence of physical reality. We have only our experiments, and they give us an incomplete picture of what is going on.

Quantum mechanics (QM) revolves around the Uncertainty Principle (UP). In practice, the UP imposes a fundamental limit on how much we can know about the particles that make up the world. That does not mean QM is imprecise; on the contrary, it is the most precise theory there is, explaining the results of experiments at the atomic level and underpinning the digital technology that defines modern society.
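The limit Gleiser refers to is Heisenberg's inequality, which bounds how precisely position and momentum can be known at the same time:

    \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}

where \Delta x and \Delta p are the uncertainties in position and momentum and \hbar is the reduced Planck constant.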

The problem with QM is not what we know about it but what we do not. And because many quantum phenomena defy our intuition, there is a certain tension among physicists over its interpretation. QM establishes a relationship between observer and observed that does not exist in everyday life. A table is a table whether or not we look at it. In the quantum world, we cannot say an electron exists until a detector interacts with it and determines its energy or position.

Since we define reality by what exists, QM seems to establish that the detecting device is responsible for defining reality. And since that device is built by us, it is the human mind that determines reality.

Two consequences follow. First, the mind comes to occupy a central position in our conception of the real. Second, since what we measure comes in the form of acquired information, information becomes the scaffolding of what we call reality. Several scientists, serious and less so, see a kind of teleology here: if we exist in a cosmos that was capable of generating the human mind, perhaps the cosmos has the goal of creating such minds; in other words, the cosmos becomes a sort of god!

We must be very careful with this kind of reasoning. First, because for practically all of its existence (13.7 billion years) there was no mind whatsoever in the cosmos, and even without minds things proceeded perfectly well. Second, because life, especially intelligent life, is rare. Third, because information results from using reason to decode the properties of matter.

To attribute to information an existence prior to matter makes no sense, in my view. There is no doubt that QM has its mysteries. But it is worth remembering that it is a construction of the human mind.
(Folha de SP, Jan. 23)

>Panic may feed public skepticism about global warming (FSP, JC)

A climate of alarmism, article by Marcelo Leite

JC e-mail 4180, January 18, 2011

Marcelo Leite is a journalist. Article published in Folha de SP Online:

The latest hecatomb in the mountain region of Rio suggests that predictions of unspeakable future disasters brought on by climate change may not be as exaggerated as global-warming skeptics claim. Everything depends on causally connecting this kind of disaster, and its frequency, to the climate models' predictions that a warmer atmosphere will bring more extreme weather events like these, which is no trivial thing to do.

Presupposing such a connection, however, has often been criticized by climate researchers. There is no planetary tragedy about to strike all at once, as the film “The Day After Tomorrow” fantasized. Now a study in applied psychology corroborates that view, finding that alarmist messages about climate change can be counterproductive and feed public skepticism about global warming.

The study, by Matthew Feinberg and Robb Willer, was published in December in the journal Psychological Science. The authors used two experiments to “prove” that alarmist messages do increase skepticism, because they contradict people's tendency to believe that the world is just.

If climate change will also kill, impoverish or harm innocent people, like the children drowned in mud in Rio, a natural reaction is to doubt that global warming is real. I read the paper quickly and the two experiments did not convince me much, but readers are invited to form their own opinion.

The thesis, though, is a good one. It really is astonishing how capable many people are of not seeing, or not wanting to see, how abundant the scientific indications are that climate change is indeed under way.

One explanation, obviously, is political-ideological. Many choose not to believe in global warming because they think it is a socialist conspiracy to extinguish entrepreneurial freedom (through regulation) or the individual freedom to drive big diesel SUVs; but there are also socialists and communists, as in Brazil, convinced that the conspiracy belongs to American imperialists bent on blocking the development of emerging countries like Brazil.

Those who react irrationally and psychologically to alarmism, or ideologically and clumsily to conspiratorial ghosts, will find reason aplenty to grow even more skeptical at the website Global Warning (a pun on “global warming”).

The site is an effort to link global warming to threats to United States domestic security, from military bases at risk of flooding to dependence on imported fossil fuels; in other words, to reach the average conservative, Republican American and reduce his skepticism about the phenomenon.

If Feinberg and Willer are right, the shot will backfire. And Brazil's skeptical socialists will drool a little more with rage at the imperialists.
(Folha de SP Online, Jan. 17)

>Climate helped bring down the Roman Empire (FSP, JC)

JC e-mail 4178, January 14, 2011

Study shows that, from the 3rd century on, the Empire turned colder and drier and its agriculture collapsed

It is not the answer entrance-exam graders expect, but a new study says that among the factors that brought the Roman Empire down was a change in climate.

Researchers at Harvard University and several European institutions showed that at the height of Rome's expansion the climate was warm and rainy. Such a climate strengthens agriculture and thus helps feed large armies, besides permitting a thriving economy and averting internal discontent.

That was the situation around A.D. 100, when the Mediterranean became a “Roman lake” and the Empire set foot as far as the north of present-day England, where in A.D. 126 it completed Hadrian's Wall to keep enemies out.

At some point, though, the prosperity ended. From the middle of the 3rd century on, climatic changes made the Roman Empire drier and colder.

According to the international group of researchers, who published their conclusions in Science, this certainly affected food production and may have aggravated causes traditionally linked to Rome's decline, such as inflation.

Mistaken monetary policies certainly helped worsen the economic crisis, they say, but that is no reason, in the words of Jan Esper of Johannes Gutenberg University (Germany), to “follow the common belief that civilizations are insulated from environmental variations.”

To learn what the climate was like so long ago, the scientists analyzed 9,000 large pieces of ancient wood, most of them from the remains of buildings and wooden artifacts in Europe.

Each year creates a unique ring in a tree's trunk. Patiently, the scientists worked their way backward, comparing ever older pieces of wood.

The challenge was to build a historical sequence of trunks: whenever there was no way to go further back in one sample's rings, an older piece of wood was needed to carry the sequence on.

From the thickness of those rings it is possible to tell how much rain fell and whether a given year was cold or hot.
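The chaining described above is the core of cross-dating. Here is a minimal sketch of the matching step, assuming ring-width series stored as NumPy arrays; the function and its parameters are illustrative, not the researchers' actual method.

    # Illustrative cross-dating step: find the overlap length at which the
    # earliest rings of a newer sample best correlate with the latest rings
    # of an older sample. Not the study's actual algorithm.
    import numpy as np

    def best_overlap(newer, older, min_overlap=30):
        best_r, best_len = -1.0, None
        for n in range(min_overlap, min(len(newer), len(older)) + 1):
            a = newer[:n]   # earliest rings of the newer sample
            b = older[-n:]  # latest rings of the older sample
            r = np.corrcoef(a, b)[0, 1]
            if r > best_r:
                best_r, best_len = r, n
        return best_len, best_r

    # Example: an 80-ring sample whose last 40 rings overlap a newer one.
    rng = np.random.default_rng(0)
    older = rng.normal(size=80)
    newer = np.concatenate([older[-40:], rng.normal(size=60)])
    print(best_overlap(newer, older))  # -> (40, ~1.0)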

The scientists stress that the existence of climatic changes in a period before the Industrial Revolution does not mean contemporary global warming is natural. “What is happening now is unprecedented; it is much faster,” they say.

The idea that environmental factors, more than political ones, drive societies to collapse gained strength in 2005, when the American biogeographer Jared Diamond published the book “Collapse.” In it, Diamond shows how practices such as the overexploitation of timber or of fisheries drove societies into crisis.

In 2005 there was little scientific material on how the environment had affected Rome. The Romans, at least, bore no blame for the climatic changes that struck their Empire.
(Ricardo Mioto)
(Folha de SP, Jan. 14)

>"La Niña" explica inundações em vários países do mundo (FSP, JC)

JC e-mail 4178, January 14, 2011

The phenomenon has been causing torrential rains since last year

Less well known and less frequent than El Niño, La Niña is a natural phenomenon that cools the waters of the Pacific Ocean and changes atmospheric dynamics. Like El Niño, it can impose a distinct pattern of climate behavior across the world.

The latest La Niña episode is now reaching its peak and, according to researchers, may last until the middle of this year. Its first effects, rated moderate to strong in intensity, began to be felt in mid-2010.

The phenomenon may be responsible for floods in Australia and in the Philippines, where ten people have died since the beginning of this month.

The torrential rains that killed hundreds of people in Venezuela and Colombia in November and December are also effects of La Niña.

The flooding in Pakistan last August likewise fits the phenomenon's effects.

In that country, the effects of La Niña were particularly bad. In the region, La Niña came immediately on the heels of El Niño, which tends to leave temperatures in the Indian Ocean higher than normal.

Warmer air holds more water vapor and can therefore produce more rain.
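That rule of thumb follows from the Clausius-Clapeyron relation: near surface temperatures, the saturation vapor pressure e_s of air grows by roughly 6 to 7 percent per kelvin of warming,

    \frac{1}{e_s}\frac{de_s}{dT} \;=\; \frac{L_v}{R_v T^2} \;\approx\; 0.065\ \mathrm{K^{-1}} \quad (T \approx 288\ \mathrm{K}),

where L_v is the latent heat of vaporization and R_v the specific gas constant of water vapor.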

“La Niña's high precipitation patterns, combined with the heat left over from El Niño, help explain why the floods in Pakistan were so devastating,” says Kevin Trenberth of the National Center for Atmospheric Research, in the United States.

Warmer seas around Australia may also explain the scale of the current floods.

Because of the warmer waters, the floods, compounded by El Niño's lingering heat, are likely to worsen soon. And those are not the only damages the phenomenon can cause. In the coming months, La Niña may claim more victims in other parts of the world.

According to a study by the Red Cross and the International Research Institute for Climate and Society, heavy rains can be expected in northern South America and southwestern Africa over the next two months.

In February 2000, the devastating floods in Mozambique, in Africa, occurred precisely when La Niña was near its peak.
(Folha de SP, Jan. 14)

>Violent El Niño gives way to a powerful La Niña and intrigues scientists (JC)

JC e-mail 4175, January 11, 2011.

A climate of extremes

When it comes to climate, the planet has been ruled by extremes: either El Niño, when the equatorial Pacific warms above average, or La Niña, when the same region turns abnormally cold. Both phenomena cause disturbances of global reach.
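A minimal sketch of how such episodes are conventionally flagged, loosely following NOAA's Oceanic Niño Index: a 3-month running mean of sea-surface-temperature anomalies in the Niño 3.4 region at or beyond ±0.5 °C marks El Niño or La Niña. The persistence rule and the data handling are simplified here for illustration.

    # Illustrative ENSO classifier, loosely following NOAA's Oceanic Niño
    # Index (ONI). The real ONI also requires five consecutive overlapping
    # seasons beyond the ±0.5 °C threshold; that rule is omitted here.
    import numpy as np

    def enso_phase(monthly_anomalies):
        smoothed = np.convolve(monthly_anomalies, np.ones(3) / 3, mode="valid")
        return ["El Niño" if x >= 0.5 else "La Niña" if x <= -0.5 else "neutral"
                for x in smoothed]

    # Example: a 2010-like swing from warm to cold conditions.
    print(enso_phase([1.5, 1.3, 0.9, 0.4, -0.2, -0.9, -1.4, -1.5]))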

Swings from one extreme to the other have always occurred. What worries specialists is the speed at which they have been happening. In the past 60 years, only once, in 1972-1973, was one of these phenomena immediately succeeded by the other. Now the drastic shift has occurred again, in 2010-2011, intriguing specialists.

Until the middle of last year, the planet was warming under the influence of El Niño. The phenomenon is why 2010 ended as the hottest year since meteorological records began, in 1850. Soon afterward, to the surprise of researchers, the climate cooled under La Niña, which is expected to last until April.

“There is no definitive explanation for such a large fluctuation of temperatures in the Pacific,” admits Paulo Cesar Rosman, professor of Oceanic and Coastal Engineering at Coppe. “Some researchers attribute the phenomenon to geological issues, to volcanism, but no hypothesis has been confirmed so far.”

Last year's transition was abnormal

The answer should come from the World Meteorological Organization (WMO), in a report scheduled for release in the coming weeks. The institution's researchers spent the past months in South America, analyzing the continent's climatic changes and extreme events. The record of Pacific temperature measurements over the past decade was also reviewed.

“The transition from one of these phenomena to the other is not normal, not in such a quick way,” notes Ghassem Asrar, co-director of the WMO's Research Department and director of the World Climate Research Programme. “Normally El Niño and La Niña are separated by significant periods of neutral conditions, that is, years in which the water temperature in the equatorial Pacific is neither very warm nor very cold.”

According to the latest report of the Intergovernmental Panel on Climate Change (IPCC), from 2007, there is no indication that the climate's two “children” will grow in frequency or extent this century. Their main consequences, however, may show up more often.

“Extreme events may well increase, whether or not they are associated with El Niño and La Niña,” Asrar warns. “Heat waves and severe rains, for example, will become more and more frequent.”

Devastating floods in Australia's summer

The most recent climate catastrophe is associated with La Niña. In a statement released last week, the Australian government blamed the phenomenon for the floods in the state of Queensland, which have already left five dead and 75,000 people cut off by flooded roads, railways and airports. Last year, when the current La Niña began, was the third-rainiest in the country's history.

“Although the intensity of El Niño and La Niña is the most important factor in assessing climate risks, extreme events can develop as a consequence of other interactions between the oceans and the atmosphere,” notes Rupa Kumar Kolli, a WMO researcher and a specialist in the two phenomena. “Not everything depends on the temperature of the equatorial Pacific.”

By breaking the normal patterns of atmospheric circulation, El Niño and La Niña alter each region's typical climate patterns. Their force is even more visible in tropical regions. Brazil, a country of continental dimensions, receives clear signals from both phenomena.

“La Niña brings rain to the Northeast, which is good; in exchange, the South goes dry. Under El Niño, the opposite happens,” Rosman explains. “Even ocean swells follow one of these patterns. During El Niño, Posto 5, in Copacabana, disappears. The Marina da Glória is overrun by waves. The phenomena are slow, progressive and cumulative. The trouble is that people only notice the problems when they reach their peak. When the seawall at Praia do Arpoador collapses, for example.”

In Rio, La Niña will make it easier for cold fronts to move in this summer. Even so, according to the National Institute for Space Research, temperatures may stay above the historical average. Rainfall should be within the normal range, but there will be “great spatial and temporal variability in its distribution”; in other words, precipitation may come more concentrated (and therefore more severe), and dry spells are not ruled out.

A thousand years of greenhouse effect

A study published yesterday in the journal Nature Geoscience shows that the buildup of greenhouse gases in the atmosphere will cause uninterrupted effects on the global climate for at least a thousand years. The event will last long enough to cause the collapse of the West Antarctic ice sheet by the year 3000, raising ocean levels by at least four meters.

This is the first climate-model simulation to make predictions over so long a period. To work with a scenario 1,000 years from now, the researchers adopted rather optimistic assumptions, namely greenhouse-gas emissions falling to zero from a given moment onward, a moment that varied between 2010 and 2100.

“We created a series of scenarios,” explains Shawn Marshall, a geography professor at the University of Calgary, in Canada, who carried out the research with the Canadian Centre for Climate Modelling and Analysis. “What if we completely stopped using fossil fuels and emitted no more carbon dioxide into the atmosphere? How long would it take us to reverse the current patterns of climate change? Would the situation get worse first?”

Another study published by the journal indicates that mountain glaciers may lose between 15 and 27 percent of their volume by the end of the century, which would strongly affect water availability for urban centers. Among the most affected regions are New Zealand, which could lose 72 percent of its glaciers, and the Alps (75 percent). The melt would raise the mean sea level by 12 centimeters.
(Renato Grandelle)
(O Globo, Jan. 11)