Knowledge is not a determining factor in shaping opinions about science (FAPESP)

Moral and political codes may influence attitudes toward scientific and technological issues far more than knowledge does, researcher says (FAPESP)

14/11/2012

By Elton Alisson

Agência FAPESP – Surveys of the public perception of science and technology conducted in various countries, including Brazil, to gauge citizens' opinions on scientific and technological topics face the challenge of explaining which factors shape attitudes, interest and engagement with these subjects.

That is because, of the set of indicators these surveys use to analyze which factors matter most in shaping citizens' interest in and attitudes toward science and technology, such as income, age and level of education, none comes close to explaining the variability of the responses.

"There is some other variable we are not measuring that determines people's attitudes toward science and technology in general," said Juri Castelfranchi, a professor in the Department of Sociology and Anthropology at the Faculty of Philosophy and Human Sciences (Fafich) of the Federal University of Minas Gerais (UFMG), during a lecture on the interpretive and methodological challenges of studying the public perception of science and technology, delivered on October 27 at the 2nd Empírika International Seminar.

Held on October 26 and 27 at the Institute of Language Studies (IEL) of the University of Campinas (Unicamp), the event was part of the program of the Ibero-American Science, Technology and Innovation Fair (Empírika).

According to Castelfranchi, one reason these surveys have difficulty determining which processes shape public opinion on the subject is that they rest on "the ill-founded hypothesis that people's attitudes toward scientific and technological issues are modulated by how much they know about those topics."

Traditionally, Castelfranchi said, most studies of why people accept or reject a line of scientific research or a new technology have focused on respondents' interest, knowledge and attitudes regarding science and technology, on the assumption that these three aspects are linked.

On that view, people who are not interested would have little information and would tend, in general, to hold more negative attitudes toward science and technology. Conversely, stimulating their interest in scientific and technological topics would improve their knowledge of these fields and, as a result, their attitudes toward science and technology would become more positive.

Field research, however, has shown that these premises are false and that the real picture is far more complex than this model, now discredited, suggested.

In general, recent studies in the field find that a large share of the population is quite interested in science and technology, but that this interest does not translate into seeking out information.

"There are audiences with little schooling, especially in developing countries, who know little about science and do not seek information about it, yet hold quite positive attitudes toward science and technology," Castelfranchi said.

"On the other hand, some studies have found that it is not true that attitudes become more positive as knowledge increases. In some cases the opposite happens: people tend to become more cautious and critical," he said.

The knowledge-versus-attitude paradox

According to Castelfranchi, one example of this apparent contradiction, dubbed the "knowledge-versus-attitude paradox," is the debate over transgenic crops in Europe.

Europe, one of the world's biggest investors in science and technology, declared a moratorium on transgenic foods in the early 2000s after intense debate between segments of society for and against the technology, a debate driven by emotional appeals and by arguments more economic and political than scientific.

A survey conducted in 1998 and repeated in 2010 across the European Community on Europeans' knowledge of and attitudes toward biotechnology applications, including transgenic foods and vaccines, found that risk was not the decisive factor in whether the public rejected the new technology.

In many cases, respondents said that certain biotechnology applications were dangerous but also useful and morally acceptable, and should be encouraged. In other cases, participants judged certain applications, transgenics among them, as not especially dangerous but politically and morally questionable, which led them to reject the technology.

"It was not risk that mattered most in the rejection of transgenics in Europe, but political considerations, among them the fact that the technology is controlled by multinationals and patented, and that European countries opposed monocultures," Castelfranchi said.

The survey also showed that European citizens with less knowledge did not reject transgenics; they simply had no settled opinion about them. Participants with more schooling, by contrast, held more definite opinions, whether favorable or unfavorable.

"Knowledge did not change European citizens' attitudes toward transgenics; it made those attitudes more definite, much as can be observed in Brazil and in other Ibero-American countries where similar surveys have been conducted," Castelfranchi said.

In the most recent Public Perception of Science and Technology survey ("Percepção pública da ciência e tecnologia"), conducted in late 2010 by Brazil's Ministry of Science, Technology and Innovation (MCTI) with more than 2,000 people across the country, no specific group, at any social or educational level, answered that technologies bring more harm than benefit when asked.

Yet the participants most familiar with scientists and research institutions were precisely the ones who most often declared that scientists can be dangerous because of the knowledge they hold.

"There is no association between low schooling and thinking that science is dangerous. Quite the contrary: highly educated people tend to take a more cautious stance regarding both the benefits and the harms presented by science and technology," Castelfranchi said.

Moral and political values

In Brazil's case, one relevant factor shaping attitudes toward science and technology, identified by Castelfranchi and other researchers who analyzed the MCTI survey data, is the size of the city where respondents live.

The researchers found that participants living in large Brazilian cities tend to weigh the pros and cons of technoscientific development when asked whether science and technology bring only benefits or also harms. People living in small towns, by contrast, are slightly more likely to say that science brings only benefits.

Still, neither this variable nor any other, such as respondents' sex, can by itself explain the variability in answers about whether science and technology bring more benefits or harms.

"None of the factors analyzed so far implies that people hold a more optimistic or pessimistic view of science and technology. There are other factors, which we have yet to discover, that influence this answer," Castelfranchi said.

One hypothesis the researcher raises is that people's moral and political codes, such as religion, may do more than their level of knowledge to shape their opinions on specific aspects of science and technology.

Among participants in the MCTI survey on the public perception of science and technology, those who declared themselves Catholic agreed more often than evangelicals with one of the study's statements: that, because of their knowledge, scientists have powers that make them dangerous, and that science must be subject to social control.

"People's life trajectory and orientation, and their moral values, probably exert a far greater influence in modulating their attitudes toward science and technology in general, and toward specific aspects of research, than their level of knowledge does," Castelfranchi said.

To test this hypothesis, the researcher said, new qualitative and quantitative methodologies must be developed, along with large amounts of ethnographic observation, to examine how people position themselves with respect to science and technology, abandoning the idea that this depends solely on their level of knowledge.

"We need to renew our research methodologies and the way we look at and interpret data from public-perception surveys in order to understand how people make sense of and form opinions on scientific and technological issues, so that we have a dynamic view of how their attitudes take shape," Castelfranchi said.

How To Think About Science, Part 1 – 24 (CBC)

Friday, January 2, 2009

If science is neither cookery, nor angelic virtuosity, then what is it?
Modern societies have tended to take science for granted as a way of knowing, ordering and controlling the world. Everything was subject to science, but science itself largely escaped scrutiny. This situation has changed dramatically in recent years. Historians, sociologists, philosophers and sometimes scientists themselves have begun to ask fundamental questions about how the institution of science is structured and how it knows what it knows. David Cayley talks to some of the leading lights of this new field of study.

Episode Guide

Episode 1 – Steven Shapin and Simon Schaffer
Episode 2 – Lorraine Daston

Episode 3 – Margaret Lock
Episode 4 – Ian Hacking and Andrew Pickering
Episode 5 – Ulrich Beck and Bruno Latour
Episode 6 – James Lovelock
Episode 7 – Arthur Zajonc
Episode 8 – Wendell Berry
Episode 9 – Rupert Sheldrake
Episode 10 – Brian Wynne
Episode 11 – Sajay Samuel
Episode 12 – David Abram
Episode 13 – Dean Bavington
Episode 14 – Evelyn Fox Keller
Episode 15 – Barbara Duden and Silya Samerski
Episode 16 – Steven Shapin
Episode 17 – Peter Galison
Episode 18 – Richard Lewontin
Episode 19 – Ruth Hubbard
Episode 20 – Michael Gibbons, Peter Scott, & Janet Atkinson Grosjean
Episode 21 – Christopher Norris and Mary Midgley
Episode 22 – Allan Young
Episode 23 – Lee Smolin
Episode 24 – Nicholas Maxwell

It’s Global Warming, Stupid (Bloomberg)

November 01, 2012

http://www.businessweek.com/articles/2012-11-01/its-global-warming-stupid

Yes, yes, it’s unsophisticated to blame any given storm on climate change. Men and women in white lab coats tell us—and they’re right—that many factors contribute to each severe weather episode. Climate deniers exploit scientific complexity to avoid any discussion at all.

Clarity, however, is not beyond reach. Hurricane Sandy demands it: At least 40 U.S. deaths. Economic losses expected to climb as high as $50 billion. Eight million homes without power. Hundreds of thousands of people evacuated. More than 15,000 flights grounded. Factories, stores, and hospitals shut. Lower Manhattan dark, silent, and underwater.

An unscientific survey of the social networking literature on Sandy reveals an illuminating tweet (you read that correctly) from Jonathan Foley, director of the Institute on the Environment at the University of Minnesota. On Oct. 29, Foley thumbed thusly: “Would this kind of storm happen without climate change? Yes. Fueled by many factors. Is storm stronger because of climate change? Yes.” Eric Pooley, senior vice president of the Environmental Defense Fund (and former deputy editor of Bloomberg Businessweek), offers a baseball analogy: “We can’t say that steroids caused any one home run by Barry Bonds, but steroids sure helped him hit more and hit them farther. Now we have weather on steroids.”

In an Oct. 30 blog post, Mark Fischetti of Scientific American took a spin through Ph.D.-land and found more and more credentialed experts willing to shrug off the climate caveats. The broadening consensus: “Climate change amps up other basic factors that contribute to big storms. For example, the oceans have warmed, providing more energy for storms. And the Earth’s atmosphere has warmed, so it retains more moisture, which is drawn into storms and is then dumped on us.” Even those of us who are science-phobic can get the gist of that.

Sandy featured a scary extra twist implicating climate change. An Atlantic hurricane moving up the East Coast crashed into cold air dipping south from Canada. The collision supercharged the storm’s energy level and extended its geographical reach. Pushing that cold air south was an atmospheric pattern, known as a blocking high, above the Arctic Ocean. Climate scientists Charles Greene and Bruce Monger of Cornell University, writing earlier this year in Oceanography, provided evidence that Arctic icemelts linked to global warming contribute to the very atmospheric pattern that sent the frigid burst down across Canada and the eastern U.S.

If all that doesn’t impress, forget the scientists ostensibly devoted to advancing knowledge and saving lives. Listen instead to corporate insurers committed to compiling statistics for profit.

On Oct. 17 the giant German reinsurance company Munich Re issued a prescient report titled Severe Weather in North America. Globally, the rate of extreme weather events is rising, and “nowhere in the world is the rising number of natural catastrophes more evident than in North America.” From 1980 through 2011, weather disasters caused losses totaling $1.06 trillion. Munich Re found “a nearly quintupled number of weather-related loss events in North America for the past three decades.” By contrast, there was “an increase factor of 4 in Asia, 2.5 in Africa, 2 in Europe, and 1.5 in South America.” Human-caused climate change “is believed to contribute to this trend,” the report said, “though it influences various perils in different ways.”

Global warming “particularly affects formation of heat waves, droughts, intense precipitation events, and in the long run most probably also tropical cyclone intensity,” Munich Re said. This July was the hottest month recorded in the U.S. since record-keeping began in 1895, according to the National Oceanic and Atmospheric Administration. The U.S. Drought Monitor reported that two-thirds of the continental U.S. suffered drought conditions this summer.

Granted, Munich Re wants to sell more reinsurance (backup policies purchased by other insurance companies), so maybe it has a selfish reason to stir anxiety. But it has no obvious motive for fingering global warming vs. other causes. “If the first effects of climate change are already perceptible,” said Peter Hoppe, the company’s chief of geo-risks research, “all alerts and measures against it have become even more pressing.”

Which raises the question of what alerts and measures to undertake. In his book The Conundrum, David Owen, a staff writer at The New Yorker, contends that as long as the West places high and unquestioning value on economic growth and consumer gratification—with China and the rest of the developing world right behind—we will continue to burn the fossil fuels whose emissions trap heat in the atmosphere. Fast trains, hybrid cars, compact fluorescent light bulbs, carbon offsets—they’re just not enough, Owen writes.

Yet even he would surely agree that the only responsible first step is to put climate change back on the table for discussion. The issue was MIA during the presidential debates and, regardless of who wins on Nov. 6, is unlikely to appear on the near-term congressional calendar. After Sandy, that seems insane.

Mitt Romney has gone from being a supporter years ago of clean energy and emission caps to, more recently, a climate agnostic. On Aug. 30, he belittled his opponent’s vow to arrest climate change, made during the 2008 presidential campaign. “President Obama promised to begin to slow the rise of the oceans and heal the planet,” Romney told the Republican National Convention in storm-tossed Tampa. “My promise is to help you and your family.” Two months later, in the wake of Sandy, submerged families in New Jersey and New York urgently needed some help dealing with that rising-ocean stuff.

Obama and his strategists clearly decided that in a tight race during fragile economic times, he should compete with Romney by promising to mine more coal and drill more oil. On the campaign trail, when Obama refers to the environment, he does so only in the context of spurring “green jobs.” During his time in office, Obama has made modest progress on climate issues. His administration’s fuel-efficiency standards will reduce by half the amount of greenhouse gas emissions from new cars and trucks by 2025. His regulations and proposed rules to curb mercury, carbon, and other emissions from coal-fired power plants are forcing utilities to retire some of the dirtiest old facilities. And the country has doubled the generation of energy from renewable sources such as solar and wind.

Still, renewable energy accounts for less than 15 percent of the country’s electricity. The U.S. cannot shake its fossil fuel addiction by going cold turkey. Offices and factories can’t function in the dark. Shippers and drivers and air travelers will not abandon petroleum overnight. While scientists and entrepreneurs search for breakthrough technologies, the next president should push an energy plan that exploits plentiful domestic natural gas supplies. Burned for power, gas emits about half as much carbon as coal. That’s a trade-off already under way, and it’s worth expanding. Environmentalists taking a hard no-gas line are making a mistake.

Conservatives champion market forces—as do smart liberals—and financial incentives should be part of the climate agenda. In 2009 the House of Representatives passed cap-and-trade legislation that would have rewarded more nimble industrial players that figure out how to use cleaner energy. The bill died in the Senate in 2010, a victim of Tea Party-inspired Republican obstructionism and Obama’s decision to spend his political capital to push health-care reform.

Despite Republican fanaticism about all forms of government intervention in the economy, the idea of pricing carbon must remain a part of the national debate. One politically plausible way to tax carbon emissions is to transfer the revenue to individuals. Alaska, which pays dividends to its citizens from royalties imposed on oil companies, could provide inspiration (just as Romneycare in Massachusetts pointed the way to Obamacare).

Ultimately, the global warming crisis will require global solutions. Washington can become a credible advocate for moving the Chinese and Indian economies away from coal and toward alternatives only if the U.S. takes concerted political action. At the last United Nations conference on climate change in Durban, South Africa, the world’s governments agreed to seek a new legal agreement that binds signatories to reduce their carbon emissions. Negotiators agreed to come up with a new treaty by 2015, to be put in place by 2020. To work, the treaty will need to include a way to penalize countries that don’t meet emission-reduction targets—something the U.S. has until now refused to support.

If Hurricane Sandy does nothing else, it should suggest that we need to commit more to disaster preparation and response. As with climate change, Romney has displayed an alarmingly cavalier attitude on weather emergencies. During one Republican primary debate last year, he was asked point-blank whether the functions of the Federal Emergency Management Agency ought to be turned back to the states. “Absolutely,” he replied. Let the states fend for themselves or, better yet, put the private sector in charge. Pay-as-you-go rooftop rescue service may appeal to plutocrats; when the flood waters are rising, ordinary folks welcome the National Guard.

It’s possible Romney’s kill-FEMA remark was merely a pander to the Right, rather than a serious policy proposal. Still, the reconfirmed need for strong federal disaster capability—FEMA and Obama got glowing reviews from New Jersey Governor Chris Christie, a Romney supporter—makes the Republican presidential candidate’s campaign-trail statement all the more reprehensible.

The U.S. has allowed transportation and other infrastructure to grow obsolete and deteriorate, which poses a threat not just to public safety but also to the nation’s economic health. With once-in-a-century floods now occurring every few years, New York Governor Andrew Cuomo and New York City Mayor Michael Bloomberg said the country’s biggest city will need to consider building surge protectors and somehow waterproofing its enormous subway system. “It’s not prudent to sit here and say it’s not going to happen again,” Cuomo said. “I believe it is going to happen again.”

David Rothkopf, the chief executive and editor-at-large of Foreign Policy, noted in an Oct. 29 blog post that Sandy also brought his hometown, Washington, to a standstill, impeding affairs of state. To lessen future impact, he suggested burying urban and suburban power lines, an expensive but sensible improvement.

Where to get the money? Rothkopf proposed shifting funds from post-Sept. 11 bureaucratic leviathans such as the Department of Homeland Security, which he alleges is shot through with waste. In truth, what’s lacking in America’s approach to climate change is not the resources to act but the political will to do so. A Pew Research Center poll conducted in October found that two-thirds of Americans say there is “solid evidence” the earth is getting warmer. That’s down 10 points since 2006. Among Republicans, more than half say it’s either not a serious problem or not a problem at all.

Such numbers reflect the success of climate deniers in framing action on global warming as inimical to economic growth. This is both shortsighted and dangerous. The U.S. can’t afford regular Sandy-size disruptions in economic activity. To limit the costs of climate-related disasters, both politicians and the public need to accept how much they’re helping to cause them.

Far from random, evolution follows a predictable genetic pattern, Princeton researchers find (Princeton)

Posted October 25, 2012; 12:00 p.m.

by Morgan Kelly, Office of Communications

Evolution, often perceived as a series of random changes, might in fact be driven by a simple and repeated genetic solution to an environmental pressure that a broad range of species happen to share, according to new research.

Princeton University research published in the journal Science suggests that knowledge of a species’ genes — and how certain external conditions affect the proteins encoded by those genes — could be used to determine a predictable evolutionary pattern driven by outside factors. Scientists could then pinpoint how the diversity of adaptations seen in the natural world developed even in distantly related animals.

The Princeton researchers sequenced the expression of a poison-resistant protein in insect species that feed on plants such as milkweed and dogbane that produce a class of steroid-like cardiotoxins called cardenolides as a natural defense. The insects surveyed spanned three orders: butterflies and moths (Lepidoptera); beetles and weevils (Coleoptera); and aphids, bed bugs, milkweed bugs and other sucking insects (Hemiptera). Above: Dogbane beetle (Photo courtesy of Peter Andolfatto)

“Is evolution predictable? To a surprising extent the answer is yes,” said senior researcher Peter Andolfatto, an assistant professor in Princeton’s Department of Ecology and Evolutionary Biology and the Lewis-Sigler Institute for Integrative Genomics. He worked with lead author and postdoctoral research associate Ying Zhen, and graduate students Matthew Aardema and Molly Schumer, all from Princeton’s ecology and evolutionary biology department, as well as Edgar Medina, a biological sciences graduate student at the University of the Andes in Colombia.

The researchers carried out a survey of DNA sequences from 29 distantly related insect species, the largest sample of organisms yet examined for a single evolutionary trait. Fourteen of these species have evolved a nearly identical characteristic due to one external influence — they feed on plants that produce cardenolides, a class of steroid-like cardiotoxins that are a natural defense for plants such as milkweed and dogbane.

Though separated by 300 million years of evolution, these diverse insects — which include beetles, butterflies and aphids — experienced changes to a key protein called sodium-potassium adenosine triphosphatase, or the sodium-potassium pump, which regulates a cell’s crucial sodium-to-potassium ratio. The protein in these insects eventually evolved a resistance to cardenolides, which usually cripple the protein’s ability to “pump” potassium into cells and excess sodium out.

Lead author Ying Zhen (foreground), Andolfatto (far left), fourth author and graduate student Molly Schumer (near left), and their co-authors sequenced and assembled all the expressed genes in 29 distantly related insect species, the largest sample of organisms yet examined for a single evolutionary trait. They used these sequences to predict how a certain protein would be encoded in the genes of 14 distantly related species that evolved a similar resistance to toxic plants. Similar techniques could be used to trace protein changes in a species’ DNA to understand how many diverse organisms evolved as a result of environmental factors. At right is research assistant Ilona Ruhl, who was not involved in the research. (Photo by Denise Applewhite)

Andolfatto and his co-authors first sequenced and assembled all the expressed genes in the studied species. They used these sequences to predict how the sodium-potassium pump would be encoded in each of the species’ genes based on cardenolide exposure.

Scientists using similar techniques could trace protein changes in a species’ DNA to understand how many diverse organisms evolved as a result of environmental factors, Andolfatto said. “To apply this approach more generally a scientist would have to know something about the genetic underpinnings of a trait and investigate how that trait evolves in large groups of species facing a common evolutionary problem,” Andolfatto said.

“For instance, the sodium-potassium pump also is a candidate gene location related to salinity tolerance,” he said. “Looking at changes to this protein in the right organisms could reveal how organisms have or may respond to the increasing salinization of oceans and freshwater habitats.”

Milkweed tussock moth (Photo courtesy of Peter Andolfatto)

Jianzhi Zhang, a University of Michigan professor of ecology and evolutionary biology, said that the Princeton-based study shows that certain traits have a limited number of molecular mechanisms, and that numerous, distinct species can share the few mechanisms there are. As a result, it is likely that a cross-section of certain organisms can provide insight into the development of other creatures, he said.

“The finding of parallel evolution in not two, but numerous herbivorous insects increases the significance of the study because such frequent parallelism is extremely unlikely to have happened simply by chance,” said Zhang, who is familiar with the study but had no role in it.

“It shows that a common molecular mechanism is used by many different insects to defend themselves against the toxins in their food, suggesting that perhaps the number of potential mechanisms for achieving this goal is very limited,” he said. “That many different insects independently evolved the same molecular tricks to defend themselves against the same toxin suggests that studying a small number of well-chosen model organisms can teach us a lot about other species. Yes, evolution is predictable to a certain degree.”

Andolfatto and his co-authors examined the sodium-potassium pump protein because of its well-known sensitivity to cardenolides. In order to function properly in a wide variety of physiological contexts, cells must be able to control levels of potassium and sodium. Situated on the cell membrane, the protein generates a desired potassium to sodium ratio by “pumping” three sodium atoms out of the cell for every two potassium atoms it brings in.

Cardenolides disrupt the exchange of potassium and sodium, essentially shutting down the protein, Andolfatto said. The human genome contains four copies of the pump protein, and it is a candidate gene for a number of human genetic disorders, including salt-sensitive hypertension and migraines. In addition, humans have long used low doses of cardenolides medicinally for purposes such as controlling heart arrhythmia and congestive heart failure.

Large milkweed bugs (Photo courtesy of Peter Andolfatto)

The Princeton researchers used the DNA microarray facility in the University’s Lewis-Sigler Institute for Integrative Genomics to sequence the expression of the sodium-potassium pump protein in insect species spanning three orders: butterflies and moths (Lepidoptera); beetles and weevils (Coleoptera); and aphids, bed bugs, milkweed bugs and other sucking insects (Hemiptera).

The researchers found that the genes of cardenolide-resistant insects incorporated various mutations that allowed them to resist the toxin. During the evolutionary timeframe examined, the sodium-potassium pump of insects feeding on dogbane and milkweed underwent 33 mutations at sites known to affect sensitivity to cardenolides. These mutations often involved similar or identical amino-acid changes that reduced susceptibility to the toxin. On the other hand, the sodium-potassium pump mutated just once in insects that do not feed on these plants.

Significantly, the researchers found that multiple gene duplications occurred in the ancestors of several of the resistant species. These insects essentially wound up with one conventional sodium-potassium pump protein and one “experimental” version, Andolfatto said. In these insects, the newer, hardier versions of the sodium-potassium pump are mostly expressed in gut tissue where they are likely needed most.

“These gene duplications are an elegant solution to the problem of adapting to environmental changes,” Andolfatto said. “In species with these duplicates, the organism is free to experiment with one copy while keeping the other constant, avoiding the risk that the new version of the protein will not perform its primary job as well.”

The researchers’ findings unify the generally separate ideas of what predominantly drives genetic evolution: protein evolution, the evolution of the elements that control protein expression, and gene duplication. This study shows that all three mechanisms can be used to solve the same evolutionary problem, Andolfatto said.

Central to the work is the breadth of species the researchers were able to examine using modern gene sequencing equipment, Andolfatto said.

“Historically, studying genetic evolution at this level has been conducted on just a handful of ‘model’ organisms such as fruit flies,” Andolfatto said. “Modern sequencing methods allowed us to approach evolutionary questions in a different way and come up with more comprehensive answers than had we examined one trait in any one organism.

“The power of what we’ve done is to survey diverse organisms facing a similar problem and find striking evidence for a limited number of possible solutions,” he said. “The fact that many of these solutions are used over and over again by completely unrelated species suggests that the evolutionary path is repeatable and predictable.”

The paper, “Parallel Molecular Evolution in an Herbivore Community,” was published Sept. 28 by Science. The research was supported by grants from the Centre for Genetic Engineering and Biotechnology, the National Science Foundation and the National Institutes of Health.

Nate Silver’s ‘Signal and the Noise’ Examines Predictions (N.Y.Times)

Mining Truth From Data Babel

By LEONARD MLODINOW

Published: October 23, 2012

A friend who was a pioneer in the computer games business used to marvel at how her company handled its projections of costs and revenue. “We performed exhaustive calculations, analyses and revisions,” she would tell me. “And we somehow always ended with numbers that justified our hiring the people and producing the games we had wanted to all along.” Those forecasts rarely proved accurate, but as long as the games were reasonably profitable, she said, you’d keep your job and get to create more unfounded projections for the next endeavor.


THE SIGNAL AND THE NOISE

Why So Many Predictions Fail — but Some Don’t

By Nate Silver

Illustrated. 534 pages. The Penguin Press. $27.95.

This doesn’t seem like any way to run a business — or a country. Yet, as Nate Silver, a blogger for The New York Times, points out in his book, “The Signal and the Noise,” studies show that from the stock pickers on Wall Street to the political pundits on our news channels, predictions offered with great certainty and voluminous justification prove, when evaluated later, to have had no predictive power at all. They are the equivalent of monkeys tossing darts.

As one who has both taught and written about such phenomena, I have long felt like leaning out my window to shout, “Network”-style, “I’m as mad as hell and I’m not going to take this anymore!” Judging by Mr. Silver’s lively prose — from energetic to outraged — I think he feels the same way.


The book’s title comes from electrical engineering, where a signal is something that conveys information, while noise is an unwanted, meaningless or random addition to the signal. Problems arise when the noise is as strong as, or stronger than, the signal. How do you recognize which is which?

Today the data we have available to make predictions has grown almost unimaginably large: it represents 2.5 quintillion bytes of data each day, Mr. Silver tells us, enough zeros and ones to fill a billion books of 10 million pages each. Our ability to tease the signal from the noise has not grown nearly as fast. As a result, we have plenty of data but lack the ability to extract truth from it and to build models that accurately predict the future that data portends.

Mr. Silver, just 34, is an expert at finding signal in noise. He is modest about his accomplishments, but he achieved a high profile when he created a brilliant and innovative computer program for forecasting the performance of baseball players, and later a system for predicting the outcome of political races. His political work had such success in the 2008 presidential election that it brought him extensive media coverage as well as a home at The Times for his blog, FiveThirtyEight.com, though some conservatives have been critical of his methods during this election cycle.

His knack wasn’t lost on book publishers, who, as he puts it, approached him “to capitalize on the success of books such as ‘Moneyball’ and ‘Freakonomics.’ ” Publishers are notorious for pronouncing that Book A will sell just a thousand copies, while Book B will sell a million, and then proving to have gotten everything right except for which was A and which was B. In this case, to judge by early sales, they forecast Mr. Silver’s potential correctly, and to judge by the friendly tone of the book, it couldn’t have happened to a nicer guy.

Healthily peppered throughout the book are answers to its subtitle, “Why So Many Predictions Fail — but Some Don’t”: we are fooled into thinking that random patterns are meaningful; we build models that are far more sensitive to our initial assumptions than we realize; we make approximations that are cruder than we realize; we focus on what is easiest to measure rather than on what is important; we are overconfident; we build models that rely too heavily on statistics, without enough theoretical understanding; and we unconsciously let biases based on expectation or self-interest affect our analysis.

Regarding why models do succeed, Mr. Silver provides just bits of advice (other than to avoid the failings listed above). Mostly he stresses an approach to statistics named after the British mathematician Thomas Bayes, who created a theory of how to adjust a subjective degree of belief rationally when new evidence presents itself.

Suppose that after reading a review, you initially believe that there is a 75 percent chance that you will like a certain book. Then, in a bookstore, you read the book’s first 10 pages. What, then, are the chances that you will like the book, given the additional information that you liked (or did not like) what you read? Bayes’s theory tells you how to update your initial guess in light of that new data. This may sound like an exercise that only a character in “The Big Bang Theory” would engage in, but neuroscientists have found that, on an unconscious level, our brains do naturally use Bayesian prediction.
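The bookstore example can be made concrete with a small calculation. The 75 percent prior comes from the text; the two likelihoods below (how often a future fan, or non-fan, enjoys the first 10 pages) are assumed purely for illustration:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | E) from a prior and two likelihoods via Bayes' theorem."""
    numerator = p_evidence_given_h * prior
    return numerator / (numerator + p_evidence_given_not_h * (1 - prior))

# Prior from the review: a 75% chance you will like the book.
# Assumed likelihoods (not from the text): if you will like the book, you
# enjoy the first 10 pages 90% of the time; if you won't, only 30% of the time.
posterior = bayes_update(0.75, 0.9, 0.3)
print(round(posterior, 2))  # 0.9
```

Enjoying the sample pages lifts your 75 percent hunch to 90 percent; disliking them would, by the same formula, pull it sharply down.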

Mr. Silver illustrates his dos and don’ts through a series of interesting essays that examine how predictions are made in fields including chess, baseball, weather forecasting, earthquake analysis and politics. A chapter on poker reveals a strange world in which a small number of inept but big-spending “fish” feed a much larger community of highly skilled sharks competing to make their living off the fish; a chapter on global warming is one of the most objective and honest analyses I’ve seen. (Mr. Silver concludes that the greenhouse effect almost certainly exists and will be exacerbated by man-made CO2 emissions.)

So with all this going for the book, as my mother would say, what’s not to like?

The main problem emerges immediately, in the introduction, where I found my innately Bayesian brain wondering: Where is this going? The same question came to mind in later essays: I wondered how what I was reading related to the larger thesis. At times Mr. Silver reports in depth on a topic of lesser importance, or he skates over an important topic only to return to it in a later chapter, where it is again discussed only briefly.

As a result, I found myself losing the signal for the noise. Fortunately, you will not be tested on whether you have properly grasped the signal, and even the noise makes for a good read.

Leonard Mlodinow is the author of “Subliminal: How Your Unconscious Mind Rules Your Behavior” and “The Drunkard’s Walk: How Randomness Rules Our Lives.”

Italy convicts seven scientists for failing to predict earthquake (Folha de São Paulo)

JC e-mail 4609, 23 October 2012.

In 2009, the earthquake in L’Aquila killed more than 300 people and left about 65,000 homeless. The court holds that the experts were negligent.

An Italian court yesterday (22nd) sentenced seven scientists to six years in prison for failing to predict the earthquake that struck the city of L’Aquila, in the Abruzzo region, in 2009. More than 300 people died.

All of the scientists, who will remain free pending appeal, were members of the National Commission for the Forecast and Prevention of Major Risks. They were accused of negligence for failing to correctly assess the likelihood of the earthquake and thereby alert the authorities.

Among the seven convicted are leading names in Italian science, such as Professor Enzo Boschi, who headed the National Institute of Geophysics and Volcanology, and the deputy head of the Civil Protection Agency, Bernardo De Bernardinis.

Scientists from around the world protested against the court’s decision to convict them of involuntary manslaughter (homicide without intent to kill). In protest, a letter bearing more than 5,000 scientists’ signatures was delivered to Italian President Giorgio Napolitano, arguing that science has no means of predicting earthquakes and that the case could deter experts from advising governments on seismic risk in the future.

Unpredictable – According to Célia Fernandes, a seismology technician at the University of São Paulo’s Institute of Astronomy, Geophysics and Atmospheric Sciences (IAG-USP), it is very difficult to identify the exact moment at which a seismic event will occur. “All seismology professionals work toward predicting earthquakes, but nature follows no rule. Even the recurrence of tremors is no guarantee that a major earthquake is about to happen,” she says.

The scientists met in the city of L’Aquila on 31 March 2009, six days before the earthquake, and gave no warning of the chance of a seismic event. In the court’s view, they failed by underestimating the risks, limiting the response of the public authorities, who did not have enough time to take the measures needed to protect the population.

According to prosecutors, a series of low-level tremors hit the region in the months before the earthquake, and the experts should have interpreted this as a sign of what was to come.

The magnitude 6.3 earthquake struck L’Aquila in April 2009. Besides the deaths, it injured another 1,500 people. An estimated 65,000 people were left homeless. The scientists’ conviction is not yet final; they are expected to appeal.

*   *   *

Artigos:

David Alexander. An evaluation of medium-term recovery processes after the 6 April 2009 earthquake in L’Aquila, Central Italy. Environmental Hazards, iFirst.

Abstract

This article uses the earthquake of 6 April 2009 at L’Aquila, central Italy (magnitude 6.3) as a case history of processes of recovery from disaster. These are evaluated according to criteria linked to both vulnerability analysis and disaster risk-reduction processes. The short- and medium-term responses to the disaster are evaluated, and 11 criticisms are made of the Italian Government’s policy on transitional shelter, which has led to isolation, social fragmentation and deprivation of services. Government policy on disaster risk is further evaluated in the light of the UNISDR Hyogo Framework for Action. Lack of governance and democratic participation is evident in the response to disasters. It is concluded that without an adequately planned strategy for managing the long-term recovery process, events such as the L’Aquila earthquake open up Pandora’s box of unwelcome consequences, including economic stagnation, stalled reconstruction, alienation of the local population, fiscal deprivation and corruption. Such phenomena tend to perpetuate rather than reduce vulnerability to disasters.

“[…] science and scientists were not on trial. The hypothesis of culpability being tested in the courts referred to the failure to adopt a precautionary approach in the face of clear indications of impending seismic impact, not failure to predict an earthquake, and this is amply documented in official records”.

David E. Alexander. The L’Aquila Earthquake of 6 April 2009 and Italian Government Policy on Disaster Response. Journal of Natural Resources Policy Research, Vol. 2, Iss. 4, 2010

Abstract

This paper describes the impact of the earthquake that struck the central Italian city of L’Aquila on 6 April 2009, killing 308 people and leaving 67 500 homeless. The pre-impact, emergency, and early recovery phases are discussed in terms of the nature and effectiveness of government policy. Disaster risk reduction (DRR) in Italy is evaluated in relation to the structure of civil protection and changes wrought by both the L’Aquila disaster and public scandals connected with the misappropriation of funds. Six of the most important lessons are derived from this analysis and related to DRR needs both in Italy and elsewhere in the world.

“As articulated at the meeting of the Commission on Major Risks on 31 March 2009, the Italian Government’s position was unequivocal: there was no cause for alarm. This attitude permeated its way down the ranks of the civil protection system. Then, at 00:30 hrs on Monday 6 April 2009, a tremor that was larger than usual shook L’Aquila. Residents rushed out of their houses in alarm. The strategy adopted by civil protection authorities was to tour the streets with loudspeakers advising people to calm down and return home. In the town of Pagánica, less than 10 km northeast of L’Aquila, residents did exactly that: in the ensuing main shock three hours later, eight of them died and 40 were seriously injured. In L’Aquila city I investigated one case in which a young lady had decided to remain out of doors after the foreshock, while her parents returned home. Their bodies were recovered by firemen from a space barely 15 cm wide into which the building had compressed as it collapsed”.

L’Aquila quake: Italy scientists guilty of manslaughter (BBC)

22 October 2012

The BBC’s Alan Johnston in Rome says the prosecution argued that the scientists were “just too reassuring”

Six Italian scientists and an ex-government official have been sentenced to six years in prison over the 2009 deadly earthquake in L’Aquila.

A regional court found them guilty of multiple manslaughter.

Prosecutors said the defendants gave a falsely reassuring statement before the quake, while the defence maintained there was no way to predict major quakes.

The 6.3 magnitude quake devastated the city and killed 309 people.

Many smaller tremors had rattled the area in the months before the quake that destroyed much of the historic centre.

It took Judge Marco Billi slightly more than four hours to reach the verdict in the trial, which had begun in September 2011.

Lawyers have said that they will appeal against the sentence. As convictions are not definitive until after at least one level of appeal in Italy, it is unlikely any of the defendants will immediately face prison.

‘Alarming’ case

The seven – all members of the National Commission for the Forecast and Prevention of Major Risks – were accused of having provided “inaccurate, incomplete and contradictory” information about the danger of the tremors felt ahead of the 6 April 2009 quake, Italian media report.

In addition to their sentences, all have been barred from ever holding public office again, La Repubblica reports.

In the closing statement, the prosecution quoted one of its witnesses, whose father died in the earthquake.

It described how Guido Fioravanti had called his mother at about 11:00 on the night of the earthquake – straight after the first tremor.

“I remember the fear in her voice. On other occasions they would have fled but that night, with my father, they repeated to themselves what the risk commission had said. And they stayed.”

‘Hasty sentence’

The judge also ordered the defendants to pay court costs and damages.

Reacting to the verdict against him, Bernardo De Bernardinis said: “I believe myself to be innocent before God and men.”

“My life from tomorrow will change,” the former vice-president of the Civil Protection Agency’s technical department said, according to La Repubblica.

“But, if I am judged by all stages of the judicial process to be guilty, I will accept my responsibility.”

Another, Enzo Boschi, described himself as “dejected” and “desperate” after the verdict was read.

“I thought I would have been acquitted. I still don’t understand what I was convicted of.”

One of the lawyers for the defence, Marcello Petrelli, described the sentences as “hasty” and “incomprehensible”.

‘Inherently unpredictable’

The case has alarmed many in the scientific community, who feel science itself has been put on trial.

Some scientists have warned that the case might set a damaging precedent, deterring experts from sharing their knowledge with the public for fear of being targeted in lawsuits, the BBC’s Alan Johnston in Rome reports.

Among those convicted were some of Italy’s most prominent and internationally respected seismologists and geological experts.

Earlier, more than 5,000 scientists signed an open letter to Italian President Giorgio Napolitano in support of the group in the dock.

After the verdict was announced, David Rothery, of the UK’s Open University, said earthquakes were “inherently unpredictable”.

“The best estimate at the time was that the low-level seismicity was not likely to herald a bigger quake, but there are no certainties in this game,” he said.

Malcolm Sperrin, director of medical physics at the UK’s Royal Berkshire Hospital said that the sentence was surprising and could set a worrying precedent.

“If the scientific community is to be penalised for making predictions that turn out to be incorrect, or for not accurately predicting an event that subsequently occurs, then scientific endeavour will be restricted to certainties only and the benefits that are associated with findings from medicine to physics will be stalled.”

Analysis

by Jonathan Amos – Science correspondent

The Apennines, the belt of mountains that runs down through the centre of Italy, is riddled with faults, and the “Eagle” city of L’Aquila has been hammered time and time again by earthquakes. Its glorious old buildings have had to be patched up and re-built on numerous occasions.

Sadly, the issue is not “if” but “when” the next tremor will occur in L’Aquila. But it is simply not possible to be precise about the timing of future events. Science does not possess that power. The best it can do is talk in terms of risk and of probabilities, the likelihood that an event of a certain magnitude might occur at some point in the future.

The decision to prosecute some of Italy’s leading geophysicists drew condemnation from around the world. The scholarly bodies said it had been beyond anyone to predict exactly what would happen in L’Aquila on 6 April 2009.

But the authorities who pursued the seven defendants stressed that the case was never about the power of prediction – it was about what was interpreted to be an inadequate characterisation of the risks; of being misleadingly reassuring about the dangers that faced their city.

Nonetheless, the verdicts will come as a shock to all researchers in Italy whose expertise lies in the field of assessing natural hazards. Their pronouncements will be scrutinised as never before, and their fear will be that they too could find themselves embroiled in legal action over statements that are inherently uncertain.

THOSE CONVICTED

Franco Barberi, head of Serious Risks Commission

Enzo Boschi, former president of the National Institute of Geophysics

Giulio Selvaggi, director of National Earthquake Centre

Gian Michele Calvi, director of European Centre for Earthquake Engineering

Claudio Eva, physicist

Mauro Dolce, director of the Civil Protection Agency’s earthquake risk office

Bernardo De Bernardinis, former vice-president of Civil Protection Agency’s technical department

 

*   *   *

Scientists in the dock over L’Aquila earthquake

By Susan Watts 

BBC Newsnight Science editor

20 September 2011

Next week six scientists and an official go on trial in Italy for manslaughter over the earthquake in L’Aquila that killed 309 people two years ago.

This extraordinary case has attracted international attention because science itself seemed to be on trial, with the seven defendants apparently charged for failing to predict the magnitude 6.3 earthquake that struck on the night of 6 April 2009.

Scientists cannot yet say when an earthquake is going to happen with any precision, even in a seismically active zone. And over 5,000 scientists from around the world have signed a letter supporting those on trial.

Quake-damaged buildings in Onna. The earthquake was felt throughout central Italy.

Yet the lawyer for one of the scientists, Marcello Milandri, said in an interview with Newsnight that it is possible his client will be convicted: “I’m afraid that, like an earthquake, nothing in this case is predictable. Let’s not forget, this trial is happening in L’Aquila, where the entire population has been personally affected, and awaiting a sentence that should not happen, but could happen.”

Seismologists can assess only the probability that a quake may happen, and then with a large degree of uncertainty about its properties.

In some circumstances, they may be able to say that the likelihood of an event has gone up, to help authorities prepare for an emergency, perhaps by concentrating on particularly vulnerable buildings or sectors of the population, such as school-children.

Weighing the risks

The signatories to the letter say the authorities should focus on earthquake protection, instead of pursuing scientists in what some feel is a Galileo-style inquisition.

Newsnight went to L’Aquila to find out why this case has come about.

The prosecution team said they never intended to put science on trial, that they know it is not possible to predict an earthquake.

What they are questioning is whether the six scientists and the official on trial, who together constitute Italy’s Commission of Grand Risks, did their jobs properly.

That is, did they weigh up all the risks, and communicate these clearly to the authorities seeking their advice?

The local investigator, Inspector Lorenzo Cavallo, said: “The Commission calmed the local population down following a number of earth tremors. After the quake, we heard people’s accounts and they told us they changed their behaviour following the advice of the commission.

“It is our duty to investigate what has been said in each case and pass it on to the legal authority.”

Radon gas claims

A local journalist, Giustino Parisse, who lived in Onna, a small hamlet outside L’Aquila at the time, is one of those bringing the case.

In the weeks leading up to the major quake there had been a series of tremors. On the night of 5 April, several large shocks kept his children awake.

They were anxious, but he told them to go back to bed, that there was no need to worry, the scientists had said so.

Rescuers carrying a body. The quake was the deadliest to hit Italy since 1980.

His 16-year-old daughter and 17-year-old son both died in the earthquake that night, along with his father, when the family home collapsed.

He told Newsnight that people had been becoming increasingly anxious, in part because of warnings from a local nuclear scientist, Giampaolo Giuliani, that raised levels of radon gas in the area suggested to him an earthquake might be imminent.

How valuable this is as an indicator is widely disputed, and most experts in this field believe it is unreliable.

At the time the head of Italy’s civil protection agency, Guido Bertolaso, took the unusual step of asking his Commission of Grand Risks to fly to L’Aquila to discuss the situation.

They held a meeting that lasted only an hour or so, then the official now on trial, Bernardo De Bernardinis, who was then deputy director of the civil protection department, held a hurried press briefing, in reassuring tones.

Two of those on trial are linked to Italy’s National Institute of Geophysics and Volcanology (INGV).

The institute’s head of public affairs, Pasquale de Santis, told Newsnight that the trial is a distraction, that seismologists have been saying since 1998 that this is a high risk area, and that people should instead be focussing on those who failed properly to enforce building codes in L’Aquila.

Funding needed

We put this to the mayor of L’Aquila, Massimo Cialente. He hopes the trial will prompt a national debate, and make it easier for him to raise the funds and support he needs to protect people against future earthquakes.

He said six days before the major quake he moved local children from a school damaged in an earlier tremor. He said he had no official budget to do that, because prevention is not a national priority.

“We closed the school and we had to transfer 500 pupils. I needed money, but I started the work without the money. If the quake did not happen I would be charged for that.”

Those bringing the case say the people of L’Aquila have a right to know what happened. Many hope the trial will bring some peace of mind.

But some of those who signed the letter of support told Newsnight they fear the case will dissuade scientists from leaving their labs to engage with politicians and the public.

John McCloskey, professor of geophysics at Ulster University, said these scientists have spent their lives producing some of the most sophisticated seismic maps in the world.

He said it is an “outrage” that they are now on trial for manslaughter, adding that he signed the letter because “their peril is our peril”.

*   *   *

Can we predict when and where quakes will strike?

By Leila Battison – Science reporter

20 September 2011

Seismologists try to manage the risk of building damage and loss of life.

This week, six seismologists go on trial for the manslaughter of 309 people, who died as a result of the 2009 earthquake in L’Aquila, Italy.

The prosecution holds that the scientists should have advised the population of L’Aquila of the impending earthquake risk.

But is it possible to pinpoint the time and location of an earthquake with enough accuracy to guide an effective evacuation?

There are continuing calls for seismologists to predict where and when a large earthquake will occur, to allow complete evacuation of threatened areas.

Predicting an earthquake with this level of precision is extremely difficult, because of the variation in geology and other factors that are unique to each location.

Attempts have been made, however, to look for signals that indicate a large earthquake is about to happen, with variable success.

Historically, animals have been thought to be able to sense impending earthquakes.

Noticeably erratic behaviour of pets, and mass movement of wild animals like rats, snakes and toads have been observed prior to several large earthquakes in the past.

Following the l’Aquila quake, researchers published a study in the Journal of Zoology documenting the unusual movement of toads away from their breeding colony.

But scientists have been unable to use this anecdotal evidence to predict events.

The behaviour of animals is affected by too many factors, including hunger, territory and weather, and so their erratic movements can only be attributed to earthquakes in hindsight.

Precursor events

When a large amount of stress is built up in the Earth’s crust, it will mostly be released in a single large earthquake, but some smaller-scale cracking in the build-up to the break will result in precursor earthquakes.

These small quakes precede around half of all large earthquakes, and can continue for days to months before the big break.

Some scientists have even gone so far as to try to predict the location of the large earthquake by mapping the small tremors.

The “Mogi Doughnut Hypothesis” suggests that a circular pattern of small precursor quakes will precede a large earthquake emanating from the centre of that circle.

While half of the large earthquakes have precursor tremors, only around 5% of small earthquakes are associated with a large quake.

So even if small tremors are felt, this cannot be a reliable prediction that a large, devastating earthquake will follow.
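The base-rate problem in those figures can be made explicit with a rough count. The 5% figure is from the text; the sample of 1,000 tremors is just an illustrative scale:

```python
# Figures quoted above: around half of large quakes have precursor tremors,
# but only ~5% of small tremors are followed by a large quake. Warning on
# every tremor therefore produces overwhelmingly false alarms.
tremors_observed = 1000                 # illustrative sample size (assumed)
p_large_given_tremor = 0.05             # from the text
true_warnings = round(tremors_observed * p_large_given_tremor)
false_alarms = tremors_observed - true_warnings
print(true_warnings, false_alarms)  # 50 950
```

Nineteen out of every twenty tremor-based warnings would be false alarms, which is why the precursor signal alone cannot drive evacuations.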

“There is no scientific basis for making a prediction”, said Dr Richard Walker of the University of Oxford.

In several cases, increased levels of radon gas have been observed in association with rock cracking that causes earthquakes.

Leaning buildingSmall ground movements sometimes precede a large quake

Radon is a natural and relatively harmless gas in the Earth’s crust that is released and dissolves into groundwater when the rock breaks.

Similarly, when rock cracks, it can create new spaces in the crust, into which groundwater can flow.

Measurements of groundwater levels around earthquake-prone areas show sudden changes in the level of the water table as a result of this invisible cracking.

Unfortunately for earthquake prediction, both the radon emissions and water level changes can occur before, during, or after an earthquake, or not at all, depending on the particular stresses a rock is put under.

Advance warning systems

The minute changes in the movement, tilt, and the water, gas and chemical content of the ground associated with earthquake activity can be monitored on a long term scale.

Measuring devices have been integrated into early warning systems that can trigger an alarm when a certain amount of activity is recorded.

Such early warning systems have been installed in Japan, Mexico and Taiwan, where the population density and high earthquake risk pose a huge threat to people’s lives.

But because of the nature of all of these precursor reactions, the systems may only be able to provide up to 30 seconds’ advance warning.

“In the history of earthquake study, only one prediction has been successful”, explains Dr Walker.

The magnitude 7.3 earthquake that struck Haicheng, northern China, in 1975 was predicted one day in advance, allowing authorities to order the evacuation of the city and saving many lives.

But the pattern of seismic activity on which that prediction was based has not produced a large earthquake since, and just a year later, in 1976, a completely unanticipated magnitude 7.8 earthquake struck nearby Tangshan, killing more than a quarter of a million people.

The “prediction” of the Haicheng quake was therefore just a lucky unrepeatable coincidence.

A major problem in the prediction of earthquake events that will require evacuation is the threat of issuing false alarms.

Scientists could warn of a large earthquake every time a potential precursor event is observed, but this would result in huge numbers of false alarms, which would strain public resources and might ultimately reduce the public’s trust in scientists.

“Earthquakes are complex natural processes with thousands of interacting factors, which makes accurate prediction of them virtually impossible,” said Dr Walker.

Seismologists agree that the best way to limit the damage and loss of life resulting from a large earthquake is to assess and manage the longer-term risks in an earthquake-prone area. These include the likelihood of buildings collapsing and the implementation of emergency plans.

“Detailed scientific research has told us that each earthquake displays almost unique characteristics; some are preceded by foreshocks or small tremors, whereas others occur without warning. There simply are no rules to utilise in order to predict earthquakes,” said Dr Dan Faulkner, senior lecturer in rock mechanics at the University of Liverpool.

“Earthquake prediction will only become possible with a detailed knowledge of the earthquake process. Even then, it may still be impossible.”

What causes an earthquake?

An earthquake is caused when rocks in the Earth’s crust fracture suddenly, releasing energy in the form of shaking and rolling, radiating out from the epicentre.

The rocks are put under stress mostly by friction during the slow, 1-10 cm per year shuffling of tectonic plates.

The release of this stress can happen at any time, either through small, frequent fractures or through rarer breaks that release much more energy, causing larger earthquakes.

It is these large earthquakes that have devastating consequences when they strike in heavily populated areas.

Attempts to limit the destruction of buildings and the loss of life mostly focus on preventative measures and well-communicated emergency plans.

*   *   *

Long-range earthquake prediction – really?

By Megan Lane – BBC News

11 May 2011


In Italy, Asia and New Zealand, long-range earthquake predictions from self-taught forecasters have recently had people on edge. But is it possible to pinpoint when a quake will strike?

It’s a quake prediction based on the movements of the moon, the sun and the planets, and made by a self-taught scientist who died in 1979.

But on 11 May 2011, many people planned to stay away from Rome, fearing a quake forecast by the late Raffaele Bendandi – even though his writings contained no geographical location, nor a day or month.

In New Zealand too, the quake predictions of a former magician who specialises in fishing weather forecasts have caused unease.

“The date is not there, nor is the place” – Paola Lagorio, of the foundation that honours Bendandi

After a 6.3 quake scored a direct hit on Christchurch in February, Ken Ring forecast another on 20 March, caused by a “moon-shot straight through the centre of the earth”. Rattled residents fled the city.

Predicting quakes is highly controversial, says Brian Baptie, head of seismology at the British Geological Survey. Many scientists believe it is impossible because of the quasi-random nature of earthquakes.

“Despite huge efforts and great advances in our understanding of earthquakes, there are no good examples of an earthquake being successfully predicted in terms of where, when and how big,” he says.

Many of the methods previously applied to earthquake prediction have been discredited, he says, adding that predictions such as that in Rome “have little basis and merely cause public alarm”.

Can animals pick up quake signals?

Seismologists do monitor rock movements around fault lines to gauge where pressure is building up, and this can provide a last-minute warning in the literal sense, says BBC science correspondent Jonathan Amos.

“In Japan and California, there are scientists looking for pre-cursor signals in rocks. It is possible to get a warning up to 30 seconds before an earthquake strikes your location. That’s enough time to get the doors open on a fire station, so the engines can get out as soon as it is over.”
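A rough way to see where a head start of tens of seconds can come from: the first-arriving P-waves travel faster than the damaging S-waves, so a station that detects the P-wave gains the difference in travel times. The sketch below uses typical textbook crustal wave speeds, not figures from the article.

```python
# Illustrative sketch: warning time from the P-wave / S-wave speed gap.
# vp_km_s and vs_km_s are assumed typical crustal velocities.
def warning_seconds(distance_km, vp_km_s=6.0, vs_km_s=3.5):
    # The S-wave arrives distance/vs seconds after the rupture; the
    # P-wave arrives distance/vp seconds after it. The gap between the
    # two is the maximum head start a P-wave detection can give.
    return distance_km / vs_km_s - distance_km / vp_km_s

# At roughly 250 km from the epicentre the gap is about half a minute.
lead_time = warning_seconds(250.0)
```

In practice the usable warning is shorter still, since detecting the P-wave and broadcasting the alert both take time.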

But any longer-range prediction is much harder.

“It’s like pouring sand on to a pile, and trying to predict which grain of sand on which side of the pile will cause it to collapse. It is a classic non-linear system, and people have been trying to model it for centuries,” says Amos.
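The sand-pile analogy Amos uses corresponds to a well-studied toy model of such non-linear systems, the Bak–Tang–Wiesenfeld sandpile. The one-dimensional sketch below is purely illustrative (it is not a seismic model); it shows the characteristic behaviour: identical grains dropped the same way trigger avalanches of wildly different, unpredictable sizes.

```python
import random

def drop_grains(n_drops, size=30, seed=0):
    # 1-D toy sandpile: a cell holding 2 or more grains topples, sending
    # one grain to each neighbour; grains falling off either end are lost.
    # Returns the avalanche size (number of topplings) for each drop.
    rng = random.Random(seed)
    pile = [0] * size
    sizes = []
    for _ in range(n_drops):
        pile[rng.randrange(size)] += 1      # drop one grain at random
        topples = 0
        unstable = True
        while unstable:
            unstable = False
            for i in range(size):
                if pile[i] >= 2:
                    pile[i] -= 2            # cell topples
                    if i > 0:
                        pile[i - 1] += 1
                    if i < size - 1:
                        pile[i + 1] += 1
                    topples += 1
                    unstable = True
        sizes.append(topples)
    return sizes

# Same-sized grains, same rule: avalanche sizes range from zero to
# system-spanning, with no way to predict the next big one.
avalanches = drop_grains(500)
```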

In Japan, all eyes are on the faults that lace its shaky islands.

On Monday, Trade and Industry Minister Banri Kaieda urged that the Hamaoka nuclear plant near a fault line south-west of Tokyo be shut down, pending the construction of new tsunami defences.

Seismologists have long warned that a major earthquake is overdue in this region.

But overdue earthquakes can be decades, if not centuries, in coming. And this makes it hard to prepare, beyond precautions such as construction standards and urging the populace to lay in emergency supplies that may never be needed.

Later this year, a satellite is due to launch to test the as-yet unproven theory that there is a link between electrical disturbances on the edge of our atmosphere and impending quakes on the ground below.

Toad warning

Then there are the hypotheses that animals may be able to sense impending earthquakes.

Last year, the Journal of Zoology published a study into a population of toads that left their breeding colony three days before a 6.3 quake struck L’Aquila, Italy, in 2009. This was highly unusual behaviour.

But it is hard to objectively and quantifiably study how animals respond to seismic activity, in part because earthquakes are rare and strike without warning.

Countries in the Pacific’s “Ring of Fire”, like New Zealand, are regularly shaken by quakes

“At the moment, we know the parts of the world where earthquakes happen and how often they happen on average in these areas,” says Dr Baptie.

This allows seismologists to make statistical estimates of probable ground movements that can be used to plan for earthquakes and mitigate their effects. “However, this is still a long way from earthquake prediction,” he says.

And what of the “prophets” who claim to predict these natural disasters?

“Many regions, such as Indonesia and Japan, experience large earthquakes on a regular basis, so vague predictions of earthquakes in these places require no great skill.”

 

Who was Raffaele Bendandi?

  • Born in 1893 in central Italy
  • In November 1923, he predicted a quake would strike on January 2, 1924
  • Two days after this date, it did, in the Italian province of Le Marche
  • Mussolini made him a Knight of the Order of the Crown of Italy
  • But he also banned Bendandi from making public predictions, on pain of exile

Life span of humans took a huge jump in past century (MSNBC)

Researchers credit environmental improvements, not genetics, for the increase

By Trevor Stokes

updated 10/15/2012 7:11:26 PM ET

Humans are living longer than ever, a life-span extension that occurred more rapidly than expected and almost solely from environmental improvements as opposed to genetics, researchers said Monday.

Four generations ago, the average Swede had the same probability of dying as a hunter-gatherer, but improvements in our living conditions through medicine, better sanitation and clean drinking water (considered “environmental” changes) decreased mortality rates to modern levels in just 100 years, researchers found.

In Japan, 72 has become the new 30, as the likelihood of a 72-year-old modern-day person dying is the same as that of a 30-year-old hunter-gatherer ancestor who lived 1.3 million years ago. Though the researchers didn’t specifically look at the United States, they say the trends are not country-specific and not based in genetics.

Quick jump in life span
The same decrease in the average probability of dying at a given age that took hunter-gatherers 1.3 million years to achieve was made in just 30 years during the 21st century.

“I pictured a more gradual transition from a hunter-gatherer mortality profile to something like we have today, rather than this big jump, most of which occurred in the last four generations. To me, that was a surprise,” lead author Oskar Burger, a postdoctoral fellow at the Max Planck Institute for Demographic Research in Germany, told LiveScience.

Biologists have lengthened life spans of worms, fruit flies and mice in labs by selectively breeding for old-age survivorship or tweaking their endocrine system, a network of glands that affects every cell in the body. However, the longevity gained in humans over the past four generations is even greater than can be created in labs, researchers concluded. [Extending Life: 7 Ways to Live Past 100]

Genetics vs. environment
In the new work, Burger and colleagues analyzed previously published mortality data from Sweden, France and Japan, from present-day hunter-gatherers and from wild chimpanzees, the closest living relatives of humans.

Humans have lived for an estimated 8,000 generations, but only in the past four have mortalities decreased to modern-day levels. Hunter-gatherers today have average life spans on par with wild chimpanzees.

The research suggests that while genetics plays a small role in shaping human mortality, the key to driving up our collective life span lies in the advent of medical technologies, improved nutrition, higher education, better housing and several other improvements to overall standards of living.

“This recent progress has been just astronomically fast compared to what we made since the split from chimpanzees,” Burger said.

The bulk of the decrease in mortality comes in youth: by age 15, hunter-gatherers are more than 100 times as likely to have died as modern-day people.

What’s next?
“In terms of what’s going on in the next four generations, I want to be very clear that I don’t make any forecasts,” Burger said. “We’re in a period of transition and we don’t know what the new stable point will be.”

However, some researchers say that humans may have maxed out their old age.

“These mortality curves (that show the probability of dying by a certain age), they are now currently at their lowest possible value, which makes a very strong prediction that life span cannot increase much more,” Caleb Finch, a neurogerontology professor at the University of Southern California who studies the biological mechanisms of aging, told LiveScience in an email.

Further, Finch, who was not involved in the current study, argues that environmental degradation, including climate change and ozone pollution, combined with increased obesity “are working to throw us back to an earlier phase of our improvements, they’re regressive.”

“It’s impossible to make any reasonable predictions, but you can look, for example, in local environments in Los Angeles where the density of particles in the air predict the rate of heart disease and cancer,” Finch said, illustrating the link between the environment and health.

The study was detailed Monday in the journal Proceedings of the National Academy of Sciences.

NIH Decision Signals the Beginning of the End for Medical Research on Chimps (Wired)

By Brandon Keim

September 21, 2012

Henry, one of the chimps at Chimp Haven. Image: Steven Snodgrass/Flickr

With the retirement of 110 government-owned chimpanzees, the end of medical research on man’s closest living relative may be near.

Today, the National Institutes of Health announced that all of its chimps now living at the New Iberia Research Center would be permanently removed from the research population.

Long criticized by animal advocates for mistreating animals and illegally breeding chimps, New Iberia operates the largest research chimp colony in the United States and is a bastion of a practice abandoned in every other country.

“This is a major message from the NIH: that this era is coming to an end,” said John Pippin of the Physicians Committee for Responsible Medicine, an animal advocacy group. “This is huge.”

In December of last year, an expert panel convened by the Institute of Medicine, the nation’s medical science advisers, declared that medical research on chimpanzees was ethically problematic and, in most cases, scientifically unnecessary. The NIH announced a moratorium on new chimp research funding and agreed to review the status of its own animals. After years of fighting for an end to medical research on chimps, whose ability to think, feel and suffer is not far removed from our own, animal advocates greeted that news with cautious relief. The NIH’s intentions sounded good, but what they’d actually do remained to be seen.

With the decision to retire 110 chimps at New Iberia, the NIH leaves little doubt of its plans. “This is a significant step in winding down NIH’s investment in chimpanzee research based on the way science has evolved and our great sensitivity to the special nature of these remarkable animals, our closest relatives,” said NIH director Francis Collins to the Washington Post.

‘They do not have scientific or ethical justification to continue.’
Excluding the retired chimpanzees, the NIH still owns an estimated 475 chimps eligible for research. Another 500 or so are owned by pharmaceutical companies. The NIH’s decisions influence their fate as well, said Pippin.

“With this indication that the NIH is going to get out of chimp research, that’s going to drop the bottom out of the whole chimpanzee research enterprise,” Pippin said. “How are you going to justify your research in light of what the IOM and NIH have said? Even those not directly affected by this prohibition are going to give up. They do not have scientific or ethical justification to continue.”

Kathleen Conlee, animal research director with the Humane Society of the United States, was more measured in her response.

“They’re taking a step in the right direction by deeming these chimps ineligible for research,” she said. “But we’d rather see them go to sanctuary.” She noted that while 10 of the New Iberia retirees will be sent to the Chimp Haven sanctuary, the rest will go to the Texas Biomedical Institute’s Southwest National Primate Research Center.

Though the newly retired chimps won’t be used again in medical research, that type of research still occurs at Southwest. Indeed, it was an attempt to send retired chimps back into research at Southwest that sparked the controversy that led to the IOM report and NIH review.

“Places like Southwest were built to be research labs. We’d urge the chimps to be sent somewhere where the mission is the well-being of chimps,” Conlee said. According to Conlee, housing animals at Chimp Haven costs the government $40 per day, compared to $60 per day at research laboratories.

Conlee said that some companies, including Abbott Labs and Idenix, have agreed to follow the IOM guidelines for chimp research or abandon it altogether. Others, including GlaxoSmithKline, have already given up.

Rather than relying on corporate goodwill, however, both Conlee and Pippin urged people to support the Great Ape Protection and Cost Savings Act. Now under Congressional consideration, the bill would end medical research on chimps.

UN wants to ensure global temperature rises no more than 2°C (Globo Natureza)

JC e-mail 4582, September 13, 2012

United Nations (UN) climate negotiations must keep pushing for more ambitious action to ensure that global warming does not exceed 2 degrees, a European Union negotiator said this week, a month after the US was accused of backsliding on the target.

Nearly 200 countries agreed in 2010 to limit the rise in temperatures to below 2 degrees Celsius above the pre-industrial era, in order to avoid dangerous impacts of climate change such as floods, droughts and rising sea levels.

To slow the pace of global warming, the UN climate talks in South Africa agreed to develop a legally binding climate agreement by 2015, which could enter into force by 2020 at the latest.

However, experts warn that the chance of limiting the rise in global temperature to less than 2 degrees is growing ever smaller as emissions of greenhouse gases from the burning of fossil fuels increase.

“It is very clear that we must press the point in the negotiations that the 2-degree target is not enough. The reason we are not doing enough is the political situation in some parts of the world,” said Peter Betts, Britain’s director of international climate change and a senior EU negotiator, to a climate change group in the British Parliament.

Last week, scientists and diplomats met in Bangkok for a session of the UN Framework Convention on Climate Change (UNFCCC), the last before the annual conference to be held between November and December in Doha, Qatar.

Flexibility on targets – Last month, the US was criticized for saying it supported a more flexible approach to a new climate agreement – one that would not necessarily keep the 2-degree limit – but later added that flexibility would give the world a better chance of reaching a new deal.

Several countries, including some of the most vulnerable to climate change, say the 2-degree limit is not enough and that a 1.5-degree limit would be safer. Emissions of the main greenhouse gas, carbon dioxide, rose 3.1% in 2011, a record high. China was the world’s largest emitter, followed by the US.

Negotiations to create a new global climate agreement, along the lines of Kyoto, have already begun. The last climate conference approved a series of measures setting targets for developed and developing countries.

The document, called the “Durban Platform for Enhanced Action”, lays out a series of measures to be implemented, but in practice it contains no effective, urgent measures to curb rising pollution levels across the planet over the next eight years.

Obligations for all in the future – It provides for the creation of a global climate agreement that will encompass all UNFCCC member countries and replace the Kyoto Protocol. Countries will draw up “a protocol, another legal instrument or an agreed outcome with legal force” to combat climate change.

This means that emissions-reduction targets will be set for all nations, including the United States and China, which would not accept any negotiation unless both parties were included in the reduction obligations.

The outline of this new plan will begin to take shape at the next UN negotiations, including COP 18, to be held in Qatar in 2012. The document states that a working group will be created and should complete the new plan in 2015.

The pollution-control measures are only due to be implemented by countries from 2020, the deadline set in the Durban Platform, and should take into account the recommendations of the report of the Intergovernmental Panel on Climate Change (IPCC), to be released between 2014 and 2015.

In 2007, the body released a document pointing to an average global temperature increase of between 1.8°C and 4.0°C by 2100, with the possibility of a rise of up to 6.4°C if population and the economy keep growing rapidly and the intense consumption of fossil fuels continues.

However, the most reliable estimate speaks of an average increase of 3°C, assuming that carbon dioxide levels stabilize at 45% above the current level. It also finds, with more than 90% confidence, that most of the temperature increase observed over the past 50 years was caused by human activities.

Evolution could explain the placebo effect (New Scientist)

06 September 2012 by Colin Barras

Magazine issue 2881

ON THE face of it, the placebo effect makes no sense. Someone suffering from a low-level infection will recover just as nicely whether they take an active drug or a simple sugar pill. This suggests people are able to heal themselves unaided – so why wait for a sugar pill to prompt recovery?

New evidence from a computer model offers a possible evolutionary explanation, and suggests that the immune system has an on-off switch controlled by the mind.

It all starts with the observation that something similar to the placebo effect occurs in many animals, says Peter Trimmer, a biologist at the University of Bristol, UK. For instance, Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.

Likewise, people who think they are taking a drug but are really receiving a placebo can have a response twice that of those who receive no pills (Annals of Family Medicine, doi.org/cckm8b). In both Siberian hamsters and people, the intervention creates a mental cue that kick-starts the immune response.

There is a simple explanation, says Trimmer: the immune system is costly to run – so costly that a strong and sustained response could dangerously drain an animal’s energy reserves. In other words, as long as the infection is not lethal, it pays to wait for a sign that fighting it will not endanger the animal in other ways.

Nicholas Humphrey, a retired psychologist formerly at the London School of Economics, first proposed this idea a decade ago, but only now has evidence to support it emerged from a computer model designed by Trimmer and his colleagues.

According to Humphrey’s picture, the Siberian hamster subconsciously acts on a cue that it is summer because food supplies to sustain an immune response are plentiful at that time of year. We subconsciously respond to treatment – even a sham one – because it comes with assurances that it will weaken the infection, allowing our immune response to succeed rapidly without straining the body’s resources.

Trimmer’s simulation is built on this assumption – that animals need to spend vital resources on fighting low-level infections. The model revealed that, in challenging environments, animals lived longer and sired more offspring if they endured infections without mounting an immune response. In more favourable environments, it was best for animals to mount an immune response and return to health as quickly as possible (Evolution and Human Behavior, doi.org/h8p). The results show a clear evolutionary benefit to switching the immune system on and off depending on environmental conditions.
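The trade-off in Trimmer’s simulation can be caricatured in a few lines of code. The sketch below is not the published model; every number in it (infection rate, immune cost, reproduction threshold) is an invented illustration of the core idea that a costly immune response pays off only when energy income is high.

```python
import random

def lifetime_offspring(energy_income, mount_response, n_agents=2000,
                       t_max=200, seed=7):
    # Toy agent model (all parameters are illustrative assumptions).
    # Agents gather energy each step, catch low-level infections, and
    # either pay a heavy energy cost to clear them or tolerate them at
    # a fecundity cost. Returns mean offspring per agent.
    rng = random.Random(seed)
    total = 0
    for _ in range(n_agents):
        energy, infected = 5.0, False
        for _ in range(t_max):
            energy += energy_income                # food gathered this step
            if not infected and rng.random() < 0.05:
                infected = True                    # low-level infection arrives
            if infected and mount_response:
                energy -= 3.0                      # immune response is costly...
                if rng.random() < 0.5:
                    infected = False               # ...but can clear the infection
            if energy <= 0:
                break                              # reserves exhausted: death
            if energy > 8.0:                       # surplus energy -> offspring,
                energy -= 3.0                      # produced at half the rate
                if not infected or rng.random() < 0.5:   # while infected
                    total += 1
    return total / n_agents

# In a harsh environment the immune cost can be fatal, so tolerating the
# infection leaves more offspring; in a rich one, responding pays off.
harsh = (lifetime_offspring(0.5, False), lifetime_offspring(0.5, True))
rich = (lifetime_offspring(2.5, False), lifetime_offspring(2.5, True))
```

With these made-up numbers the simulation reproduces the paper’s qualitative pattern: switching the response off is the better strategy only when resources are scarce.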

“I’m pleased to see that my theory stands up to computational modelling,” says Humphrey. If the idea is right, he adds, it means we have misunderstood the nature of placebos. Farming and other innovations in the past 10,000 years mean that many people have a stable food supply and can safely mount a full immune response at any time – but our subconscious switch has not yet adapted to this. A placebo tricks the mind into thinking it is an ideal time to switch on an immune response, says Humphrey.

Paul Enck at the University of Tübingen in Germany says it is an intriguing idea, but points out that there are many different placebo responses, depending on the disease. It is unlikely that a single mechanism explains them all, he says.

People Merge Supernatural and Scientific Beliefs When Reasoning With the Unknown, Study Shows (Science Daily)

ScienceDaily (Aug. 30, 2012) — Reliance on supernatural explanations for major life events, such as death and illness, often increases rather than declines with age, according to a new psychology study from The University of Texas at Austin.

(Credit: © Nikki Zalewski / Fotolia)

The study, published in the June issue of Child Development, offers new insight into developmental learning.

“As children assimilate cultural concepts into their intuitive belief systems — from God to atoms to evolution — they engage in coexistence thinking,” said Cristine Legare, assistant professor of psychology and lead author of the study. “When they merge supernatural and scientific explanations, they integrate them in a variety of predictable and universal ways.”

Legare and her colleagues reviewed more than 30 studies on how people (ages 5-75) from various countries reason with three major existential questions: the origin of life, illness and death. They also conducted a study with 366 respondents in South Africa, where biomedical and traditional healing practices are both widely available.

As part of the study, Legare presented the respondents with a variety of stories about people who had AIDS. They were then asked to endorse or reject several biological and supernatural explanations for why the characters in the stories contracted the virus.

According to the findings, participants of all age groups agreed with biological explanations for at least one event. Yet supernatural explanations such as witchcraft were also frequently supported among children (ages 5 and up) and universally among adults.

Among the adult participants, only 26 percent believed the illness could be caused by either biology or witchcraft alone. And 38 percent merged biological and supernatural explanations into one theory. For example: “Witchcraft, which is mixed with evil spirits, and unprotected sex caused AIDS.” However, 57 percent combined witchcraft and biological explanations into a causal chain. For example: “A witch can put an HIV-infected person in your path.”

Legare said the findings contradict the common assumption that supernatural beliefs dissipate with age and knowledge.

“The findings show supernatural explanations for topics of core concern to humans are pervasive across cultures,” Legare said. “If anything, in both industrialized and developing countries, supernatural explanations are frequently endorsed more often among adults than younger children.”

The results provide evidence that reasoning about supernatural phenomena is a fundamental and enduring aspect of human thinking, Legare said.

“The standard assumption that scientific and religious explanations compete should be re-evaluated in light of substantial psychological evidence,” Legare said. “The data, which spans diverse cultural contexts across the lifespan, shows supernatural reasoning is not necessarily replaced with scientific explanations following gains in knowledge, education or technology.”

Journal Reference:

  1. Cristine H. Legare, E. Margaret Evans, Karl S. Rosengren, Paul L. Harris. The Coexistence of Natural and Supernatural Explanations Across Cultures and Development. Child Development, 2012; 83 (3): 779. DOI: 10.1111/j.1467-8624.2012.01743.x

Gene That Predicts Happiness in Women Discovered (Science Daily)

ScienceDaily (Aug. 28, 2012) — A new study has found a gene that appears to make women happy, but it doesn’t work for men. The finding may help explain why women are often happier than men, the research team said.

(Credit: © Yuri Arcurs / Fotolia)

Scientists at the University of South Florida (USF), the National Institutes of Health (NIH), Columbia University and the New York State Psychiatric Institute reported that the low-expression form of the gene monoamine oxidase A (MAOA) is associated with higher self-reported happiness in women. No such association was found in men.

The findings appear online in the journal Progress in Neuro-Psychopharmacology & Biological Psychiatry.

“This is the first happiness gene for women,” said lead author Henian Chen, MD, PhD, associate professor in the Department of Epidemiology and Biostatistics, USF College of Public Health.

“I was surprised by the result, because low expression of MAOA has been related to some negative outcomes like alcoholism, aggressiveness and antisocial behavior,” said Chen, who directs the Biostatistics Core at the USF Health Morsani College of Medicine’s Clinical and Translational Sciences Institute. “It’s even called the warrior gene by some scientists, but, at least for women, our study points to a brighter side of this gene.”

While they experience higher rates of mood and anxiety disorders, women tend to report greater overall life happiness than do men. The reason for this remains unclear, Chen said. “This new finding may help us to explain the gender difference and provide more insight into the link between specific genes and human happiness.”

The MAOA gene regulates the activity of an enzyme that breaks down serotonin, dopamine and other neurotransmitters in the brain, the same “feel-good” chemicals targeted by many antidepressants. The low-expression version of the MAOA gene leads to higher monoamine levels, which allows larger amounts of these neurotransmitters to stay in the brain and boost mood.

The researchers analyzed data from a population-based sample of 345 individuals — 193 women and 152 men — participating in Children in the Community, a longitudinal mental health study. The DNA of study subjects had been analyzed for MAOA gene variation and their self-reported happiness was scored by a widely used and validated scale.

After controlling for various factors, ranging from age and education to income, the researchers found that women with the low-expression type of MAOA were significantly happier than others. Compared to women with no copies of the low-expression version of the MAOA gene, women with one copy scored higher on the happiness scale and those with two copies increased their score even more.

While a substantial number of men carried a copy of the “happy” version of the MAOA gene, they reported no more happiness than those without it.

So, why the genetic gender gap in feeling good?

The researchers suspect the difference may be explained in part by the hormone testosterone, found in much smaller amounts in women than in men. Chen and his co-authors suggest that testosterone may cancel out the positive effect of MAOA on happiness in men.

The potential benefit of MAOA in boys could wane as testosterone levels rise with puberty, Chen said. “Maybe men are happier before adolescence because their testosterone levels are lower.”

Chen emphasizes that more research is needed to identify which specific genes influence resilience and subjective well-being, especially since studies of twins estimate genetic factors account for 35 to 50 percent of the variance in human happiness.

While happiness is not determined by a single gene, there is likely a set of genes that, along with life experiences, shape our individual happiness levels, Chen said. “I think the time is right for more genetic studies that focus on well-being and happiness.”

“Certainly it could be argued that how well-being is enhanced deserves at least as much attention as how (mental) disorders arise; however, such knowledge remains limited.”

The study by Chen and colleagues was supported by the National Institutes of Health and a USF proposal enhancement grant.

Journal Reference:

  1. Henian Chen, Daniel S. Pine, Monique Ernst, Elena Gorodetsky, Stephanie Kasen, Kathy Gordon, David Goldman, Patricia Cohen. The MAOA gene predicts happiness in women. Progress in Neuro-Psychopharmacology and Biological Psychiatry, 2012. DOI: 10.1016/j.pnpbp.2012.07.018

Skeptical Uses of ‘Religion’ in Debate on Climate Change (The Yale Forum on Climate Change & The Media)

Michael Svoboda   August 27, 2012

‘Religion’ and religion-inspired terms (savior, prophet, priests, heretic, dogma, crusade) are regularly used in efforts to influence public attitudes about climate change. But how does this language work, and on whom?

Over the past several months The Yale Forum has published a series of articles describing how major religious groups across America address climate change. Within the broader societal debate on this issue, however, the voices heard in these pieces may be outnumbered by those of a group with a very different take on the connections between religion and the environment: climate skeptics.

Since 2005, in op-eds published in newspapers (The Wall Street Journal, The Washington Examiner, The Washington Post, and The Washington Times), in magazines (Forbes, National Review, The Weekly Standard), and online (Fox News and Townhall and also climate-specific websites like Watts Up with That), conservative commentators have repeatedly described global warming as a religion.

So how does this use of religious language affect the public understanding of climate change? To answer this question, the Forum analyzed more than 250 op-eds, blog posts, and books published between 2005 and the present. The results suggest that this religious language may be most effective in fortifying the opinions of those using it: Calling global warming a “religion” effectively neutralizes appeals to “the scientific consensus.”

Taking the Measure of the Meme

To take your own quick measure of the global-warming-as-religion (hereafter GWAR) meme, try two related searches at Google: first search for “climate change” and “religion,” then for “global warming” and “religion.” The top ten items from the Forum’s two most recent searches (20 items in all) broke down as follows:

  • 10% were by religious groups calling for action on climate change,
  • 25% were about religious groups calling for action on climate change,
  • 10% were against religious groups opposed to action on climate,
  • 50% described concern for global warming as a religion, and
  • 5% rebutted those who described concern for global warming as a religion.

Based on this sample, one is more likely to encounter an article or op-ed about global warming as a religion than an article or op-ed explaining how or whether a particular religious group addresses climate change.

The dominance of the GWAR meme is even greater when one looks specifically at conservative venues. Over the past year, approximately 100 op-ed pieces that touched on global warming were published in nationally recognized conservative newspapers and/or by nationally syndicated columnists whose work is aggregated by Townhall. Ten of these pieces equated accepting the science on global warming with religious belief; none offered a religious argument for action on climate change.

During the peak years of Al Gore’s An Inconvenient Truth (2006–2008), the ratio was far higher. Roughly 40% of the more than 150 conservative op-eds penned in response to the documentary, to its Academy Award, or to Al Gore’s Nobel Peace Prize included language (prophet, priests, savior, crusade, faith, dogma, heresy, etc.) that framed concern for climate change as a religious belief. Some drew that analogy explicitly. (See, for example, Richard Lindzen’s March 8, 2007, op-ed piece for The Wall Street Journal and The Daily Mail (UK) — “Global Warming: The Bogus Religion of Our Age.”)

And since then several climate skeptics — Christopher Horner (2007), Iain Murray (2008), Roy Spencer (2008), Christopher Booker (2009), Ian Wishart (2009), Steve Goreham (2010), Larry Bell (2011), Brian Sussman (2012), and Robert Zubrin (2012) — have included the GWAR meme in their books.

A Brief History of the Global-Warming-as-Religion Meme

The global-warming-as-religion meme is an offshoot of the environmentalism-as-religion meme, which, according to New American Foundation fellow and Arizona State University Law Professor Joel Garreau, can be traced back to religious critiques of Lynn White’s 1967 essay in Science, “The Historical Roots of Our Ecologic Crisis.” By pinning the ecological blame on the Judaeo-Christian tradition’s instrumental view of nature, these authors argued, White seemed to call for the revival of nature worship.

Elements of these early critiques were reworked in what is perhaps the most well-known instance of the environmentalism-as-religion argument, Michael Crichton’s speech to the Commonwealth Club of San Francisco in 2003.

The first* example of the more specific global-warming-as-religion claim appears to be the aside in Republican Senator James Inhofe’s January 4, 2005, “update” to his “greatest hoax” speech: “Put simply, man-induced global warming is an article of religious faith.” Using slightly different language, Inhofe repeated this charge a few months later in his “Four Pillars of Climate Alarmism” speech.**

In between these two speeches, in a February 16, 2005, editorial for Capitalism Magazine by American Policy Center President Tom DeWeese, the GWAR meme gained titular status: “The New Religion Is Global Warming.”

But the most fully developed version of the global-warming-as-religion analogy is the nearly 5,000-word essay published on the Web in 2007 by retired British mathematician John Brignell — who cites Crichton’s 2003 speech in his opening paragraph.

The more generic environmentalism-as-religion meme now seems confined to Earth Day, which Emory University economics professor Paul Rubin described in an April 22, 2010, WSJ op-ed piece as environmentalism’s “holy day.” Two recent examples, from this past April, were provided by former business consultant W.A. Beatty and by Dale Hurd, a “news veteran” for the Christian Broadcasting Network.

The GWAR meme appears as opportunities — cool summers; early, late, or heavy snowstorms; or scandals — arise. And its meaning can vary accordingly.

Nature/Climate as Sacred

Some of the first American “environmentalists” — Henry David Thoreau, Ralph Waldo Emerson, John Muir — often used religious language. Nature was where they most vividly experienced the presence of God. But when contemporary environmentalists use quasi-religious language without explicitly avowing a particular faith, their opponents may suspect that nature itself has become the object of their worship. When James Lovelock named his homeostatic model of the planet and its atmosphere after the ancient Greek earth goddess, Gaia, he provided a new ground for this suspicion.

For conservatives, there are strong and weak versions of this charge.

The strong charge is “paganism,” that environmentalists or climate activists/scientists worship nature in ways akin to the practices of the Egyptian, Mesopotamian, Greek, and Roman empires in which the ancient Jews and early Christians lived. This strong charge is typically leveled by evangelicals who publicly profess their own faith. Physicist James Wanliss and his colleagues — whose book and DVD, Resisting the Green Dragon, offer “A Biblical Response to One of the Greatest Deceptions of the Day” — provide perhaps the most vivid example.

The weak version reduces the charge of paganism to misplaced values. Very arch religious language may still be used, but the meaning is now metaphorical. In these more frequent instances of the GWAR meme, conservatives accuse climate activists/scientists of essentializing climate, of being too willing to slow or even disable our economic engine because they believe Earth has an “optimal climate.”

Climate Science as Cult

“Cult” implies that a given set of beliefs or practices is arcane, outside the mainstream, and insular. Someone embedded in a cult will not acknowledge conflicting evidence. So whenever new facts or dramatic events challenge the validity of climate science, at least in the minds of conservative skeptics, “cult of global warming” op-eds appear. Major snowstorms, cold snaps, and years that fail to surpass 1998’s average annual temperature provide these new “facts.”

Odd religious news can also prompt “cult of global warming” op-eds. The third no-show of Harold Camping’s apocalypse provided the prompt, last fall, for op-eds by Michael Barone and Derek Hunter. (The “cult” in the title of Michael Barone’s piece, however, may be the work of the Post’s editor; the same piece appeared under a different title in The Washington Examiner.)

Climate Science as Corrupt Orthodoxy

But it’s hard to depict a thoroughly institutionalized effort like climate science as a cult. The international undertaking that is science is more plausibly compared with the Roman Catholic Church. And for climate skeptics, the best of the many possible instances of that church is the Roman Catholic Church of the late Renaissance, the church that condemned both Luther and Galileo.

The very Nobel public profiles of Al Gore and the IPCC, from 2006 to 2008, prompted many comparisons with priests and popes, cardinals and curia. Add in carbon offsets and the Reformation riffs practically wrote themselves. Conservative columnist Charles Krauthammer’s March 16, 2007, column in Time exemplifies this subgenre:

In other words, the rich reduce their carbon output by not one ounce. But drawing on the hundreds of millions of net worth in the Kodak theatre [for the “carbon-neutral” 2007 Academy Awards], they pull out lunch money to buy ecological indulgences. The last time the selling of pardons was prevalent — in a predecessor religion to environmentalism called Christianity — Martin Luther lost his temper and launched the Reformation.

(It should be noted, however, that climate activists and environmental journalists have themselves sometimes written about their ecological “sins.”)

While green hypocrisy was the primary target of Krauthammer’s 2007 column, orthodoxy and dogma are always at least secondary targets in this use of the GWAR meme. And shots were taken at them in a February 9, 2007, National Review column by Rich Lowry; a May 30, 2008, Washington Post column by Charles Krauthammer; a March 9, 2009, Townhall piece by Robert Knight; a January 13, 2010, Townhall column by Walter E. Williams; a November 29, 2011, Wall Street Journal column by Bret Stephens; and, most recently, an April 26, 2012, post by David Solway. This is the most common use of the GWAR meme.

Dissenting Religions and the Scientific Consensus

But one might argue that by depicting climate scientists and activists as members of an aloof and self-serving (and possibly self-deluding) priesthood, conservatives are themselves engaged in religious posturing, for self-righteous dissent is part of the DNA of the western religious tradition.

Ancient Israel was a small country surrounded by much more powerful empires. Some heroes of the Bible — e.g., Daniel, Shadrach, Meshach, and Abednego — worked as trusted bureaucrats within state-ecclesiastical systems based on cosmologies they did not believe in. When ordered to consent to the beliefs of their rulers, they refused.

During the Protestant Reformation religious dissent often became political dissent. Today’s evangelicals are dissenters from mainstream denominations that dissented first from the Church of England and then from King George. Now they dissent from Washington.

But in the U.S., Roman Catholics too can view themselves as a dissenting minority, as, for example, when the Catholic Bishops objected to parts of the new healthcare law.

In fact, Americans are so primed for dissensus that both sides in the climate debate find it plausible to claim the mantle of Galileo.


In the run-up to the December 2009 conference in Copenhagen, cartoonists Michael Ramirez and David Horsey published cartoons that drew exactly opposite conclusions from the history of science, including Galileo’s conflict with the Roman Catholic Church regarding Copernicus’s heliocentric model of the solar system.

Within this charged religious history, a steadfast minority (of Jews, early Christians, Protestants, or Puritans) has been correct more often than the majority, than the broader cultural consensus (of Egyptians/Assyrians/Babylonians/Persians, Greeks/Romans, Roman Catholics, or Anglicans). Thus the GWAR meme not only legitimizes dissent (because everyone is entitled to his or her own religious views), it also provides emotional reinforcement for it (because the “official” religion is almost always “false”). The Protestant vs. Catholic variant of the meme also reinforces climate skeptics’ narratives about greedy and scheming scientists and/or self-serving elites. For those who use it, the GWAR meme effectively inoculates them against “the scientific consensus.”

Managing the Meme

Much has been said and published by religious leaders trying to promote action on climate change. But these messages must compete against the global-warming-as-religion meme reinforced regularly in op-eds sent out by The Wall Street Journal to its two million plus subscribers and, more frequently, in columns posted by Townhall for its two million unique monthly visitors.

Are there counter-measures for this meme?

In his summer 2010 article in The New Atlantis, Joel Garreau, New American Foundation fellow and Lincoln Professor of Law, Culture and Values (Sandra Day O’Connor College of Law, Arizona State University), traced the emergence of environmentalism as a secular religion. In that piece, Garreau speculated that “the two faces of religious environmentalism — the greening of mainstream religion and the rise of carbon Calvinism — may each transform the political and policy debate over climate change.” In response to an e-mailed query from The Yale Forum, after stressing that he did not “conflate faith-based environmentalism with the scientific study of climate,” Garreau explained his “pragmat[ic]” outlook: “I just lay out the facts (as startling as they may be to some), observe that faith-based systems are ubiquitous in history, and then ask, in public policy terms, how you deal with this situation.”

Garreau said he is not surprised that “climate change deniers [might] wish to point out the ironies of faith-based environmentalism rising up in parallel with scientific environmentalism.” But he said he does not think that would have much effect. He suggested no countermeasures but did anticipate a possible line of attack: “It would hardly be surprising if there were a few under-examined pieties in their own world view.”

From the title of University of Maryland School of Public Policy professor Robert H. Nelson’s 2010 book, The New Holy Wars: Economic Religion vs. Environmental Religion in Contemporary America, one might infer that the playing field for climate policy might be leveled by calling attention to the equally religious faith in economics, in economic growth in particular. But what would be gained from a “religious” standoff between economics and environmentalism? In response to an e-mail question, Nelson listed three benefits:

First, … it helps us to understand … the … intensity of the disagreements about climate policy. Second, it offers a note of caution to all participants, given [that past] religious disagreements have too often escalated beyond all reason …. Third, … [s]eeing economics and environmentalism as religions, and discussing them as such, [would bring their] core value assumptions to the surface.

In other words, pushing back with the same religious language might be an effective countermeasure, at least initially. Then, Nelson added, “a secular religious ‘ecumenical movement’” could perhaps resolve the tensions between economics and environmentalism.

One clearly should proceed with caution in pursuing any “religious” countermeasures. The cultural and historical associations evoked by religious language do not necessarily favor “consensus,” especially a consensus presented in authoritative terms. In American history, religious groups have splintered far more often than they have united.

Bottom line: Climate communicators should expect and prepare for religious language. But they should weigh the subtle cultural messages religious language carries before deciding whether or how to use or respond to it.

*If readers know of an earlier example, please send the reference and/or the link to the author.
**Brian McCammack’s September 2007 American Quarterly article, “Hot Damned America: Evangelicalism and the Climate Policy Debate,” pointed the way to these two speeches by Senator Inhofe.

Michael Svoboda

Michael Svoboda (PhD, Hermeneutics) is an Asst. Prof. of Writing at The George Washington University. Previously the owner of an academic bookstore, he now tracks and analyzes efforts to communicate climate change, including the stream of research and policy published by NGOs. E-mail: msvoboda@yaleclimatemediaforum.org

Intriguing Habitats, and Careful Discussions of Climate Change (N.Y.Times)

THE ANIMAL LIFEBOAT

Gretchen Ertl for The New York Times. Pacific sea nettle jellyfish at the New England Aquarium in Boston. Zoos and aquariums are working to include educational elements about the environment without alienating visitors.

By 

Published: August 26, 2012

BOSTON — Sitting on an artificial mangrove island in the middle of the ray and shark “touch tank,” Lindsay Jordan, a staff member at the New England Aquarium, explained the rays’ eating habits as children and their parents trailed fingers through the water. “Does anyone know how we touch these animals when we are not at the aquarium?” she asked.

The children’s faces turned up expectantly.

“The ocean absorbs one-third of the world’s carbon dioxide emissions,” Ms. Jordan said, explaining that it upsets the food chain. “When you turn on your car, it affects them.”

Downstairs, next to the jellyfish tanks, a rhyming video told how the jellyfish population was exploding in the wild because they thrive in warmer waters. In the main room, a staff member pointed to a rare blue lobster, saying that some lobsters have been scuttling out of Massachusetts and settling in cooler climes to the north.

With many zoos and aquariums now working with conservation organizations and financed by individuals who feel strongly about threatened habitats and species, managers have been wrestling with how aggressive to be in educating visitors on the perils of climate change.

Surveys show that American zoos and aquariums enjoy a high level of public trust and are ideally positioned to teach.

Yet many managers are fearful of alienating visitors — and denting ticket sales — with tours or wall labels that dwell bleakly on damaged coral reefs, melting ice caps or dying trees.

“You don’t want them walking away saying, ‘I paid to get in, I bought my kid a hot dog, I just want to show my kid a fish — and you are making me feel bad about climate change,’ ” said Paul Boyle, the senior vice president for conservation and education at the Association of Zoos and Aquariums.

Some zoos and aquariums have therefore held back, relegating the theme to, say, a sign about Arctic melting in the polar bear exhibit. But many have headed in the other direction, putting climate change front and center in a way that they hope will inspire a young generation of zoogoers.

Working with cognitive scientists and experts in linguistics and anthropology, a coalition of aquariums set out in 2008 to develop a patter that would intrigue rather than daunt or depress the average visitor. After the group was pleased with the script, it secured a grant of about $1 million last year from the National Science Foundation to train staffs across the nation. This month, the foundation awarded the group an additional $5.5 million for a five-year education effort.

Dr. Boyle said that most of the association’s 224 members now have some sort of climate message.

The form varies from subtle to pointed. The zoos in Cincinnati and Toledo, Ohio, for instance, have installed prominent solar arrays over their parking lots to power exhibits and set an example. The San Diego Zoo and the Brookfield Zoo near Chicago have made their exhibits of polar bears and other Arctic species more direct about the threats posed by global warming.

So far the feedback has largely been positive, officials at most zoos say.

Ariella Camera, a counselor with a summer program run by Boston Rising, an antipoverty group, said some of her charges recently took part in a game at the New England Aquarium that taught them what emits carbon dioxide (many factories, most cars) and what absorbs it (trees and the ocean). They were then challenged to balance the two.

Afterward the students struck up a lively conversation about their carbon footprints, Ms. Camera said. “It was a very engaging presentation,” she said.

Such anecdotes gratify Howard Ris, the aquarium’s president. “We would like as many people, if not everyone, to leave encouraged to take action,” he said.

Others are dubious that it will work. “Zoos have been making claims about their educational value for 150 years,” said Jeffrey Hyson, a cultural historian and the director of the American studies program at St. Joseph’s University in Philadelphia. The zoos “say a lot more about what they think they are doing than they can really demonstrate.”

Zoo managers acknowledge that they initially struggled with the challenge of delivering bad news.

In the 1980s and ’90s, Dr. Boyle noted, some zoos and aquariums made a big push to emphasize threats like the depletion of the earth’s ozone layer, the razing of rain forests by loggers and farmers and the overfishing of the Pacific. Electronic boards toted up the numbers of acres being cleared, and enlarged photographs depicted denuded landscapes.

Surveys of visitors showed a backlash. “For lots of reasons, the institutions tended to approach the issues by talking about the huge scale of the problems,” Dr. Boyle said. “They wanted to attract people’s attention, but what we saw happening over time was that everyday people were overwhelmed.” It did not help that a partisan split had opened in the United States over whether global warming was under way, and whether human activity was the leading cause.

At the Georgia Aquarium in Atlanta, Brian Davis, the vice president for education and training, says to this day his institution ensures its guests will not hear the term global warming. Visitors are “very conservative,” he said. “When they hear certain terms, our guests shut down. We’ve seen it happen.”

Such hesitancy inspired the group of leading aquariums to develop, test and refine their model, which comes off as casual and chatty.

Word choices matter, research showed. The FrameWorks Institute, a nonprofit organization that studies how people process abstract concepts, found the phrase “greenhouse gas effect” perplexed people. “They think it is a nice place for plants to grow,” said FrameWorks’ president, Susan Bales. So her group advised substituting “heat-trapping blanket” to describe the accumulation of gases in the atmosphere.

Today’s guides also make a point of encouraging groups to focus first on the animals, leaving any unpleasant message for later.

At the New England Aquarium’s giant reef tank, visitors peered over the side and watched sand tiger sharks, sea turtles and tropical fish swim around a giant coral reef. As a diver entered the tank to feed the fish, a guide explained that the smaller ones tend to hide in coral for safety.

A few minutes passed before she told the crowd that corals around the world are bleaching and dying because of a pronounced rise in ocean temperature and acidity.

Upon leaving, the visitors were briefed on positive steps they could take, like using public transportation or bikes and being cautious about energy consumption.

Yet sometimes, the zoo animals are so entrancing that a climate-related message may fall on deaf ears.

Leanne Gaffney, who recently brought four high school students from a summer enrichment program to the New England Aquarium, said they were fascinated by creatures like leafy sea dragons and tropical snakes, but not so much by how their habitats were faring.

“They are teenage boys,” she said. “Mostly they just wanted to see the anacondas.”

The Cambridge Declaration on Consciousness

On this day of July 7, 2012, a prominent international group of cognitive neuroscientists, neuropharmacologists, neurophysiologists, neuroanatomists and computational neuroscientists gathered at The University of Cambridge to reassess the neurobiological substrates of conscious experience and related behaviors in human and non-human animals. While comparative research on this topic is naturally hampered by the inability of non-human animals, and often humans, to clearly and readily communicate about their internal states, the following observations can be stated unequivocally:

 The field of Consciousness research is rapidly evolving. Abundant new techniques and strategies for human and non-human animal research have been developed. Consequently, more data is becoming readily available, and this calls for a periodic reevaluation of previously held preconceptions in this field. Studies of non-human animals have shown that homologous brain circuits correlated with conscious experience and perception can be selectively facilitated and disrupted to assess whether they are in fact necessary for those experiences. Moreover, in humans, new non-invasive techniques are readily available to survey the correlates of consciousness.

 The neural substrates of emotions do not appear to be confined to cortical structures. In fact, subcortical neural networks aroused during affective states in humans are also critically important for generating emotional behaviors in animals. Artificial arousal of the same brain regions generates corresponding behavior and feeling states in both humans and non-human animals. Wherever in the brain one evokes instinctual emotional behaviors in non-human animals, many of the ensuing behaviors are consistent with experienced feeling states, including those internal states that are rewarding and punishing. Deep brain stimulation of these systems in humans can also generate similar affective states. Systems associated with affect are concentrated in subcortical regions where neural homologies abound. Young human and nonhuman animals without neocortices retain these brain-mind functions. Furthermore, neural circuits supporting behavioral/electrophysiological states of attentiveness, sleep and decision making appear to have arisen in evolution as early as the invertebrate radiation, being evident in insects and cephalopod mollusks (e.g., octopus).

 Birds appear to offer, in their behavior, neurophysiology, and neuroanatomy a striking case of parallel evolution of consciousness. Evidence of near human-like levels of consciousness has been most dramatically observed in African grey parrots. Mammalian and avian emotional networks and cognitive microcircuitries appear to be far more homologous than previously thought. Moreover, certain species of birds have been found to exhibit neural sleep patterns similar to those of mammals, including REM sleep and, as was demonstrated in zebra finches, neurophysiological patterns, previously thought to require a mammalian neocortex. Magpies in particular have been shown to exhibit striking similarities to humans, great apes, dolphins, and elephants in studies of mirror self-recognition.

 In humans, the effect of certain hallucinogens appears to be associated with a disruption in cortical feedforward and feedback processing. Pharmacological interventions in non-human animals with compounds known to affect conscious behavior in humans can lead to similar perturbations in behavior in non-human animals. In humans, there is evidence to suggest that awareness is correlated with cortical activity, which does not exclude possible contributions by subcortical or early cortical processing, as in visual awareness. Evidence that human and nonhuman animal emotional feelings arise from homologous subcortical brain networks provides compelling evidence for evolutionarily shared primal affective qualia.

We declare the following: “The absence of a neocortex does not appear to preclude an organism from experiencing affective states. Convergent evidence indicates that non-human animals have the neuroanatomical, neurochemical, and neurophysiological substrates of conscious states along with the capacity to exhibit intentional behaviors. Consequently, the weight of evidence indicates that humans are not unique in possessing the neurological substrates that generate consciousness. Nonhuman animals, including all mammals and birds, and many other creatures, including octopuses, also possess these neurological substrates.”

* The Cambridge Declaration on Consciousness was written by Philip Low and edited by Jaak Panksepp, Diana Reiss, David Edelman, Bruno Van Swinderen, Philip Low and Christof Koch. The Declaration was publicly proclaimed in Cambridge, UK, on July 7, 2012, at the Francis Crick Memorial Conference on Consciousness in Human and non-Human Animals, at Churchill College, University of Cambridge, by Low, Edelman and Koch. The Declaration was signed by the conference participants that very evening, in the presence of Stephen Hawking, in the Balfour Room at the Hotel du Vin in Cambridge, UK. The signing ceremony was memorialized by CBS 60 Minutes.

Information Overload in the Era of ‘Big Data’ (Science Daily)

ScienceDaily (Aug. 20, 2012) — Botany is plagued by the same problem as the rest of science and society: our ability to generate data quickly and cheaply is surpassing our ability to access and analyze it. In this age of big data, scientists facing too much information rely on computers to search large data sets for patterns that are beyond the capability of humans to recognize — but computers can only interpret data based on the strict set of rules in their programming.

New tools called ontologies provide the rules computers need to transform information into knowledge, by attaching meaning to data, thereby making those data retrievable by computers and more understandable to human beings. Ontology, from the Greek word for the study of being or existence, traditionally falls within the purview of philosophy, but the term is now used by computer and information scientists to describe a strategy for representing knowledge in a consistent fashion. An ontology in this contemporary sense is a description of the types of entities within a given domain and the relationships among them.

A new article in this month’s American Journal of Botany by Ramona Walls (New York Botanical Garden) and colleagues describes how scientists build ontologies such as the Plant Ontology (PO) and how these tools can transform plant science by facilitating new ways of gathering and exploring data.

When data from many divergent sources, such as data about some specific plant organ, are associated or “tagged” with particular terms from a single ontology or set of interrelated ontologies, the data become easier to find, and computers can use the logical relationships in the ontologies to correctly combine the information from the different databases. Moreover, computers can also use ontologies to aggregate data associated with the different subclasses or parts of entities.

For example, suppose a researcher is searching online for all examples of gene expression in a leaf. Any botanist performing this search would include experiments that described gene expression in petioles and midribs or in a frond. However, a search engine would not know that it needs to include these terms in its search — unless it was told that a frond is a type of leaf, and that every petiole and every midrib are parts of some leaf. It is this information that ontologies provide.
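The query expansion described above can be sketched in a few lines. This is a minimal illustration using a tiny hand-made ontology fragment, not the actual Plant Ontology data or any real ontology API; the term names and relation labels are assumptions chosen to mirror the leaf example.

```python
# Toy ontology fragment: each term maps to (relation, parent) pairs.
# "is_a" = frond is a type of leaf; "part_of" = petiole is part of a leaf.
ONTOLOGY = {
    "frond":   [("is_a", "leaf")],
    "petiole": [("part_of", "leaf")],
    "midrib":  [("part_of", "leaf")],
    "leaf":    [("is_a", "plant organ")],
}

def terms_under(target, ontology):
    """Return the target plus every term whose chain of is_a/part_of
    relations reaches it (a transitive closure). A search engine can
    then match data tagged with any of these terms, not just `target`."""
    hits = {target}
    changed = True
    while changed:
        changed = False
        for term, relations in ontology.items():
            if term not in hits and any(parent in hits for _, parent in relations):
                hits.add(term)
                changed = True
    return hits

# A search for gene expression "in a leaf" now also retrieves
# experiments annotated with "frond", "petiole", or "midrib".
print(sorted(terms_under("leaf", ONTOLOGY)))
```

In a real system the closure would be computed over a reasoner or triple store rather than a Python dict, but the logic — following subclass and part-of links to widen a query — is the same.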

The article in the American Journal of Botany by Walls and colleagues describes what ontologies are, why they are relevant to plant science, and some of the basic principles of ontology development. It includes an overview of the ontologies that are relevant to botany, with a more detailed description of the PO and the challenges of building an ontology that covers all green plants. The article also describes four keys areas of plant science that could benefit from the use of ontologies: (1) comparative genetics, genomics, phenomics, and development; (2) taxonomy and systematics; (3) semantic applications; and (4) education. Although most of the examples in this article are drawn from plant science, the principles could apply to any group of organisms, and the article should be of interest to zoologists as well.

As genomic and phenomic data become available for more species, many different research groups are embarking on the annotation of their data and images with ontology terms. At the same time, cross-species queries are becoming more common, causing more researchers in plant science to turn to ontologies. Ontology developers are working with the scientists who generate data to make sure ontologies accurately reflect current science, and with database developers and publishers to find ways to make it easier for scientist to associate their data with ontologies.

Journal Reference:

R. L. Walls, B. Athreya, L. Cooper, J. Elser, M. A. Gandolfo, P. Jaiswal, C. J. Mungall, J. Preece, S. Rensing, B. Smith, D. W. Stevenson. Ontologies as integrative tools for plant science. American Journal of Botany, 2012; 99 (8): 1263. DOI: 10.3732/ajb.1200222

Cloud Brightening to Control Global Warming? Geoengineers Propose an Experiment (Science Daily)

A conceptualized image of an unmanned, wind-powered, remotely controlled ship that could be used to implement cloud brightening. (Credit: John McNeill)

ScienceDaily (Aug. 20, 2012) — Even though it sounds like science fiction, researchers are taking a second look at a controversial idea that uses futuristic ships to shoot salt water high into the sky over the oceans, creating clouds that reflect sunlight and thus counter global warming.

University of Washington atmospheric physicist Rob Wood describes a possible way to run an experiment to test the concept on a small scale in a comprehensive paper published this month in the journal Philosophical Transactions of the Royal Society.

The point of the paper — which includes updates on the latest study into what kind of ship would be best to spray the salt water into the sky, how large the water droplets should be and the potential climatological impacts — is to encourage more scientists to consider the idea of marine cloud brightening and even poke holes in it. In the paper, he and a colleague detail an experiment to test the concept.

“What we’re trying to do is make the case that this is a beneficial experiment to do,” Wood said. With enough interest in cloud brightening from the scientific community, funding for an experiment may become possible, he said.

The theory behind so-called marine cloud brightening is that adding particles, in this case sea salt, to the sky over the ocean would form large, long-lived clouds. Clouds appear when water forms around particles. Since there is a limited amount of water in the air, adding more particles creates more, but smaller, droplets.

“It turns out that a greater number of smaller drops has a greater surface area, so it means the clouds reflect a greater amount of light back into space,” Wood said. That creates a cooling effect on Earth.
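Wood's point follows from simple geometry: if a fixed water volume V is split into N equal spherical droplets, each has radius r = (3V / 4πN)^(1/3), so the total surface area N · 4πr² grows as N^(1/3). A quick check with made-up numbers (the function name and droplet counts below are illustrative, not from the paper):

```python
import math

def total_droplet_area(total_volume_m3, n_droplets):
    """Total surface area when a fixed water volume is split into
    n equal spherical droplets; the total grows as n ** (1/3)."""
    r = (3 * total_volume_m3 / (4 * math.pi * n_droplets)) ** (1 / 3)
    return n_droplets * 4 * math.pi * r ** 2

# The same hypothetical 1 m^3 of cloud water, split into more droplets:
v = 1.0
a1 = total_droplet_area(v, 10**12)
a2 = total_droplet_area(v, 8 * 10**12)
print(round(a2 / a1, 6))
# 8x as many droplets -> 2x the reflective surface area (8 ** (1/3) == 2)
```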

Marine cloud brightening is part of a broader concept known as geoengineering which encompasses efforts to use technology to manipulate the environment. Brightening, like other geoengineering proposals, is controversial for its ethical and political ramifications and the uncertainty around its impact. But those aren’t reasons not to study it, Wood said.

“I would rather that responsible scientists test the idea than groups that might have a vested interest in proving its success,” he said. The danger with private organizations experimenting with geoengineering is that “there is an assumption that it’s got to work,” he said.

Wood and his colleagues propose trying a small-scale experiment to test feasibility and begin to study effects. The test should start by deploying sprayers on a ship or barge to ensure that they can inject enough particles of the targeted size to the appropriate elevation, Wood and a colleague wrote in the report. An airplane equipped with sensors would study the physical and chemical characteristics of the particles and how they disperse.

The next step would be to use additional airplanes to study how the cloud develops and how long it remains. The final phase of the experiment would send out five to 10 ships spread out across a 100 kilometer, or 62 mile, stretch. The resulting clouds would be large enough so that scientists could use satellites to examine them and their ability to reflect light.

Wood said there is very little chance of long-term effects from such an experiment. Based on studies of pollution, whose particles cause a similar reaction in clouds, scientists know that the impact of adding particles to clouds lasts only a few days.

Still, such an experiment would be unusual in the world of climate science, where scientists observe rather than actually try to change the atmosphere.

Wood notes that running the experiment would advance knowledge around how particles like pollutants impact the climate, although the main reason to do it would be to test the geoengineering idea.

A phenomenon that inspired marine cloud brightening is ship trails: clouds that form behind the paths of ships crossing the ocean, similar to the trails that airplanes leave across the sky. Ship trails form around particles released from burning fuel.

But in some cases ship trails make clouds darker. “We don’t really know why that is,” Wood said.

Despite increasing interest from scientists like Wood, there is still strong resistance to cloud brightening.

“It’s a quick-fix idea when really what we need to do is move toward a low-carbon emission economy, which is turning out to be a long process,” Wood said. “I think we ought to know about the possibilities, just in case.”

The authors of the paper are treading cautiously.

“We stress that there would be no justification for deployment of [marine cloud brightening] unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favor of such action,” they wrote in the paper’s summary.

There are 25 authors on the paper, including scientists from University of Leeds, University of Edinburgh and the Pacific Northwest National Laboratory. The lead author is John Latham of the National Center for Atmospheric Research and the University of Manchester, who pioneered the idea of marine cloud brightening.

Wood’s research was supported by the UW College of the Environment Institute.

Journal Reference:

J. Latham, K. Bower, T. Choularton, H. Coe, P. Connolly, G. Cooper, T. Craft, J. Foster, A. Gadian, L. Galbraith, H. Iacovides, D. Johnston, B. Launder, B. Leslie, J. Meyer, A. Neukermans, B. Ormond, B. Parkes, P. Rasch, J. Rush, S. Salter, T. Stevenson, H. Wang, Q. Wang, R. Wood. Marine cloud brightening. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2012; 370 (1974): 4217. DOI: 10.1098/rsta.2012.0086

Exotic traits of the 'God particle' surprise physicists (Folha de São Paulo)

JC e-mail 4559, of August 10, 2012.

A study with Brazilian participation indicates that the Higgs boson may not fit today's most widely accepted theory. A preliminary analysis hints at still-unknown particles; other scientists urge caution with the data.

The God particle is, it seems, just the way the devil likes it: badly behaved. That is what a preliminary analysis of data collected at the LHC, the world's largest particle accelerator, indicates.

The work, by Oscar Éboli of the Institute of Physics at the University of São Paulo (USP), suggests that the so-called Higgs boson, which would be responsible for giving mass to everything that exists, is not behaving as it should, judging by the theory that predicted its existence, the Standard Model. If confirmed, the particle's anomalous behavior would open the way to a new era of physics.

The discovery of the possible boson, announced with fanfare last month, was celebrated as the completion of a glorious stage in the study of the fundamental particles of matter. Its existence, in short, would explain why the Sun can produce its energy and why creatures like us can exist.

Given its importance to the consistency of the Universe (and in an analogy with the biblical story of the Tower of Babel), Nobel laureate physicist Leon Lederman nicknamed the boson the "God particle."

To analyze the Higgs boson, one must first produce a collision between protons at very high speed, the primary function of the LHC. From the high-energy impact, scores of new particles emerge, among them the Higgs, which quickly decays, as physicists say.

Because it is highly unstable, the boson "breaks down" as the collision energy dissipates, and other particles appear in its place. It is this byproduct that can be detected and point to the existence of the Higgs boson. This requires a great many impacts, however, before the statistics begin to suggest the presence of the sought-after boson.

The data collected so far are sufficient to indicate the particle's existence, but its specific characteristics could not yet be determined. "We are still at an early stage of exploring its properties," says Éboli. "However, there is an indication that the Higgs decays into two photons [particles of light] more often than would be expected in the Standard Model."

The results of this preliminary analysis were posted on Arxiv.org, an online repository of physics studies, and covered in the magazine Pesquisa FAPESP.

A welcome surprise – The news excites scientists. "For most physicists, the Standard Model is a good representation of nature, but it is not the final theory," says Éboli. "If it is indeed confirmed that the Higgs is decaying into two photons more than expected, this may mean that new particles are within the LHC's discovery reach."

It could be the first glimpse of a new "zoo" of elementary building blocks of matter. Such exotic particles were expected to start appearing at the LHC's elevated energies.

All very interesting, but nothing settled. "It is very serious work, but I think it is still too early to draw any conclusion about whether or not this is the standard Higgs," says Sérgio Novaes, a researcher at Unesp who takes part in one of the experiments that detected the Higgs boson. "By the end of the year things will be a little clearer," he estimates.

Populations Survive Despite Many Deleterious Mutations: Evolutionary Model of Muller’s Ratchet Explored (Science Daily)

ScienceDaily (Aug. 10, 2012) — From protozoans to mammals, evolution has created more and more complex structures and better-adapted organisms. This is all the more astonishing as most genetic mutations are deleterious. Especially in small asexual populations that do not recombine their genes, unfavourable mutations can accumulate. This process is known as Muller’s ratchet in evolutionary biology. The ratchet, proposed by the American geneticist Hermann Joseph Muller, predicts that the genome deteriorates irreversibly, leaving populations on a one-way street to extinction.

Equilibrium of mutation and selection processes: A population can be divided into groups of individuals that carry different numbers of deleterious mutations. Groups with few mutations are amplified by selection but lose members to other groups by mutation. Groups with many mutations don’t reproduce as much, but gain members by mutation. (Credit: © Richard Neher/MPI for Developmental Biology)

In collaboration with colleagues from the US, Richard Neher from the Max Planck Institute for Developmental Biology has shown mathematically how Muller’s ratchet operates and he has investigated why populations are not inevitably doomed to extinction despite the continuous influx of deleterious mutations.

The great majority of mutations are deleterious. “Due to selection individuals with more favourable genes reproduce more successfully and deleterious mutations disappear again,” explains the population geneticist Richard Neher, leader of an independent Max Planck research group at the Max Planck Institute for Developmental Biology in Tübingen, Germany. However, in small populations such as an asexually reproducing virus early during infection, the situation is not so clear-cut. “It can then happen by chance, by stochastic processes alone, that deleterious mutations in the viruses accumulate and the mutation-free group of individuals goes extinct,” says Richard Neher. This is known as a click of Muller’s ratchet, which is irreversible — at least in Muller’s model.

Muller published his model on the evolutionary significance of deleterious mutations in 1964. Yet to date a quantitative understanding of the ratchet’s processes was lacking. Richard Neher and Boris Shraiman from the University of California in Santa Barbara have now published a new theoretical study on Muller’s ratchet. They chose a comparably simple model with only deleterious mutations all having the same effect on fitness. The scientists assumed selection against those mutations and analysed how fluctuations in the group of the fittest individuals affected the less fit ones and the whole population. Richard Neher and Boris Shraiman discovered that the key to the understanding of Muller’s ratchet lies in a slow response: If the number of the fittest individuals is reduced, the mean fitness decreases only after a delay. “This delayed feedback accelerates Muller’s ratchet,” Richard Neher comments on the results. It clicks more and more frequently.
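The model Neher and Shraiman analyse (an asexual population in which every deleterious mutation carries the same fitness cost, and the fittest class can be lost by chance) can be sketched as a toy stochastic simulation. All parameter values below are illustrative, not taken from the paper:

```python
import random

def ratchet_clicks(pop_size=200, generations=2000, mut_rate=0.3, s=0.02, seed=1):
    """Count clicks of Muller's ratchet: events in which the least-loaded
    class (fewest deleterious mutations) is lost for good. Each individual
    is represented simply by its number of deleterious mutations."""
    rng = random.Random(seed)
    pop = [0] * pop_size          # start mutation-free
    best, clicks = 0, 0
    for _ in range(generations):
        # Selection: fitness (1 - s)^k, resample the next generation.
        weights = [(1 - s) ** k for k in pop]
        pop = rng.choices(pop, weights=weights, k=pop_size)
        # Mutation: each offspring may gain one new deleterious mutation.
        pop = [k + (1 if rng.random() < mut_rate else 0) for k in pop]
        if min(pop) > best:       # fittest class lost: an irreversible click
            clicks += min(pop) - best
            best = min(pop)
    return clicks

print(ratchet_clicks())  # a small asexual population steadily accumulates load
```

With only deleterious mutations, the click counter never decreases; adding rare beneficial mutations, as in the second study, is what would allow the load to stay balanced.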

“Our results are valid for a broad range of conditions and parameter values — for a population of viruses as well as a population of tigers.” However, he does not expect to find the model’s conditions one-to-one in nature. “Models are made to understand the essential aspects, to identify the critical processes,” he explains.

In a second study Richard Neher, Boris Shraiman and several other US scientists from the University of California in Santa Barbara and Harvard University in Cambridge investigated how a small asexual population could escape Muller’s ratchet. “Such a population can only stay in a steady state for a long time when beneficial mutations continually compensate for the negative ones that accumulate via Muller’s ratchet,” says Richard Neher. For their model the scientists assumed a steady environment and suggest that there can be a mutation-selection balance in every population. They have calculated the rate of favourable mutations required to maintain the balance. The result was surprising: Even under unfavourable conditions, a comparably small proportion of positive mutations, in the range of several percent, is sufficient to sustain a population.

These findings could explain the long-term maintenance of mitochondria, the so-called power plants of the cell that have their own genome and divide asexually. By and large, evolution is driven by random events or as Richard Neher says: “Evolutionary dynamics are very stochastic.”

Deeply Held Religious Beliefs Prompting Sick Kids to Be Given ‘Futile’ Treatment (Science Daily)

ScienceDaily (Aug. 13, 2012) — Parental hopes of a “miraculous intervention,” prompted by deeply held religious beliefs, are leading to very sick children being subjected to futile care and needless suffering, suggests a small study in the Journal of Medical Ethics.

The authors, who comprise children’s intensive care doctors and a hospital chaplain, emphasise that religious beliefs provide vital support to many parents whose children are seriously ill, as well as to the staff who care for them.

But they have become concerned that deeply held beliefs are increasingly leading parents to insist on the continuation of aggressive treatment that ultimately is not in the best interests of the sick child.

It is time to review the current ethics and legality of these cases, they say.

They base their conclusions on a review of 203 cases which involved end of life decisions over a three year period.

In 186 of these cases, agreement was reached between the parents and healthcare professionals about withdrawing aggressive, but ultimately futile, treatment.

But in the remaining 17 cases, extended discussions with the medical team and local support had failed to resolve differences of opinion with the parents over the best way to continue to care for the very sick child in question.

The parents had insisted on continuing full active medical treatment, while doctors had advocated withdrawing or withholding further intensive care on the basis of the overwhelming medical evidence.

The cases in which withdrawal or withholding of intensive care was considered to be in the child’s best interests were consistent with the Royal College of Paediatrics and Child Health guidance.

Eleven of these cases (65%) involved directly expressed religious claims that intensive care should not be stopped because of the expectation of divine intervention and a complete cure, together with the conviction that the opinion of the medical team was overly pessimistic and wrong.

Various different faiths were represented among the parents, including Christian fundamentalism, Islam, Judaism, and Roman Catholicism.

Five of the 11 cases were resolved after meeting with the relevant religious leaders outside the hospital, and intensive care was withdrawn in a further case after a High Court order.

But five cases were not resolved, so intensive care was continued. Four of these children eventually died; one survived with profound neurological disability.

The six of the 17 cases in which religious belief was not a cited factor were all resolved without further recourse to legal, ethical, or socio-religious support. Intensive care was withdrawn in all these children, five of whom died and one of whom survived, but with profound neurological disability.

The authors emphasise that parental reluctance to allow treatment to be withdrawn is “completely understandable as [they] are defenders of their children’s rights, and indeed life.”

But they argue that when children are too young to be able to actively subscribe to their parents’ religious beliefs, a default position in which parental religion is not the determining factor might be more appropriate.

They cite Article 3 of the Human Rights Act, which aims to ensure that no one is subjected to torture or inhumane or degrading treatment or punishment.

“Spending a lifetime attached to a mechanical ventilator, having every bodily function supervised and sanitised by a carer or relative, leaving no dignity or privacy to the child and then adult, has been argued as inhumane,” they argue.

And they conclude: “We suggest it is time to reconsider current ethical and legal structures and facilitate rapid default access to courts in such situations when the best interests of the child are compromised in expectation of the miraculous.”

In an accompanying commentary, the journal’s editor, Professor Julian Savulescu, advocates: “Treatment limitation decisions are best made, not in the alleged interests of patients, but on distributive justice grounds.”

In a publicly funded system with limited resources, these should be given to those whose lives could be saved rather than to those who are very unlikely to survive, he argues.

“Faced with the choice between providing an intensive care bed to a [severely brain damaged] child and one who has been at school and was hit by a cricket ball and will return to normal life, we should provide the bed to the child hit by the cricket ball,” he writes.

In further commentaries, Dr Steve Clarke of the Institute for Science and Ethics maintains that doctors should engage with devout parents on their own terms.

“Devout parents, who are hoping for a miracle, may be able to be persuaded, by the lights of their own personal…religious beliefs, that waiting indefinite periods of time for a miracle to occur while a child is suffering, and while scarce medical equipment is being denied to other children, is not the right thing to do,” he writes.

Leading ethicist, Dr Mark Sheehan, argues that these ethical dilemmas are not confined to fervent religious belief, and to polarise the issue as medicine versus religion is unproductive, and something of a “red herring.”

Referring to the title of the paper, Charles Foster, of the University of Oxford, suggests that the authors have asked the wrong question. “The legal and ethical orthodoxy is that no beliefs, religious or secular, should be allowed to stonewall the best interests of the child,” he writes.

How Do They Do It? Predictions Are in for Arctic Sea Ice Low Point (Science Daily)

ScienceDaily (Aug. 14, 2012) — It’s become a sport of sorts, predicting the low point of Arctic sea ice each year. Expert scientists with decades of experience do it but so do enthusiasts, whose guesses are gamely included in a monthly predictions roundup collected by Sea Ice Outlook, an effort supported by the U.S. government.

Arctic sea ice, as seen from an ice breaker. (Credit: Bonnie Light, UW)

When averaged, the predictions have come in remarkably close to the mark in the past two years. But the low and high predictions are off by hundreds of thousands of square kilometers.

Researchers are working hard to improve their ability to more accurately predict how much Arctic sea ice will remain at the end of summer. It’s an important exercise because knowing why sea ice declines could help scientists better understand climate change and how sea ice is evolving.

This year, researchers from the University of Washington’s Polar Science Center are the first to include new NASA sea ice thickness data collected by airplane in a prediction.

They expect 4.4 million square kilometers of remaining ice (about 1.7 million square miles), just barely more than the 4.3 million square kilometers in 2007, the lowest year on record for Arctic sea ice. The median of 23 predictions collected by the Sea Ice Outlook and released on Aug. 13 is 4.3 million.

“One drawback to making predictions is historically we’ve had very little information about the thickness of the ice in the current year,” said Ron Lindsay, a climatologist at the Polar Science Center, a department in the UW’s Applied Physics Laboratory.

To make their prediction, Lindsay and Jinlun Zhang, an oceanographer in the Polar Science Center, start with a widely used model pioneered by Zhang and known as the Pan-Arctic Ice Ocean Modeling and Assimilation System. That system combines available observations with a model to track sea ice volume, which includes both ice thickness and extent.

But obtaining observations about current-year ice thickness in order to build their short-term prediction is tough. NASA is currently in the process of designing a new satellite that will replace one that used to deliver ice thickness data but has since failed. In the meantime, NASA is running a program called Operation IceBridge that uses airplanes to survey sea ice as well as Arctic ice sheets.

“This is the first year they made a concerted effort to get the data from the aircraft, process it and get it into the hands of scientists in a timely manner,” Lindsay said. “In the past, we’ve gotten data from submarines, moorings or satellites but none of that data was available in a timely manner. It took months or even years.”

There’s a shortcoming to the IceBridge data, however: It’s only available through March. The radar used to measure snow depth on the surface of the ice, an important element in the observation system, has trouble accurately gauging the depth once it has melted and so the data is only collected through the early spring before the thaw.

The UW scientists have developed a method for informing their prediction that is starting to be used by others. Researchers have struggled with how best to forecast the weather in the Arctic, which affects ice melt and distribution.

“Jinlun came up with the idea of using the last seven summers. Because the climate is changing so fast, only the recent summers are probably relevant,” Lindsay said.

The result is seven different possibilities of what might happen. “The average of those is our best guess,” Lindsay said.
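The "last seven summers" idea is an ensemble forecast: run the ice model forward once under each recent summer's weather and take the average of the outcomes as the best guess. A schematic sketch; the toy melt model and all numbers below are invented for illustration (the real system is the far more detailed PIOMAS model):

```python
def project_september_extent(spring_state, summer_melt):
    """Toy projection: September value after one summer's melt
    (stands in for a full model run under that summer's weather)."""
    return max(spring_state - summer_melt, 0.0)

spring_state = 6.0                                    # hypothetical current state
recent_summer_melts = [1.2, 1.5, 1.9, 1.4, 2.1, 1.7, 1.6]  # last seven summers

# One model run per recent summer gives seven possible outcomes...
outcomes = [project_september_extent(spring_state, m) for m in recent_summer_melts]

# ...and their average is the best guess.
best_guess = sum(outcomes) / len(outcomes)
print(round(best_guess, 2))  # about 4.37 under these made-up numbers
```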

Despite the progress in making predictions, the researchers say their ability to foretell the future will always be limited. Because they can’t forecast the weather very far in advance, and because the ice is strongly affected by winds, predictions made far in advance carry little confidence beyond what the long-term trend suggests.

“The accuracy of our prediction really depends on time,” Zhang said. “Our June 1 prediction for the Sept. 15 low point has high uncertainty but as we approach the end of June or July, the uncertainty goes down and the accuracy goes up.”

In hindsight, that’s true historically for the average predictions collected by Study of Environmental Arctic Change’s Sea Ice Outlook, a project funded by the National Science Foundation and the National Oceanic and Atmospheric Administration.

While the competitive aspect of the predictions is fun, the researchers aren’t in it to win it.

“Essentially it’s not for prediction but for understanding,” Zhang said. “We do it to improve our understanding of sea ice processes, in terms of how dynamic processes affect the seasonal evolution of sea ice.”

That may not be entirely the same for the enthusiasts who contribute a prediction. One climate blog polls readers in the summer for their best estimate of the sea ice low point. It’s included among the predictions collected by the Sea Ice Outlook, with an asterisk noting it as a “public outlook.”

The National Science Foundation and NASA fund the UW research into the Arctic sea ice low point.

New legislation will give natural disaster prevention a scientific basis, experts say (Fapesp)

A law enacted in April will require municipalities to produce geotechnical charts, a multidisciplinary instrument that will guide the deployment of alert systems and master plans (Valter Campanato/ABr)

08/08/2012

By Fábio de Castro

Agência FAPESP – In January 2011, floods and landslides left about a thousand people dead and 500 missing in the mountain region of Rio de Janeiro state. The tragedy exposed the precariousness of alert systems in Brazil and was regarded by experts as definitive proof that investment in disaster prevention was needed.

The most important outcome of that assessment was Law 12,608, enacted in April, which establishes the National Policy for Protection and Civil Defense and creates a disaster information and monitoring system, according to experts gathered at the seminar "Paths of the national policy for the defense of risk areas," held by the Polytechnic School of the University of São Paulo (USP) on August 6.

The new law requires city governments to invest in urban planning to prevent disasters such as floods and landslides. According to the experts, disaster prevention will for the first time rest on a solid technical and scientific foundation, since the law stipulates that, to do this planning, every city government will need to produce geotechnical charts of its municipality.

Katia Canil, a researcher at the Environmental Risk Laboratory of the Institute for Technological Research (IPT), said that city governments will have two years to produce the geotechnical charts that will underpin their master plans, which must include disaster prevention and mitigation measures. Municipalities that fail to present this planning will not receive federal funds for prevention and mitigation works.

"Geotechnical charts are cartographic documents that gather information on the geological and geomorphological characteristics of municipalities, identifying geological risks and facilitating the creation of rules for urban occupation. With this instrument made mandatory by law, we can have disaster prevention strategies drawn up on the basis of technical and scientific knowledge," Canil told Agência FAPESP.

Brazil's first geotechnical chart was produced in 1979, in the municipality of Santos (SP), but even so the instrument has remained little used in the country. According to Canil, institutionalizing the tool will be an important factor in adapting master plans to the geotechnical characteristics of the terrain.

"Few municipalities have a geotechnical chart, because it was not a mandatory instrument. Now that picture should change. But the legislation will generate great demand for specialists in several fields, because geotechnical charts integrate a range of interdisciplinary data," said the IPT researcher.

Geotechnical charts bring together documents resulting from geological and geotechnical field surveys, as well as laboratory analyses, with the aim of synthesizing all available knowledge about the physical environment and its relation to the geological and human processes present at the site. "And all of this needs to be expressed in language that managers can understand," Canil said.

Cities will have to organize themselves to produce geotechnical charts, and the technical training required is not trivial. "It is not just a matter of overlaying maps. It takes experience combined with training in fields such as geology, engineering, geotechnical engineering, cartography, geography, architecture, and urbanism," said Canil. IPT already offers a training course on producing geotechnical charts.

A major difficulty in producing the charts will be the lack of basic geological mapping in Brazilian municipalities. "Most municipalities do not have primary data, such as geomorphological, pedological, and geological maps," said Canil.

National prevention plan

The January 2011 tragedy in Rio de Janeiro's mountain region was a milestone that changed the course of discussions about disasters, definitively highlighting the central role of prevention, according to Carlos Nobre, secretary of Research and Development Policies and Programs at the Ministry of Science, Technology and Innovation (MCTI).

"That episode was a jolt that shook Brazilian awareness of major disasters. It became obvious to managers and to the public that prevention must be emphasized. It was a milestone that changed our outlook forever: prevention is fundamental," he said during the event.

According to Nobre, who is also a researcher at the National Institute for Space Research (Inpe) and a member of the coordination of the FAPESP Research Program on Global Climate Change, international experience shows that prevention can reduce fatalities in natural disasters by up to 90%, in addition to cutting material damage by about 35%. "Beyond saving lives, the savings in material losses more than pays for all the investment in prevention," he said.

According to Nobre, engineering will play an increasingly important role in prevention as natural disasters become more extreme as a consequence of climate change.

"The 21st-century engineer will need to be trained in sustainability engineering, a cross-cutting field of engineering that will gain ever more ground. Engineering, if well conducted, is central to solving some of today's main problems," he said.

According to Nobre, besides the new legislation, which will require planning based on municipal geotechnical charts, Brazil has several other initiatives in disaster prevention. One of them will be announced this Wednesday (August 8): the National Plan for Natural Disaster Prevention, which emphasizes works aimed at installing alert systems.

"There are large-scale works needed in Brazil, especially regarding alert systems. One of the important elements of the new plan is early warning. International experience shows that an alert issued up to two hours before a landslide can save lives," he said.

According to Nobre, the plan's initiatives will be consistent with the new legislation. The federal government is to invest R$ 4.6 billion over the coming months in disaster prevention initiatives in the states of Rio de Janeiro, Minas Gerais, and Santa Catarina.

But to apply for federal funds, a municipality must meet a series of requirements, such as incorporating protection and civil defense actions into municipal planning, identifying and mapping natural disaster risk areas, preventing new occupation, and inspecting buildings in those areas.

According to Nobre, another disaster prevention measure was the establishment of MCTI's National Center for Natural Disaster Monitoring and Alerts (Cemaden), which began operating in December 2011 at the Inpe campus in Cachoeira Paulista (SP).

"That center already played an important role in weather forecasting, but it was restructured and hired 35 professionals. Cemaden is emerging as an emblem of the new alert systems: a design that brings together geologists, meteorologists, and natural disaster specialists to identify vulnerabilities, something rare in the world," he said.

According to him, this new structure already has an alert system in operation. "It is a system that will still need to be evaluated over time. But so far, since December 2011, more than 100 alerts have been issued. It will take the country several years to reduce fatalities to the levels of countries with good prevention systems. But we are on the right track," said Nobre.

New Book Explores ‘Noah’s Flood’: Says Bible and Science Can Get Along (Science Daily)

ScienceDaily (Aug. 14, 2012) — David Montgomery is a geomorphologist, a geologist who studies changes to topography over time and how geological processes shape landscapes. He has seen firsthand evidence of how the forces that have shaped Earth run counter to some significant religious beliefs.

But the idea that scientific reason and religious faith are somehow at odds with each other, he said, “is, in my view, a false dichotomy.”

In a new book, “The Rocks Don’t Lie: A Geologist Investigates Noah’s Flood” (Aug. 27, 2012, W.W. Norton), Montgomery explores the long history of religious thinking — particularly among Christians — on matters of geological discovery, from the writings of St. Augustine 1,700 years ago to the rise in the mid-20th century of the most recent rendering of creationism.

“The purpose is not to tweak people of faith but to remind everyone about the long history in the faith community of respecting what we can learn from observing the world,” he said.

Many of the earliest geologists were clergy, he said. Nicolas Steno, considered the founder of modern geology, was a 17th century Roman Catholic priest who has achieved three of the four steps to being declared a saint in the church.

“Though there are notable conflicts between religion and science — the famous case of Galileo Galilei, for example — there also is a church tradition of working to reconcile biblical stories with known scientific fact,” Montgomery said.

“What we hear today as the ‘Christian’ positions are really just one slice of a really rich pie,” he said.

For nearly two centuries there has been overwhelming geological evidence that a global flood, as depicted in the story of Noah in the biblical book of Genesis, could not have happened. Not only is there not enough water in the Earth system to account for water levels above the highest mountaintop, but uniformly rising levels would not allow the water to have the erosive capabilities attributed to Noah’s Flood, Montgomery said.

Some rock formations millions of years old show no evidence of such large-scale water erosion. Montgomery is convinced any such flood must have been, at best, a regional event, perhaps a catastrophic deluge in Mesopotamia. There are, in fact, Mesopotamian stories with details very similar to, but predating, the biblical story of Noah’s Flood.

“If your world is small enough, all floods are global,” he said.

Perhaps the greatest influence in prompting him to write “The Rocks Don’t Lie” was a 2002 expedition to the Tsangpo River on the Tibetan Plateau. In the fertile river valley he found evidence in sediment layers that a great lake had formed in the valley many centuries ago, not once but numerous times. Downstream he found evidence that a glacier on several occasions advanced far enough to block the river, creating the huge lake.

But ice makes an unstable dam, and over time the ice thinned and finally gave way, unleashing a tremendous torrent of water down the deepest gorge in the world. It was only after piecing the story together from geological evidence that Montgomery learned that local oral traditions told of exactly this kind of great flood.

“To learn that the locals knew about it and talked about it for the last thousand years really jolted my thinking. Here was evidence that a folk tale might be reality based,” he said.

He has seen evidence of huge regional floods in the scablands of Eastern Washington, carved by torrents when glacial Lake Missoula breached its ice dam in Montana and raced across the landscape, and he found Native American stories that seem to tell of this catastrophic flood.

Other flood stories dating back to the early inhabitants of the Pacific Northwest and from various islands in the Pacific Ocean, for example, likely tell of inundation by tsunamis after large earthquakes.

But he noted that in some regions of the world — in Africa, for example — there are no flood stories in the oral traditions because there the annual floods help sustain life rather than bring destruction.

Floods are not always responsible for major geological features. Hiking a trail from the floor of the Grand Canyon to its rim, Montgomery saw unmistakable evidence of the canyon being carved over millions of years by the flow of the Colorado River, not by a global flood several thousand years ago as some people still believe.

He describes that hike in detail in “The Rocks Don’t Lie.” He also explores changes in the understanding of where fossils came from, how geologists read Earth history in layers of rock, and the writings of geologists and religious authorities through the centuries.

Montgomery hopes the book might increase science literacy. He noted that a 2001 National Science Foundation survey found that more than half of American adults didn’t realize that dinosaurs were extinct long before humans came along.

But he also would like to coax readers to make sense of the world through both what they believe and through what they can see for themselves, and to keep an open mind to new ideas.

“If you think you know everything, you’ll never learn anything,” he said.