Tag archive: science

‘We cannot play God with changes to the human genome,’ warns UN (UN)

Updated on October 7, 2015

Modifying the genetic code makes it possible to treat diseases such as cancer, but it can also produce hereditary changes. UNESCO is calling for clear regulation of these scientific procedures and for the public to be kept informed.

Photo: Flickr/ynse

“Gene therapy could be a watershed in the history of medicine, and genome editing is unquestionably one of the greatest undertakings of science in the name of humanity,” said the United Nations Educational, Scientific and Cultural Organization (UNESCO) about a report published by the International Bioethics Committee (IBC) on Monday (5).

The IBC added, however, that interventions in the human genome should be authorized only for preventive, diagnostic or therapeutic purposes that do not produce alterations inherited by descendants. The report also highlights the importance of regulation and of clear information for consumers.

The document highlighted advances in genetic testing for hereditary diseases through gene therapy, the use of embryonic stem cells in medical research, and the use of cloning and genetic alterations for medical purposes. It also cites new techniques that can insert, remove and correct DNA, with the potential to treat or cure cancer and other diseases. These same techniques, however, also make it possible to change DNA in other ways, such as determining a baby’s eye color.

“The great fear is that we may be trying to ‘play God,’ with unforeseeable consequences, and in the end precipitating our own destruction,” warned former UN Secretary-General Kofi Annan in 2004, when asked what ethical line should set the limit on alterations to the human genome. To answer that question, UNESCO’s member states adopted the Universal Declaration on Bioethics and Human Rights in 2005, which deals with the ethical dilemmas raised by rapid change in medicine, science and technology.

Study on port workers questioned by Santos City Council (Diário do Litoral)

Survey points to drug use in the category. Council members intended to draft a motion of support for the workers

Staff report

Updated on September 22, 2015, at 11:45 a.m.

Two presidents of unions linked to the Port of Santos protested at yesterday’s City Council session against the release of a survey by the Federal University of São Paulo (Unifesp) pointing to narcotics use and alcohol consumption among the port’s casual dockworkers.

The criticism came from the president of the Port Workers’ Union (Sintraport), Claudiomiro Machado, known as Miro, and the president of the Stevedores’ Union, Rodnei Oliveira, known as Nei da Estiva. They said the university’s survey had wounded the honor of the “port family.”

Nearly all the council members backed the union leaders and questioned the method by which the survey was conducted. The survey reportedly found that 25% of the casual workers use crack or cocaine and that 80% drink alcohol.

Miro questioned, for example, where the survey was carried out. “Inside the port, access is restricted to workers. They did not go in there to interview port workers.”

The Sintraport president recounted the ordeal of one member whose son was questioned at school about his father’s profession. “They said to the boy: your father is a port worker? Then he uses cocaine, he uses crack.”

Nei da Estiva, for his part, said he was outraged that no union had been approached to comment on the survey’s findings.

Council members intended to draft a motion of support for the workers (Photo: Matheus Tagé/DL)

Council member Antônio Carlos Banha Joaquim (PMDB) recalled that Unifesp had already been the subject of an inquiry commission opened in the chamber, which examined the university’s contracts with City Hall. “Scientific work has to be done with proper methodology,” he said.

Banha also said he felt personally affected by the survey’s results. “My grandfather was a port worker. He must be rolling over in his coffin,” he said, before suggesting that Unifesp be taken to court over the survey.

For council member Benedito Furtado (PSB), the survey’s results “suggest that 80% of port workers are alcoholics.” He also attacked the university fiercely. “This so-called Unifesp doesn’t even comply with municipal law.”

Stressing that he is the son of a stevedore, Geonísio Pereira de Aguiar, known as Boquinha (PSDB), questioned the seriousness of the survey and said that nearly all of the institution’s students are not from Santos and therefore must know little about the docks.

Igor Martins de Melo, known as Professor Igor (PSB), was another to note that researchers need authorization to enter the port area. “So you mean the largest port in Latin America is run by a bunch of irresponsible people? What is this?”

A council member and mathematics teacher, José Lascane (PSDB) said that extreme care must be taken with a survey like the one Unifesp conducted. “The sample needs to be properly evaluated, as does the wording of the question, which needs to be very clear.”

Demanding a position

Marcelo Del Bosco (PPS) offered a suggestion to the government’s leader in the council, Sadao Nakai (PSDB): the municipal secretary of Port and Maritime Affairs, José Eduardo Lopes, should take a position on the survey.

Roberto Oliveira Teixeira, known as Pastor Roberto (PMDB), said the port workers’ wives “felt humiliated by the results of this survey.”

Exxon’s Own Research Confirmed Fossil Fuels’ Role in Global Warming Decades Ago (Inside Climate News)

Top executives were warned of possible catastrophe from greenhouse effect, then led efforts to block solutions.

By Neela Banerjee, Lisa Song and David Hasemyer

Sep 16, 2015

Exxon Experiment

Exxon’s Richard Werthamer (right) and Edward Garvey (left) are aboard the company’s Esso Atlantic tanker working on a project to measure the carbon dioxide levels in the ocean and atmosphere. The project ran from 1979 to 1982. (Credit: Richard Werthamer)

“In the first place, there is general scientific agreement that the most likely manner in which mankind is influencing the global climate is through carbon dioxide release from the burning of fossil fuels,” Black told Exxon’s Management Committee, according to a written version he recorded later.

It was July 1977 when Exxon’s leaders received this blunt assessment, well before most of the world had heard of the looming climate crisis.

A year later, Black, a top technical expert in Exxon’s Research & Engineering division, took an updated version of his presentation to a broader audience. He warned Exxon scientists and managers that independent researchers estimated a doubling of the carbon dioxide (CO2) concentration in the atmosphere would increase average global temperatures by 2 to 3 degrees Celsius (4 to 5 degrees Fahrenheit), and as much as 10 degrees Celsius (18 degrees Fahrenheit) at the poles.  Rainfall might get heavier in some regions, and other places might turn to desert.

“Some countries would benefit but others would have their agricultural output reduced or destroyed,” Black said, in the written summary of his 1978 talk.

His presentations reflected uncertainty running through scientific circles about the details of climate change, such as the role the oceans played in absorbing emissions. Still, Black estimated quick action was needed. “Present thinking,” he wrote in the 1978 summary, “holds that man has a time window of five to ten years before the need for hard decisions regarding changes in energy strategies might become critical.”

Exxon responded swiftly. Within months the company launched its own extraordinary research into carbon dioxide from fossil fuels and its impact on the earth. Exxon’s ambitious program included both empirical CO2 sampling and rigorous climate modeling. It assembled a brain trust that would spend more than a decade deepening the company’s understanding of an environmental problem that posed an existential threat to the oil business.

Then, toward the end of the 1980s, Exxon curtailed its carbon dioxide research. In the decades that followed, Exxon worked instead at the forefront of climate denial. It put its muscle behind efforts to manufacture doubt about the reality of global warming its own scientists had once confirmed. It lobbied to block federal and international action to control greenhouse gas emissions. It helped to erect a vast edifice of misinformation that stands to this day.

This untold chapter in Exxon’s history, when one of the world’s largest energy companies worked to understand the damage caused by fossil fuels, stems from an eight-month investigation by InsideClimate News. ICN’s reporters interviewed former Exxon employees, scientists, and federal officials, and consulted hundreds of pages of internal Exxon documents, many of them written between 1977 and 1986, during the heyday of Exxon’s innovative climate research program. ICN combed through thousands of documents from archives including those held at the University of Texas-Austin, the Massachusetts Institute of Technology and the American Association for the Advancement of Science.

The documents record budget requests, research priorities, and debates over findings, and reveal the arc of Exxon’s internal attitudes and work on climate and how much attention the results received.

Of particular significance was a project launched in August 1979, when the company outfitted a supertanker with custom-made instruments. The project’s mission was to sample carbon dioxide in the air and ocean along a route from the Gulf of Mexico to the Persian Gulf.

In 1980, Exxon assembled a team of climate modelers who investigated fundamental questions about the climate’s sensitivity to the buildup  of carbon dioxide in the air. Working with university scientists and the U.S. Department of Energy, Exxon strove to be on the cutting edge of inquiry into what was then called the greenhouse effect.

Exxon’s early determination to understand rising carbon dioxide levels grew out of a corporate culture of farsightedness, former employees said. They described a company that continuously examined risks to its bottom line, including environmental factors. In the 1970s, Exxon modeled its research division after Bell Labs, staffing it with highly accomplished scientists and engineers.

In written responses to questions about the history of its research, ExxonMobil spokesman Richard D. Keil said that “from the time that climate change first emerged as a topic for scientific study and analysis in the late 1970s, ExxonMobil has committed itself to scientific, fact-based analysis of this important issue.”

“At all times,” he said, “the opinions and conclusions of our scientists and researchers on this topic have been solidly within the mainstream of the consensus scientific opinion of the day and our work has been guided by an overarching principle to follow where the science leads. The risk of climate change is real and warrants action.”

At the outset of its climate investigations almost four decades ago, many Exxon executives, middle managers and scientists armed themselves with a sense of urgency and mission.

One manager at Exxon Research, Harold N. Weinberg, shared his “grandiose thoughts” about Exxon’s potential role in climate research in a March 1978 internal company memorandum that read: “This may be the kind of opportunity that we are looking for to have Exxon technology, management and leadership resources put into the context of a project aimed at benefitting mankind.”

His sentiment was echoed by Henry Shaw, the scientist leading the company’s nascent carbon dioxide research effort.

“Exxon must develop a credible scientific team that can critically evaluate the information generated on the subject and be able to carry bad news, if any, to the corporation,” Shaw wrote to his boss Edward E. David, the executive director of Exxon Research and Engineering in 1978. “This team must be recognized for its excellence in the scientific community, the government, and internally by Exxon management.”

Irreversible and Catastrophic

Exxon budgeted more than $1 million over three years for the tanker project to measure how quickly the oceans were taking in CO2. It was a small fraction of Exxon Research’s annual $300 million budget, but the question the scientists tackled was one of the biggest uncertainties in climate science: how quickly could the deep oceans absorb atmospheric CO2? If Exxon could pinpoint the answer, it would know how long it had before CO2 accumulation in the atmosphere could force a transition away from fossil fuels.

Exxon also hired scientists and mathematicians to develop better climate models and publish research results in peer-reviewed journals. By 1982, the company’s own scientists, collaborating with outside researchers, created rigorous climate models – computer programs that simulate the workings of the climate to assess the impact of emissions on global temperatures. They confirmed an emerging scientific consensus that warming could be even worse than Black had warned five years earlier.

Esso Atlantic

Between 1979 and 1982, Exxon researchers sampled carbon dioxide levels aboard the company’s Esso Atlantic tanker (shown here).

Exxon’s research laid the groundwork for a 1982 corporate primer on carbon dioxide and climate change prepared by its environmental affairs office. Marked “not to be distributed externally,” it contained information that “has been given wide circulation to Exxon management.” In it, the company recognized, despite the many lingering unknowns, that heading off global warming “would require major reductions in fossil fuel combustion.”

Unless that happened, “there are some potentially catastrophic events that must be considered,” the primer said, citing independent experts. “Once the effects are measurable, they might not be reversible.”

The Certainty of Uncertainty

Like others in the scientific community, Exxon researchers acknowledged the uncertainties surrounding many aspects of climate science, especially in the area of forecasting models. But they saw those uncertainties as questions they wanted to address, not an excuse to dismiss what was increasingly understood.

“Models are controversial,” Roger Cohen, head of theoretical sciences at Exxon Corporate Research Laboratories, and his colleague, Richard Werthamer, senior technology advisor at Exxon Corporation, wrote in a May 1980 status report on Exxon’s climate modeling program. “Therefore, there are research opportunities for us.”

When Exxon’s researchers confirmed information the company might find troubling, they did not sweep it under the rug.

“Over the past several years a clear scientific consensus has emerged,” Cohen wrote in September 1982, reporting on Exxon’s own analysis of climate models. It was that a doubling of the carbon dioxide blanket in the atmosphere would produce average global warming of 3 degrees Celsius, plus or minus 1.5 degrees C (about 5.4 degrees Fahrenheit, plus or minus 2.7 degrees F).
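
As a quick unit check of the figures above (the Celsius values are Cohen’s; the Fahrenheit equivalents follow from the standard conversion for temperature differences, which uses the 9/5 factor but no 32-degree offset):

```latex
\Delta T_{\mathrm{F}} = \tfrac{9}{5}\,\Delta T_{\mathrm{C}}
\qquad\Longrightarrow\qquad
3\,^{\circ}\mathrm{C} \approx 5.4\,^{\circ}\mathrm{F},
\quad
1.5\,^{\circ}\mathrm{C} \approx 2.7\,^{\circ}\mathrm{F}.
```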

“There is unanimous agreement in the scientific community that a temperature increase of this magnitude would bring about significant changes in the earth’s climate,” he wrote, “including rainfall distribution and alterations in the biosphere.”

He warned that publication of the company’s conclusions might attract media attention because of the “connection between Exxon’s major business and the role of fossil fuel combustion in contributing to the increase of atmospheric CO2.”

Nevertheless, he recommended publication.

Our “ethical responsibility is to permit the publication of our research in the scientific literature,” Cohen wrote. “Indeed, to do otherwise would be a breach of Exxon’s public position and ethical credo on honesty and integrity.”

Exxon followed his advice. Between 1983 and 1984, its researchers published their results in at least three peer-reviewed papers in the Journal of the Atmospheric Sciences and an American Geophysical Union monograph.

David, the head of Exxon Research, told a global warming conference financed by Exxon in October 1982 that “few people doubt that the world has entered an energy transition away from dependence upon fossil fuels and toward some mix of renewable resources that will not pose problems of CO2 accumulation.” The only question, he said, was how fast this would happen.

But the challenge did not daunt him. “I’m generally upbeat about the chances of coming through this most adventurous of all human experiments with the ecosystem,” David said.

Exxon considered itself unique among corporations for its carbon dioxide and climate research.  The company boasted in a January 1981 report, “Scoping Study on CO2,” that no other company appeared to be conducting similar in-house research into carbon dioxide, and it swiftly gained a reputation among outsiders for genuine expertise.

“We are very pleased with Exxon’s research intentions related to the CO2 question. This represents very responsible action, which we hope will serve as a model for research contributions from the corporate sector,” said David Slade, manager of the federal government’s carbon dioxide research program at the Energy Department, in a May 1979 letter to Shaw. “This is truly a national and international service.”

Business Imperatives

In the early 1980s Exxon researchers often repeated that unbiased science would give it legitimacy in helping shape climate-related laws that would affect its profitability.

Still, corporate executives remained cautious about what they told Exxon’s shareholders about global warming and the role petroleum played in causing it, a review of federal filings shows. The company did not elaborate on the carbon problem in annual reports filed with securities regulators during the height of its CO2 research.

Nor did it mention in those filings that concern over CO2 was beginning to influence business decisions it was facing.

Throughout the 1980s, the company was worried about developing an enormous gas field off the coast of Indonesia because of the vast amount of CO2 the unusual reservoir would release.

Exxon was also concerned about reports that synthetic oil made from coal, tar sands and oil shales could significantly boost CO2 emissions. The company was banking on synfuels to meet growing demand for energy in the future, in a world it believed was running out of conventional oil.

In the mid-1980s, after an unexpected oil glut caused prices to collapse, Exxon cut its staff deeply to save money, including many working on climate. But the climate change problem remained, and it was becoming a more prominent part of the political landscape.

“Global Warming Has Begun, Expert Tells Senate,” declared the headline of a June 1988 New York Times article describing the Congressional testimony of NASA’s James Hansen, a leading climate expert. Hansen’s statements compelled Sen. Tim Wirth (D-Colo.) to declare during the hearing that “Congress must begin to consider how we are going to slow or halt that warming trend.”

With alarm bells suddenly ringing, Exxon started financing efforts to amplify doubt about the state of climate science.

Exxon helped to found and lead the Global Climate Coalition, an alliance of some of the world’s largest companies seeking to halt government efforts to curb fossil fuel emissions. Exxon used the American Petroleum Institute, right-wing think tanks, campaign contributions and its own lobbying to push a narrative that climate science was too uncertain to necessitate cuts in fossil fuel emissions.

As the international community moved in 1997 to take a first step in curbing emissions with the Kyoto Protocol, Exxon’s chairman and CEO Lee Raymond argued to stop it.

“Let’s agree there’s a lot we really don’t know about how climate will change in the 21st century and beyond,” Raymond said in his speech before the World Petroleum Congress in Beijing in October 1997.

“We need to understand the issue better, and fortunately, we have time,” he said. “It is highly unlikely that the temperature in the middle of the next century will be significantly affected whether policies are enacted now or 20 years from now.”

Over the years, several Exxon scientists who had confirmed the climate consensus during its early research, including Cohen and David, took Raymond’s side, publishing views that ran contrary to the scientific mainstream.

Paying the Price

Exxon’s about-face on climate change earned the scorn of the scientific establishment it had once courted.

In 2006, the Royal Society, the United Kingdom’s science academy, sent a harsh letter to Exxon accusing it of being “inaccurate and misleading” on the question of climate uncertainty. Bob Ward, the Academy’s senior manager for policy communication, demanded that Exxon stop giving money to dozens of organizations he said were actively distorting the science.

In 2008, under mounting pressure from activist shareholders, the company announced it would end support for some prominent groups such as those Ward had identified.

Still, the millions of dollars Exxon had spent since the 1990s on climate change deniers had long surpassed what it had once invested in its path-breaking climate science aboard the Esso Atlantic.

“They spent so much money and they were the only company that did this kind of research as far as I know,” Edward Garvey, who was a key researcher on Exxon’s oil tanker project, said in a recent interview with InsideClimate News and Frontline. “That was an opportunity not just to get a place at the table, but to lead, in many respects, some of the discussion. And the fact that they chose not to do that into the future is a sad point.”

Michael Mann, director of the Earth System Science Center at Pennsylvania State University, who has been a frequent target of climate deniers, said that inaction, just like action, has consequences. When he recently spoke to InsideClimate News, he was unaware of this chapter in Exxon’s history.

“All it would’ve taken is for one prominent fossil fuel CEO to know this was about more than just shareholder profits, and a question about our legacy,” he said. “But now because of the cost of inaction—what I call the ‘procrastination penalty’—we face a far more uphill battle.”

Part II, coming on September 17, will further examine Exxon’s early climate research.

ICN staff members Zahra Hirji, Paul Horn, Naveena Sadasivam, Sabrina Shankman and Alexander Wood also contributed to this report.

The Widening World of Hand-Picked Truths (New York Times)

Nearly half a century ago, in what passed as outrage in pre-Internet times, people across the country became incensed by the latest edition of Time magazine. In place of the familiar portrait of a world leader — Indira Gandhi, Lyndon B. Johnson, Ho Chi Minh — the cover of the April 8, 1966, issue was emblazoned with three red words against a stark black background: “Is God Dead?”

Thousands of people sent letters of protest to Time and to their local newspapers. Ministers denounced the magazine in their sermons.

The subject of the fury — a sprawling, 6,000-word essay of the kind Time was known for — was not, as many assumed, a denunciation of religion. Drawing on a panoply of philosophers and theologians, Time’s religion editor calmly considered how society was adapting to the diminishing role of religion in an age of secularization, urbanism and, especially, stunning advances in science.

With astronauts walking in space, and polio and other infectious diseases seemingly on the way to oblivion, it was natural to assume that people would increasingly stop believing things just because they had always believed them. Faith would steadily give way to the scientific method as humanity converged on an ever better understanding of what was real.

Almost 50 years later, that dream seems to be coming apart. Some of the opposition is on familiar grounds: The creationist battle against evolution remains fierce, and more sophisticated than ever. But it’s not just organized religions that are insisting on their own alternate truths. On one front after another, the hard-won consensus of science is also expected to accommodate personal beliefs, religious or otherwise, about the safety of vaccines, G.M.O. crops, fluoridation or cellphone radio waves, along with the validity of global climate change.

Like creationists with their “intelligent design,” the followers of these causes come armed with their own personal science, assembled through Internet searches that inevitably turn up the contortions of special interest groups. In an attempt to dilute the wisdom of the crowd, Google recently tweaked its algorithm so that searching for “vaccination” or “fluoridation,” for example, brings vetted medical information to the top of the results.

But presenting people with the best available science doesn’t seem to change many minds. In a kind of psychological immune response, they reject ideas they consider harmful. A study published this month in the Proceedings of the National Academy of Sciences suggested that it is more effective to appeal to anti-vaxxers through their emotions, with stories and pictures of children sick with measles, the mumps or rubella — a reminder that subjective feelings are still trusted over scientific expertise.

On a deeper level, characteristics that once seemed biologically determined are increasingly challenged as malleable social constructs. As she resigned from her post this summer, an N.A.A.C.P. local leader continued to insist she was black although she was born white. Facebook now offers users a list of 56 genders to choose from. Transgender sits on the list, along with its opposite, cisgender — meaning that, like most people, you identify yourself as male or female according to the way the cells of your embryo unfolded in the womb.

Even conditions once certified as pathologies are redefined. While some parents cling to discredited research blaming vaccines for giving children autism, others embrace the condition as one more way of being and speak of a new civil rights movement promoting “neurodiversity,” the subject of a book by Steve Silberman, published this month.

While this has been a welcome and humane development for those diagnosed as “higher functioning” on the autism scale, parents of severely impaired children have expressed dismay.

Viewed from afar, the world seems almost on the brink of conceding that there are no truths, only competing ideologies — narratives fighting narratives. In this epistemological warfare, those with the most power are accused of imposing their version of reality — the “dominant paradigm” — on the rest, leaving the weaker to fight back with formulations of their own. Everything becomes a version.

Ideas like these have been playing out in the background as native Hawaiian protesters continue to delay the construction of a new telescope on Mauna Kea that they say would desecrate a mountaintop where the Sky Father and Earth Mother gave birth to humankind. Last month, they staged a demonstration at the annual meeting of the International Astronomical Union in Honolulu.

There are already 13 telescopes on the mountain, all part of the Mauna Kea Science Reserve, which was established by the state in 1968 on what is widely considered the premier astronomical vantage point in the Northern Hemisphere. After I wrote about the controversy last fall, I heard from young anthropologists, speaking the language of postmodernism, who consider science to be just another tool with which Western colonialism further extends its “cultural hegemony” by marginalizing the dispossessed and privileging its own worldview.

Science, through this lens, doesn’t discover knowledge, it “manufactures” it, along with other marketable goods.

Altruism and compassion toward the feelings of others represent the best of human impulses. And it is good to continually challenge rigid categories and entrenched beliefs. But that comes at a sacrifice when the subjective is elevated over the assumption that lurking out there is some kind of real world.

The widening gyre of beliefs is accelerated by the otherwise liberating Internet. At the same time it expands the reach of every mind, it channels debate into clashing memes, often no longer than 140 characters, that force people to extremes and trap them in self-reinforcing bubbles of thought.

In the end, you’re left to wonder whether you are trapped in a bubble, too, a pawn and a promoter of a “hegemonic paradigm” called science, seduced by your own delusions.

How climate change deniers got it right — but very wrong (MSNBC)

06/16/15 08:30 PM

By Tony Dokoupil

It turns out the climate change deniers were right: There isn’t 97% agreement among climate scientists. The real figure? It’s not lower, but actually higher.

The scientific “consensus” on climate change has gotten stronger, surging past the famous — and controversial — figure of 97% to more than 99.9%, according to a new study reviewed by msnbc.

James L. Powell, director of the National Physical Sciences Consortium, reviewed more than 24,000 peer-reviewed papers on global warming published in 2013 and 2014. Only five reject the reality of rising temperatures or the fact that human emissions are the cause, he found.

“It’s now a ruling paradigm, as much an accepted fact in climate science as plate tectonics is in geology and evolution is in biology,” he told msnbc. “It’s 99.9% plus.”

Powell, a member of the National Science Board under Presidents Ronald Reagan and George H.W. Bush, decided to share an exclusive draft of his research on Tuesday — just days before Pope Francis is set to deliver a major address on climate change — because he doesn’t want his holiness to reference outdated numbers.

“I don’t want the Pope to say 97%,” Powell said by phone, arguing that accuracy now is more important than ever. “It’s wrong, and it’s not trivial.”

Pope Francis is preparing to charge into the political debate over climate change, citing “a very consistent scientific consensus” and the risk of “unprecedented destruction,” according to a leaked draft of Thursday’s papal encyclical.

The notion of 97% agreement among climate scientists started with studies in 2009 and 2010. It wasn’t until a 2013 study, however, that the figure went viral. President Barack Obama tweeted it. The comedian John Oliver set up a slapstick debate between a climate change denier and 97 of his peers.

But Powell argues that acceptance of man-made global warming has grown. The author of a new Columbia University Press book on scientific revolutions used an online database to compile a mountain of global warming papers published in the last two years.

He also tried a different approach than the earlier studies. Rather than search for explicit acceptance of anthropogenic global warming, Powell searched for explicit rejection. All the papers in the middle, he figured, weren’t neutral on the subject — they were settled on it.

The results include work from nearly the entire population of working climate scientists — close to 70,000 scientists, often sharing their byline with three or four other authors. They also include a dwindling opposition: Powell could find only four solitary authors who challenged the evidence for human-caused global warming.

That’s a rate of one dissenting voice for every 17,000 agreeing scientists, and it’s not a strong voice. Powell called the four dissents “known deniers and crackpots,” and noted that their work had been cited only once by the wider academic community.
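
As a rough sanity check of the figures quoted above, a few lines of Python reproduce the headline numbers from the article’s own round counts (these are the article’s figures, not an independent tally of Powell’s data):

```python
# Figures as reported in the article, not an independent tally.
papers_reviewed = 24_000      # peer-reviewed global warming papers, 2013-2014 ("more than 24,000")
papers_rejecting = 5          # papers rejecting human-caused warming
authors_total = 70_000        # "close to 70,000" authoring scientists
authors_dissenting = 4        # solitary dissenting authors Powell identified

consensus_by_paper = 100 * (papers_reviewed - papers_rejecting) / papers_reviewed
agreeing_per_dissenter = (authors_total - authors_dissenting) / authors_dissenting

print(f"Consensus by paper: {consensus_by_paper:.2f}%")                          # ~99.98%, i.e. "99.9% plus"
print(f"Agreeing authors per dissenting author: {agreeing_per_dissenter:,.0f}")  # roughly the 1-in-17,000 ratio
```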

Naomi Oreskes, a professor of the history of science at Harvard, hasn’t read the Powell paper but she doesn’t doubt the general direction of the findings. Back in 2004, she became the first researcher to claim a “consensus” on climate change, finding a roughly 75% agreement within the literature.

“Scientists have done so much more work since then,” she said. “For me, as a historian of science, it really feels like overkill. One starts to think, how many more times do we need to say this before we really get it and start to act on it?”

One reason for inaction of course is politics. Many of the world’s leaders still doubt the science of climate change, assuming incorrectly that it’s unsettled or exploratory. The view is especially prevalent among the current crop of Republican presidential candidates.

Earlier this month, for example, former Pennsylvania Sen. Rick Santorum told Fox News that the pope would be “better off leaving science to the scientists.” Jeb Bush, Marco Rubio and Ted Cruz, meanwhile, claim that the science remains vague or is made up entirely.

That raises a second reason for inaction, according to Oreskes: intentional deception. Oreskes is the co-author of the “Merchants of Doubt,” a book that demonstrated how interest groups had undermined the science on tobacco, ozone depletion, acid rain and now climate change.

Many self-proclaimed “climate skeptics” no longer deny that the globe is warming, and some even acknowledge a human role in the new heat wave. Instead, they now say, warming is real — it just isn’t dangerous. They also attack the idea of a consensus, whatever the percentage.

“Nothing has really changed there,” said Oreskes. “The details shift but the overall picture remains the same. It’s a bit like Monet’s water lilies; it can look different at different times of day but it’s the same picture.”

Powell, however, hopes his work can finally close the debate, end the notion of doubt, move the frame ahead.

“There isn’t any evidence against global warming and there isn’t any alternative theory,” he said. “We’ve been looking for negative feedbacks and we’ve never found one that amounts to anything. It’s not impossible that we will, but I wouldn’t bet my grandchildren’s future on it.”

Entertaining Science: A report from a colloquy at the intersection of science and entertainment (CASTAC)

June 9th, 2015

As you read this post, members of a community of like-minded scholars are unwinding after a weekend symposium at the UK’s University of Manchester. The symposium Stories About Science—Exploring Science Communication and Entertainment Media explored the intersections of science with entertainment from various disciplinary perspectives and as experienced by a diverse range of publics. Organized through the University of Manchester’s Centre for the History of Science, Technology and Medicine (CHSTM), the SAS symposium was the brainchild of the Playing God Project of CHSTM’s Science and Entertainment Laboratory research group.

So what, you may ask, does any of this have to do with CASTAC? Well, as an anthropologist invested in exploring ethnographically the cultural qualities of humanity’s intersections with science, I was interested in efforts by the symposium’s presenters, not unlike CASTAC’s own, to understand significant cultural aspects of science in contemporary society. Perhaps more intriguingly, I saw it as a potential opportunity to further our goal of fostering discussions between anthropologists and other STS scholars. To that end, I contacted several SAS symposiasts to get a sense of what they presented at SAS. Colloquy topics ran from explorations of gender for fictional television scientists to the ways legitimate scientists are presented in the media to the power of comics in science communication.

The Presentations

Among the research presented was a paper by Rashel Li of the Australian National University, who reported on her focus group studies of the ways in which gender balance (or imbalance) has been portrayed in science-themed film and television. Viewing representations of gender through the lens of the American sitcom The Big Bang Theory, Li’s work focuses on that show’s discipline-based gender distribution of men in physics and women in biology and its attempts to portray the characters as equally capable in their respective science fields. Li looks specifically at the ways female scientists are portrayed in the show by collecting feedback on the show’s representations of gender from adult focus groups. The principal responses of these focus groups ranged from being annoyed by how The Big Bang Theory followed gender-based stereotypes of men in physics and women in biology, to being unaware of the imbalanced gender distribution, to thinking that the show reflected reality and helped humanize science. Perhaps not surprisingly, none of the focus group participants endorsed an imbalanced characterization of scientific capability.

Christopher Herzog (University of Salzburg, Austria) explored in his paper the phenomenon of contemporary neuroscience plays, which highlight the theatre’s unique potential of renegotiating the mind-body problem by having to represent the mental phenomenologically via bodies on stage. Often combined with neuroscientific visualizations of the brain via screen projections, these are theatrical performances in which “deviant minds,” brain pathologies (e.g., anterograde amnesia), and mental illness (e.g., depression) are presented to the public. Arguing against presumed educational or informative functions of science plays and for an epistemically more nuanced understanding of the genre, Herzog contends the plays do not impart scientific ‘facts.’ Ultimately, according to Herzog, “Neuroscience plays are a form of meta-visualization, illustrating how theatre can critically alert us to tendencies in our contemporary culture, and specifically how the forms of presentation (e.g., the dissemination of brain images in mass media) and received-neuroscientific-facts often result in anthropological and social categories of normalcy through recourse to the authority of science.”

Declan Fahy (American University) presented the argument that astrophysicist and public figure Neil deGrasse Tyson illuminates and embodies the enhanced power of scientific celebrity. Using a cultural-historical analysis of Tyson’s decades-long public career to demonstrate how he became a scientific star, Fahy argued that Tyson’s fame rests on how he came to symbolize three wider historical movements in post-1960s U.S culture: the rise of the African-American public intellectual, the endeavors to enhance scientific literacy, and the drive to reignite space exploration. Fahy described how Tyson’s star status has earned him social power to spread scientific ideas through wider culture, granted him influence over science policy, the US space program, and astronomical research, and created, as a consequence of his celebrity, a potent form of scientific authority in popular culture. The argument is one Fahy examines in depth in his book, The New Celebrity Scientists: Out of the Lab and Into the Limelight (2015).

In a similar vein, Benjamin Gross, of the Chemical Heritage Foundation (CHF) in Philadelphia, used Neil deGrasse Tyson’s Cosmos: A Spacetime Odyssey as a starting point for his presentation on how a group of scholars and communications personnel at the CHF successfully organized #CosmosChat, a weekly Twitter conversation in 2014 examining Cosmos’s presentation of science and history. Mobilizing CHF’s library and artifact collections, as well as the expertise of in-house research fellows, the tweets critiqued Cosmos and supplemented each episode’s weekly content. Gross discussed the substantive themes that emerged during the course of these critiques and evaluated the potential applicability of the #CosmosChat model to other communication opportunities that lie at the intersection of science and entertainment.

Drawing on two projects aimed at communicating science to young people, Science Comics and Cosmic Comics, Emma Weitkamp of the University of the West of England presented a paper on comics as science communication. Traditionally seen as purely entertainment media, comics have more recently been employed in science communication. Exploring comics as methods for situating science within our day-to-day activities, Weitkamp poses the question: Can a combination of science, humor, and narrative help to show how science is part of our everyday lives? Weitkamp posits that comic media have strong potential, both as learning aids and as creative ways to place science within society, as their fictional nature allows greater juxtaposition of the real and the imaginary, allowing authors, for example, to pose ‘what if’ questions to their readers, such as what if the world didn’t physically work the way it does?

The success of SAS was another success for STS. As David Kirby, the principal investigator in CHSTM’s Playing God Project, notes, “science and entertainment represent two of the most powerful cultural institutions that humans have developed to understand and explain their world.” If this is true, the scholars aligned with SAS seem poised to offer some intriguing and potentially synergistic research that could align well with the work of CASTAC scholars.

Science Under Siege (CBC)

Paul Kennedy

Wednesday June 03, 2015

http://www.cbc.ca/radio/ideas/science-under-siege-part-1-1.3091552

Are we living through an Anti-Scientific Revolution? Scientists around the world are increasingly restricted in what they can research, publish and say — constrained by belief and ideology from all sides.  Historically, science has always had a thorny relationship with institutions of power. But what happens to societies which turn their backs on curiosity-driven research? And how can science lift the siege?  CBC Radio producer Mary Lynk looks for some answers in this three-part series.

Science Under Siege, Part 1:  Dangers of Ignorance – airs Wednesday, June 3
Explores the historical tension between science and political power and the sometimes fraught relationship between the two over the centuries. But what happens when science gets sidelined? What happens to societies which turn their backs on curiosity-driven research?

Science Under Siege, Part 2: The Great Divide – airs Thursday, June 4
Explores the state of science in the modern world, and the expanding — and dangerous — gulf between scientists and the rest of society.  Many policy makers, politicians and members of the public are giving belief and ideology the same standing as scientific evidence. Are we now seeing an Anti-Scientific revolution?  A look at how evidence-based decision making has been sidelined.

Science Under Siege, Part 3: Fighting Back – airs Friday, June 5
Focuses on the culture war being waged on science, and possible solutions for reintegrating science and society. The attack on science is coming from all sides, both the left and right of the political spectrum. How can the principle of direct observation of the world, free of corporate or any other influence, reassert itself? The final episode of this series looks at how science can withstand the attack against it and overcome ideology and belief.

The Ant, the Shaman and the Scientist: Shamanic lore spurs scientific discovery in the Amazon (Notes from the Ethnoground)

NOVEMBER 22, 2011

When he pointed to the tree trunk and said the scars were from fires set by invisible forest spirits, I had no idea this supernatural observation would lead to a new discovery for natural science. Mariano, the eldest shaman of the Matsigenka village of Yomybato in Manu National Park, Peru, had first shown me the curious clearings in the forest that form around clumps of Cordia nodosa, a bristly tropical shrub related to borage (Borago officinalis). Both the Matsigenka people and tropical ecologists recognize the special relationship that exists between Cordia and ants of the genus Myrmelachista: the Matsigenka word for the plant is matiagiroki, which means “ant shrub.”

Maximo Vicente, Mariano’s grandson, standing by a swollen, scarred trunk near a Cordia patch.

For scientists, the clearings in the forest understory around patches of Cordia are caused by a mutualistic relationship with the ants.  Cordia plants provide the ant colony with hollow branch nodes for nesting and bristly corridors along twigs and leaves for protection, while the ants use their strong mandibles and acidic secretions to clear away competing vegetation.  Local Quechua-speaking colonists refer to the clearings as “Devil’s gardens” (supay chacra).  For the Matsigenka, these clearings are the work of spirits known as Sangariite, which means ‘Pure’ or ‘Invisible Ones’.  Matsigenka shamans like Mariano come to these spirit clearings and consume powerful narcotics and hallucinogens such as tobacco paste, ayahuasca (Banisteriopsis), or the Datura-like toé (Brugmansia).[1]

A “Sangariite village clearing” (igarapagite sangatsiri) in the upland forests of Manu Park.

With the aid of visionary plants, the shaman perceives the true nature of these mundane forest clearings: they are the villages of Sangariite spirits, unimaginably distant and inaccessible under ordinary states of consciousness.  While in trance, the shaman enters the village and develops an ongoing relationship with a spirit twin or ally among the Sangariite, who can provide him or her with esoteric knowledge, news from distant places, healing power, artistic inspiration, auspicious hunting and even novel varieties of food crops or medicinal plants.[2]  As proof of the existence of these invisible villages, Mariano pointed out to me the scars on adjacent tree trunks all around large, dense Cordia patches: “The scars are caused by fires the Sangariite set to clear their gardens every summer,” he explained.

Mariano wearing a cotton tunic with designs taught him by the Sangariite spirits during an ayahuasca trance.

Douglas Yu, an expert on ant-plant interactions, was researching Cordia populations in the forests around Yomybato.[3]  I told him of Mariano’s observations about the Sangariite villages, and pointed out the distinctive marks on adjacent trees.  In his years of research, Yu had never noticed the trunk scars.  Intrigued, he cut into the scars and found nests teeming with Myrmelachista ants that appeared to be galling the trunks to create additional housing.  As detailed in a 2009 publication in American Naturalist[4], this case is the first recorded example of ants galling plants, reopening a century-old debate in tropical ecology begun by legendary scientists Richard Spruce and Alfred Wallace. The discovery of Myrmelachista‘s galling capability also helped Yu understand how this ant species persists in the face of competition by two more aggressive ant types, Azteca and Allomerus, that can also inhabit Cordia depending on ecological conditions.

Douglas Yu carries out research on ant-plant interactions in the Peruvian Amazon.

My ongoing collaborations with Yu and other tropical biologists in indigenous communities have highlighted how important it is to pay attention to local people’s rich and often underappreciated knowledge about forest ecosystems: sometimes even those elements of folklore that appear quaint or “unscientific” contain astute insights about natural processes.

Cross section of a tree trunk galled by Myrmelachista ants (photo: Megan Frederickson).

— This article was first published online on Nov. 7, 2011 with Spanish and Portuguese translations by O Eco Amazônia.

References:

[1] G.H. Shepard Jr. (1998) Psychoactive plants and ethnopsychiatric medicines of the Matsigenka. Journal of Psychoactive Drugs 30 (4):321-332; G.H. Shepard Jr. (2005) Psychoactive botanicals in ritual, religion and shamanism. Chapter 18 in: E. Elisabetsky & N. Etkin (Eds.), Ethnopharmacology. Encyclopedia of Life Support Systems (EOLSS), Theme 6.79. Oxford, UK: UNESCO/Eolss Publishers [http://www.eolss.net].

[2] G.H. Shepard Jr. (1999) Shamanism and diversity:  A Matsigenka perspective. In Cultural and Spiritual Values of Biodiversity, edited by D. A. Posey. London: United Nations Environmental Programme and Intermediate Technology Publications.

[3] D.W. Yu, H. B. Wilson and N. E. Pierce (2001) An empirical model of species coexistence in a spatially structured environment. Ecology 82 (6):1761-1771.
[4] D.P. Edwards, M.E. Frederickson, G.H. Shepard Jr. and D.W. Yu (2009) ‘A plant needs its ants like a dog needs its fleas’: Myrmelachista schumanni ants gall many tree species to create housing. The American Naturalist 174 (5):734-740. [http://www.ncbi.nlm.nih.gov/pubmed/19799500]

Posted by Glenn H. Shepard at 10:11 AM

There never was a global warming ‘pause,’ NOAA study concludes (Environment & Energy Publishing)

Gayathri Vaidyanathan, E&E reporter

Published: Friday, June 5, 2015

The global warming “pause” does not exist, according to scientists at the National Oceanic and Atmospheric Administration.

Their finding refutes a theory that has dominated climate science in recent years. The Intergovernmental Panel on Climate Change (IPCC) in 2013 found that global temperatures in recent years have not risen as quickly as they did in the 20th century. That launched an academic hunt for the missing heat in the oceans, volcanoes and solar rays. Meanwhile, climate deniers triumphantly crowed that global warming has paused or gone on a “hiatus.”

But it now appears that the pause never was. NOAA scientists have fixed some small errors in global temperature data and found that temperatures over the past 15 years have been rising at a rate comparable to warming over the 20th century. The study was published yesterday in Science.

That a minor change to the analysis can switch the outcome from a hiatus to increased warming shows “how fragile a concept it [the hiatus] was in the first place,” said Gavin Schmidt, director of the NASA Goddard Institute for Space Studies, who was unaffiliated with the study.

According to the NOAA study, the world has warmed since 1998 by 0.11 degree Celsius per decade. Scientists had previously calculated that the trend was about half that.

The new rate is equal to the rate of warming seen between 1951 and 1999.

There has been no slowdown in the rate of global warming, said Thomas Karl, director of NOAA’s National Centers for Environmental Information and lead author of the study.

“Global warming is firmly entrenched on our planet, and it continues to progress and is likely to continue to do so in the future unless emissions of greenhouse gases are substantially altered,” he said.

Errors from weather stations, buoys and buckets

That NOAA has to adjust temperature readings is not unusual. Many factors can affect raw temperature measurements, according to a study by Karl in 1988.

For instance, a weather station may be situated beneath a tree, which would bias temperatures low. Measurements made near a parking lot would read warm due to the waves of heat emanating from asphalt surfaces. NOAA and other agencies adjust the raw temperature data to remove such biases.

It has become clear in recent years that some biases still persist in the data, particularly of ocean temperatures. The culprit: buckets.

Ships traverse the world, and, occasionally, workers onboard dip a bucket over the hull and bring up water that they measure using a thermometer. The method is old school and error prone — water in a bucket is usually cooler than the ocean.

For a long time, scientists had assumed that most ships no longer use buckets and instead measure water siphoned from the ocean to cool ship engines. The latter method is more robust. But data released last year showed otherwise and compelled NOAA to correct for this bias.

A second correction involved sensor-laden buoys interspersed across the oceans whose temperature readings are biased low. Karl and his colleagues corrected for this issue, as well.

The corrections “made a significant impact,” Karl said. “They added about 0.06 degrees C per decade additional warming since 2000.”
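
A minimal sketch of the kind of adjustment being described, using made-up annual anomalies rather than NOAA’s actual data: removing a cold bias that grows over the most recent part of a temperature series raises the least-squares trend fitted over that period.

```python
import numpy as np

# Illustrative annual global temperature anomalies in degrees C (synthetic, NOT NOAA data).
years = np.arange(1998, 2015)
raw_anomaly = 0.40 + 0.005 * (years - 1998)   # apparent trend of ~0.05 C per decade

# Hypothetical bias correction applied to measurements from 2000 onward,
# growing over time (e.g., as the biased instruments come to dominate the record).
correction = np.where(years >= 2000, 0.006 * (years - 2000), 0.0)
adjusted_anomaly = raw_anomaly + correction

def decadal_trend(x, y):
    """Least-squares slope of y against x, expressed per decade."""
    return 10 * np.polyfit(x, y, 1)[0]

print(f"Trend before correction: {decadal_trend(years, raw_anomaly):.2f} C/decade")
print(f"Trend after correction:  {decadal_trend(years, adjusted_anomaly):.2f} C/decade")
```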

The ‘slowdown hasn’t gone away’

What that means for the global warming hiatus depends on whom you ask. The warming trend over the past 15 years is comparable to the trend between 1950 and 1998 (a 48-year stretch), which led Karl to say that global warming never slowed.

Other scientists were not fully convinced. For a truly apples-to-apples comparison, the past 15 years should be compared with other 15-year stretches, said Peter Stott, head of the climate monitoring and attribution team at the U.K. Met Office.

For instance, the globe warmed more slowly in the past 15 years than between 1983 and 1998 (the previous 15-year stretch), even with NOAA’s new data corrections, Stott said.

“The slowdown hasn’t gone away,” he said in an email. “While the Earth continues to accumulate energy as a result of increasing man-made greenhouse gas emissions … global temperatures have not increased smoothly.”

The disagreements arise because assigning trends — including the trend of a “hiatus” — to global warming depends on the time frame of reference.

“Trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends,” the IPCC stated in 2013, even as it discussed the pause.
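
The point about short records can be illustrated with synthetic data (an assumed steady warming of 0.15 C per decade plus random year-to-year noise, not real observations): the fitted 15-year trend differs noticeably depending on which start year is chosen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic anomalies: steady warming of 0.15 C/decade plus interannual noise (illustrative only).
years = np.arange(1970, 2015)
anomaly = 0.015 * (years - 1970) + rng.normal(0.0, 0.1, size=years.size)

def trend_per_decade(start, length=15):
    """Least-squares trend in C/decade over a window of `length` years beginning at `start`."""
    mask = (years >= start) & (years < start + length)
    return 10 * np.polyfit(years[mask], anomaly[mask], 1)[0]

for start in (1983, 1998):
    print(f"{start}-{start + 14}: {trend_per_decade(start):+.2f} C/decade")
```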

Robert Kaufmann, an environment professor at Boston University who was unaffiliated with the study, called trends a “red herring.”

A trend implies that the planet will warm, decade after decade, at a steady clip. There is no reason why that should be the case, Kaufmann said. Many factors — human emissions of warming and cooling gases, natural variability, and external factors such as the sun — feed into Earth’s climate. The relative contributions of each factor can vary by year, decade, century or on even larger time scales.

“There is no scientific basis to assume that the climate is going to warm at the same rate year after year, decade after decade,” he said.

Copying the language of skeptics

Trends are a powerful weapon in the hands of climate deniers. As early as 2006, deniers used the slowdown of warming from 1998 onward to say that global warming had stopped or paused.

The idea of a “pause” seeped into academia, launching dozens of studies into what might have caused it. But there was a subtle difference between scientists’ understanding of the pause and that of the skeptics; scientists never believed that warming had stopped, only that it had slowed compared with the rapidly warming ’90s. They wanted to know why.

Over the years, scientists have unraveled the contributions of volcanoes to global cooling, the increased uptake of heat by the Pacific Ocean, the cooling role of La Niñas and other drivers of natural variability. Their understanding of our planet’s climate evolved rapidly.

As scientists wrote up their findings, they unwittingly adopted the skeptics’ language of the “pause,” said Stephan Lewandowsky, a psychologist at the University of Bristol who was unaffiliated with the NOAA study. That was problematic.

“That’s sort of a subtle semantic thing, but it is really important because it suggests that these [scientists] bought into the existence of the hiatus,” he said.

Then, in 2013, the IPCC wrote about the pause. The German government complained that the term implies that warming had stopped, which is inaccurate. The objection was ignored.

NOAA’s strong refutation of the hiatus is particularly weighty because it comes from a government lab, and the work was headed by Karl, a pioneer of temperature reanalysis studies.

NOAA will be using the data corrections to assess global temperatures from July onward, Karl said. NASA is discussing internally whether to apply the fixes suggested in the study, according to Schmidt of NASA.

The study was greeted by Democrats in Congress as proof that climate change is real. Sen. Barbara Boxer (D-Calif.), ranking member of the Environment and Public Works Committee, used it as an opportunity to chide her opponents.

“Climate change deniers in Congress need to stop ignoring the fact that the planet may be warming at an even faster rate than previously observed, and we must take action now to reduce dangerous carbon pollution,” she said in a statement.

Experiment Provides Further Evidence That Reality Doesn’t Exist Until We Measure It (IFLScience)

June 2, 2015 | by Stephen Luntz

photo credit: Pieter Kuiper via Wikimedia Commons. A comparison of double slit interference patterns with different widths. Similar patterns produced by atoms have confirmed the dominant model of quantum mechanics 

Physicists have succeeded in confirming one of the theoretical aspects of quantum physics: Subatomic objects switch between particle and wave states when observed, while remaining in a dual state beforehand.

In the macroscopic world, we are used to waves being waves and solid objects being particle-like. However, quantum theory holds that for the very small this distinction breaks down. Light can behave either as a wave, or as a particle. The same goes for objects with mass like electrons.

This raises the question of what determines when a photon or electron will behave like a wave or a particle. How, anthropomorphizing madly, do these things “decide” which they will be at a particular time?

The dominant model of quantum mechanics holds that it is when a measurement is taken that the “decision” takes place. Erwin Schrödinger came up with his famous thought experiment using a cat to ridicule this idea. Physicists think that quantum behavior breaks down on a large scale, so Schrödinger’s cat would not really be both alive and dead—however, in the world of the very small, strange theories like this seem to be the only way to explain what we see.

In 1978, John Wheeler proposed a series of thought experiments to make sense of what happens when a photon has to either behave in a wave-like or particle-like manner. At the time, it was considered doubtful that these could ever be implemented in practice, but in 2007 such an experiment was achieved.

Now, Dr. Andrew Truscott of the Australian National University has reported the same thing in Nature Physics, but this time using a helium atom, rather than a photon.

“A photon is in a sense quite simple,” Truscott told IFLScience. “An atom has significant mass and couples to magnetic and electric fields, so it is much more in tune with its environment. It is more of a classical particle in a sense, so this was a test of whether a more classical particle would behave in the same way.”

Truscott’s experiment involved creating a Bose-Einstein condensate of around a hundred helium atoms. He conducted the experiment first with this condensate, but says the possibility that the atoms were influencing each other made it important to repeat it after ejecting all but one. The remaining atom was passed through a “grate” made from two laser beams, which can scatter an atom much as a solid grating scatters light. Such grates have been shown to make atoms either pass through one arm, like a particle, or through both, like a wave.

A random number generator was then used to determine whether a second grating would appear further along the atom’s path. Crucially, the number was only generated after the atom had passed the first grate.

The second grating, when applied, caused an interference pattern in the measurement of the atom further along the path. Without the second grating, the atom had no such pattern.
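
The logic of the delayed choice is easier to see in a stripped-down model. The sketch below treats the two “gratings” as ideal 50/50 beamsplitters acting on a two-path quantum state (a textbook idealization, not Truscott’s actual apparatus), and shows that the detection probabilities depend on the relative phase between the paths only when the second grating is applied.

```python
# Minimal two-path model: the first "grating" acts as a 50/50 beamsplitter,
# a relative phase phi accumulates between the two paths, and a second
# beamsplitter may or may not recombine them before detection.
import numpy as np

BS = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # ideal 50/50 beamsplitter

def detection_probabilities(phi, recombine):
    state = BS @ np.array([1.0, 0.0])               # atom enters one input port
    state = np.diag([1, np.exp(1j * phi)]) @ state  # phase difference between paths
    if recombine:                                   # second grating applied
        state = BS @ state
    return np.abs(state) ** 2                       # Born rule

for phi in np.linspace(0, np.pi, 5):
    with_g = detection_probabilities(phi, recombine=True)   # interference fringes
    without = detection_probabilities(phi, recombine=False)  # flat 50/50 outcome
    print(f"phi={phi:4.2f}  with grating: {with_g.round(2)}  without: {without.round(2)}")
```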

An optical version of Wheeler’s delayed choice experiment (left) and an atomic version as used by Truscott (right). Credit: Manning et al.

Truscott says that there are two possible explanations for the behavior observed. Either, as most physicists think, the atom decided whether it was a wave or a particle when measured, or “a future event (the method of detection) causes the photon to decide its past.”

In the bizarre world of quantum mechanics, events rippling back in time may not seem that much stranger than things like “spooky action at a distance” or even something being a wave and a particle at the same time. However, Truscott said, “this experiment can’t prove that that is the wrong interpretation, but it seems wrong, and given what we know from elsewhere, it is much more likely that only when we measure the atoms do their observable properties come into reality.”

Ethnography: A Scientist Discovers the Value of the Social Sciences (The Scholarly Kitchen)

 

Picture from an early ethnographic study

I have always liked to think of myself as a good listener. Whether you are in therapy (or should be), conversing with colleagues, working with customers, embarking on strategic planning, or collaborating on a task, a dose of emotional intelligence – that is, embracing patience and the willingness to listen — is essential.

At the American Mathematical Society, we recently embarked on an ambitious strategic planning effort across the organization. On the publishing side we have a number of electronic products, pushing us to consider how we position these products for the next generation of mathematicians. We quickly realized that it is easy to be complacent. In our case we have a rich history online, and yet – have we really moved with the times? Does a young mathematician need our products?

We came to a sobering and rather exciting realization: In fact, we do not have a clear idea how mathematicians use online resources to do their research, teaching, hiring, and job hunting. We of course have opinions, but these are not informed by anything other than anecdotal evidence from conversations here and there.

To gain a sense of how mathematicians are using online resources, we embarked on an effort to gather more systematic intelligence, embracing a qualitative approach to the research – ethnography. The concept of ethnographic qualitative research was a new one to me – and it felt right. I quickly felt like a graduate student in ethnography, reading the literature and thinking through with colleagues how we might apply qualitative research methods to understanding mathematicians’ behavior. It is worth taking a look at two excellent books: Just Enough Research by Erika Hall, and Practical Ethnography: A Guide to Doing Ethnography in the Private Sector by Sam Ladner.

What do we mean by ethnographic research? In essence we are talking about a rich, multi-factorial descriptive approach. While quantitative research uses pre-existing categories in its analysis, qualitative research is open to new ways of categorizing data – in this case, mathematicians’ behavior in using information. The idea is that one observes the subject (“key informant” in technical jargon) in their natural habitat. Imagine you are David Attenborough, exploring an “absolutely marvelous” new species – the mathematician – as they operate in the field. The concept is really quite simple. You just want to understand what your key informants are doing, and preferably why they are doing it. You have to do this in a setting that allows them to behave naturally – which really requires an interview with one person, not a group (because group members may influence each other’s actions).

Perhaps the hardest part is the interview itself. If you are anything like me, you will go charging in saying something along the lines of “look at these great things we are doing. What do you think? Great right?” Well, of course this is plain wrong. While you have a goal going in, perhaps to see how an individual is behaving with respect to a specific product, your questions need to be agnostic in flavor. The idea is to have the key informant do what they normally do, not just say what they think they do – the two things may be quite different. The questions need to be carefully crafted so as not to lead, but to enable gentle probing and discussion as the interview progresses. It is a good idea to record the interview – both in audio form, and ideally with screen capture technology such as Camtasia. When I was involved with this I went out and bought a good, but inexpensive audio recorder.

We decided that rather than approach mathematicians directly, we should work with the library at an academic institution. Libraries are our customers, and at many institutions ethnography is becoming part of the service academic libraries provide to their stakeholders. We began with Debra Kolah, head of the user experience office at Rice University’s Fondren Library in Texas, who also happens to be the physics, math and statistics librarian there. She has become an expert in the ethnographic study of academic user experience, with multiple projects underway at Rice involving a range of stakeholders, all aimed at fostering the library’s activity in the academic community it serves. She is a picture of enthusiasm when it comes to serving her community and gaining insights into the cultural patterns of academic user behavior. Debra was our key to understanding how important it is to work with the library to reach the mathematical community at an institution. The relationship is trusted and symbiotic. This triangle of an institution’s library, its academics, and an outside entity such as a society or publisher may represent the future of the library.

So the interviews are done – then what? Analysis. You have to try to make sense of all of this material you’ve gathered. First, transcribing audio interviews is no easy task. You have a range of voices and much technical jargon. The best bet is to get one of the many services out there to take the files and do a first-pass transcription. They will get most of it right. Perhaps they will write “archive” instead of “arXiv”, but that can be dealt with later. Once you have all this interview text, you need to group it into meaningful categories – what’s called “coding”. The idea is that you try to look at the material with a fresh, unbiased eye, to see what themes emerge from the data. Once these themes are coded, you can then start to think about patterns in the data. Interestingly, qualitative researchers have developed a host of software programs to aid the researcher in doing this. We settled on a relatively simple, web-based solution – Dedoose.
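
As a rough illustration of what that coding step produces, here is a minimal Python sketch using hypothetical codes and interview IDs (a plain-script stand-in for a tool like Dedoose): it simply tallies how often each theme appears and in how many interviews.

```python
# Hypothetical coded excerpts: (interview_id, theme) pairs a researcher might
# assign while reading transcripts. The codes below are invented examples.
from collections import Counter, defaultdict

coded_excerpts = [
    ("interview_01", "uses arXiv daily"),
    ("interview_01", "prefers print for deep reading"),
    ("interview_02", "uses arXiv daily"),
    ("interview_03", "relies on colleague recommendations"),
    ("interview_03", "uses arXiv daily"),
]

theme_counts = Counter(theme for _, theme in coded_excerpts)
interviews_per_theme = defaultdict(set)
for interview, theme in coded_excerpts:
    interviews_per_theme[theme].add(interview)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count} excerpts across {len(interviews_per_theme[theme])} interviews")
```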

With some 62 interviews under our belt, we are beginning to see patterns emerge in the ways that mathematicians behave online. I am not going to reveal our preliminary findings here – I must save that up for when the full results are in – but I am confident that the results will show a number of consistent threads that will help us think through how to better serve our community.

In summary, this experience has been a fascinating one – a new world for me. I was trained as a scientist, and as a scientist I have ideas about what the scientific method is and what counts as evidence. I now understand the value of the qualitative approach – hard for a scientist to say. Qualitative research opens a window onto descriptive data and analysis. As our markets change, understanding who constitutes our market and how users behave is more important than ever.

Carry on listening!

The surprising links between faith and evolution and climate denial — charted (The Washington Post)

 May 20, 2015

For a long time, we’ve been having a pretty confused discussion about the relationship between religious beliefs and the rejection of science — and especially its two most prominent U.S. incarnations, evolution denial and climate change denial.

At one extreme is the position that science denial is somehow deeply or fundamentally religion’s fault. But this neglects the wide diversity of views about science across faiths and denominations — and even across individuals of the same faith or denomination — not all of which are anti-climate science, or anti-evolution.

At the other extreme, meanwhile, is the view that religion has no conflict with science at all. But that can’t be right either: Though the conflict between the two may not be fundamental or necessary in all cases, it is pretty clear that the main motive for evolution denial is, indeed, a perceived conflict with faith (not to mention various aspects of human cognition that just make accepting evolution very hard for many people).

The main driver of climate science rejection, however, appears to be a free market ideology — which is tough to characterize as religious in nature. Nonetheless, it has often been observed (including by me) that evolution denial and climate science rejection often seem to overlap, at least to an extent.

[Pope Francis has given the climate movement just what it needed: faith]

And there does seem to be at least some tie between faith and climate science doubt. Research by Yale’s Dan Kahan, for instance, found a modest correlation between religiosity and less worry about climate change. Meanwhile, a 2013 study in Political Science Quarterly found that “believers in Christian end-times theology are less likely to support policies designed to curb global warming than are other Americans.”

So how do we make sense of this complex brew?

Josh Rosenau, an evolutionary biologist who works for the National Center for Science Education — which champions both evolutionary science and climate science teaching in schools — has just created a chart that, no matter what you think of the relationship between science and religion, will give you plenty to talk about.

Crunching data from the 2007 incarnation of a massive Pew survey of American religious beliefs, Rosenau plotted different U.S. faiths and denominations based on their members’ views about both the reality of specifically human evolution, and also how much they favor “stricter environmental laws and regulations.” And this was the result:

As Rosenau notes, in the figure above, “The circle sizes are scaled so that their areas are in proportion to the relative population sizes in Pew’s massive sample (nearly 36,000 people!).” And as you can see, while at the top right atheists, agnostics, Buddhists, non-Orthodox Jews and others strongly accept evolution and environmental rules, at the bottom left Southern Baptists, Pentecostals and other more conservative-leaning faiths are just as skeptical of both.
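
For readers who want to build a similar figure from their own data, here is a minimal matplotlib sketch. The group names and values are invented for illustration, not Rosenau’s Pew-derived numbers; the key detail is scaling each circle so that its area tracks the group’s share of the sample.

```python
# Illustrative bubble chart in the style Rosenau describes: x = acceptance of
# human evolution, y = support for stricter environmental rules, bubble area
# proportional to each group's share of the sample. All values are invented.
import matplotlib.pyplot as plt

groups = ["Group A", "Group B", "Group C", "Group D"]
evolution = [0.8, 0.4, -0.3, -0.6]     # net acceptance, arbitrary scale
environment = [0.6, 0.1, -0.2, -0.5]   # net support for stricter rules
sample_share = [0.24, 0.40, 0.10, 0.26]

# Scale so that bubble *area* (matplotlib's 's' argument, in points^2) tracks share.
sizes = [3000 * s for s in sample_share]

fig, ax = plt.subplots()
ax.scatter(evolution, environment, s=sizes, alpha=0.5)
for x, y, label in zip(evolution, environment, groups):
    ax.annotate(label, (x, y), ha="center")
ax.axhline(0, linewidth=0.5)
ax.axvline(0, linewidth=0.5)
ax.set_xlabel("Acceptance of human evolution")
ax.set_ylabel("Support for stricter environmental rules")
plt.show()
```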

Obviously, it is important to emphasize that a given individual, of any faith, could be anywhere on the chart above — it’s just that this is where the denominations as a whole seemed to fall out, based on Rosenau’s analysis (which itself mirrors prior analyses of the political alignments of U.S. faiths and denominations by political scientist and Religion News Service blogger Tobin Grant).

Reached by phone Tuesday, Rosenau (whom I’ve known for a long time from the community of bloggers about science and the environment) seemed to be still trying to fully understand the implications of the figure he’d created. “People seemed to like it,” he said. “I think some people are finding hope in it” — hope, specifically, that there is a way out of seemingly unending science versus religion spats.

Here are some of Rosenau’s other conclusions from the exercise, from his blog post introducing the chart:

First, look at all those groups whose members support evolution. There are way more of them than there are of the creationist groups, and those circles are bigger. We need to get more of the pro-evolution religious out of the closet.

Second, look at all those religious groups whose members support climate change action. Catholics fall a bit below the zero line on average, but I have to suspect that the forthcoming papal encyclical on the environment will shake that up.

[Our new pro-science pontiff: Pope Francis on climate change, evolution, and the Big Bang]

Rosenau also remarks on the striking fact that for the large bulk of religions and religious denominations, as support for evolution increases, so does support for tougher environmental rules (and vice versa). The two appear to be closely related.

So what can that mean?

Rosenau told me he was still trying to work that out — still playing with the data and new analyses to try to understand it.

One possible way of interpreting the figure is that as with political parties themselves, people at least partially self-sort into faiths or denominations that seem more consonant with their own worldviews. And thus, a cluster of issue stances may travel alongside these choices of affiliation. “People are choosing what religion they want to associate with,” suggested Rosenau. “If people feel alienated from a church, they’re switching.”

There may also be a substantive point here that links together the ideas. A view of the world that thinks of human beings as having evolved, as being part of the natural world and having emerged through the same process as other organisms, may also be related to a manner of thinking that puts great overall emphasis on the value of nature and one’s connectedness with it.

In any case, while the pattern above may require more analysis, one clear punchline of the figure is that it really doesn’t make sense to say that religion is at war with science. You can say that for some people, religion is clearly linked to less science acceptance — especially on evolution. But for others, clearly, religion presents no hurdle at all.

I would also agree that these data reinforce the idea that the pope’s coming encyclical on the environment could really shake matters up. Catholics are the biggest bubble in the chart above, and they’re right in the middle of the pack on the environment.

The pope, incidentally, also appears to accept evolution.

Hawaiian telescope fight prompts new rules for Mauna Kea (Nature)

Thirty Meter Telescope can proceed, but one-quarter of existing telescopes on mountain must be removed in the next decade.

Alexandra Witze

27 May 2015

Hawaii Governor David Ige says the Thirty Meter Telescope project can move forward.

The controversial Thirty Meter Telescope (TMT) should be built atop the sacred Hawaiian mountain of Mauna Kea as planned — but one-quarter of the 13 telescopes already there need to be taken down by the time the TMT starts operating in the mid-2020s, Hawaii’s governor David Ige said on 26 May.

Ige’s long-awaited statement aims to break the impasse between the TMT project, which halted construction in early April after protests broke out, and Native Hawaiians, who see the telescope — bigger than any on Mauna Kea so far — as the latest violation of an important cultural site.

The governor laid out sweeping changes to how Mauna Kea will be managed in the future. “We have in many ways failed the mountain,” he said. “We have not done right by a very special place.”

The shift could significantly affect astronomers who use the world-class facilities atop Mauna Kea, which include the twin 10-metre Keck telescopes as well as the 8-metre-class Gemini North and Subaru telescopes. The first astronomical observatories were built on Mauna Kea starting in the 1960s.

Perhaps most significantly, “the university must decommission as many telescopes as possible, with one to begin this year and at least 25% of all telescopes gone by the time the TMT is ready for operation,” Ige said. The first to go will be the Caltech Submillimeter Observatory, whose closure was announced in 2009; it will start to be dismantled later this year.

But none of the other 12 telescopes had immediate plans to shutter. The submillimetre-wavelength James Clerk Maxwell Telescope is just beginning a new life under the operation of the East Asian Observatory. The 3.8-metre United Kingdom Infrared Telescope was similarly transferred from the UK’s Science and Technology Facilities Council to the University of Hawaii at Manoa last year.

“This is all new to us,” says Peter Michaud, a spokesman for the Gemini Observatory based in Hilo, Hawaii. “Until we learn more about it, we’re not really able to say much of anything.”

A 2010 plan commissioned by the university lays out a framework for how various observatories could be taken down. The governor’s announcement is likely to accelerate those scenarios, says Günter Hasinger, director of the University of Hawaii’s Institute for Astronomy in Manoa. “In principle this is nothing new,” he says. “We have always made the point that the space on top of the mountain should only be populated by the best telescopes.”

A changing landscape

Ige’s changes all push toward reducing impact on the mountain’s 4,200-metre summit. The University of Hawaii leases more than 45 square kilometres as a science reserve. The current lease is good until the end of 2033, but Ige said that when that is up the university must return more than 40 square kilometres — all the land not needed for astronomy — to the state’s Department of Land and Natural Resources. The university must also agree that the TMT location, which is a few hundred metres below the actual summit, is the last area on the mountain where any telescopes will ever be built.

An artist’s conception of the Thirty Meter Telescope on Mauna Kea, with existing telescopes in the background.

Visitors to the mountain top will be limited, and be required to receive cultural training. A new cultural council will be created to provide input to the Office of Mauna Kea Management.

“It’s up to different organizations to decide their next step,” said Ige. “I intend to fully protect the right of TMT to proceed to construction, and respect and protect the right of protestors to peacefully protest.”

“We will work with the framework he has put forth,” said Henry Yang, chair of the TMT International Observatory board, in a statement. “We know we have a lot of work ahead of us. We appreciate that there are still people who are opposed to the project, and we will continue to respectfully listen and work with them to seek solutions.”

Ige said his office would work with the university to develop a timeline for the various actions. “To my point of view this is a very important step forward, and will hopefully solve the Gordian knot that we are in,” says Hasinger.

TMT construction ignited a firestorm of protest among Native Hawaiians, and also among many astronomers who pushed to redress what they see as decades of scientists essentially colonizing a sacred space.

The $1.5-billion TMT project chose Mauna Kea over a mountain top in Chile, and had gone through a seven-year permissions process. Partners include the University of California, the California Institute of Technology, and the governments of China, Japan, India and Canada. Legal challenges are still wending their way through Hawaiian courts.

Two competing telescopes are both under construction in Chile.

Nature, doi:10.1038/nature.2015.17639

ABC’s Reunião Magna stresses that the pleasure of doing science for science’s sake stands above any prize (Jornal da Ciência)

Tuesday, 12 May 2015

    12.05 - Suzana

    The event attracted a contingent of laypeople interested in seeing evidence that scientific development can be a solution to Brazil’s socioeconomic problems

    For participants and organizers of the 2015 Reunião Magna of the Academia Brasileira de Ciências (ABC), held from 4 to 6 May in Rio de Janeiro, it is certainly hard to single out the moments that most captured the audience’s attention. The excellence of the session themes and the high calibre of the speakers attracted a varied audience, ranging from young talents of Brazilian science to renowned members of the Academy who have worked for years, inside and outside laboratories, to give Brazil greater prominence in international scientific production.

    Under the theme “The Value of Science”, in the sense given to it by Poincaré – the French mathematician, physicist and philosopher who brought a new approach to science at the turn of the 19th and 20th centuries – the 2015 Reunião Magna encouraged discussion of the intrinsic value of scientific activity (science for science’s sake) while underscoring the importance of science for Brazil’s socioeconomic development. One common denominator ran through the lectures: the boldness of young scientists must be encouraged so that research turns into innovation. Another point shared by the scientists’ presentations was the importance of teamwork and of valuing every collaborator in a research project.

    Winner of the Nobel Prize in Chemistry and the first Israeli woman to receive the award, the scientist Ada Yonath showed that good humour is a trait of researchers, whose eyes light up when their research proves its point. After her lecture on the final day of the Reunião Magna, Ada visited the Instituto de Ciências Biomédicas at UFRJ. Welcomed by the institute’s deputy director, Prof. José Garcia Abreu, and by the coordinator of UFRJ’s Graduate Programme in Morphological Sciences, Prof. Flávia Alcantara Gomes, an Affiliate Member of the ABC, Ada responded to graduate students’ questions about what winning the Nobel Prize meant by saying that “winning the prize was good… but understanding the structure of ribosomes was what really gave me the greatest satisfaction”. She made it clear that the pleasure of scientific discovery should stand above the pleasure of recognition.

    The pleasure of completing a piece of research and seeing years of work turn into a greater good for society was also the message of the French biologist Jules Hoffmann, winner of the 2011 Nobel Prize in Physiology or Medicine for work carried out with Bruce Beutler that uncovered the activation of innate immunity. He held the full attention of the audience on the second day of the 2015 Reunião Magna. Currently heading the research area and a member of the Board of Directors of France’s Centre National de la Recherche Scientifique (CNRS), he praised the contribution of all the other members of his team, who, he said, were fundamental to the result.

    Answers to our restlessness

    At the close of the event, the coordinator of the 2015 Reunião Magna, Professor Vivaldo Moura Neto, told Jornal da Ciência that the theme was chosen to highlight the human pleasure of doing science, of seeking answers to humankind’s restlessness before nature. According to him, science must be valued for the implications it can have for technological development and innovation, and thus for its contribution to the country’s development.

    “Today, as yesterday and still tomorrow, we will need to remain attentive to this. Government leaders will have to recognize the contribution of Brazilian science to development – a mature, productive science, rich in possibilities for serving national development. We repeated this over the three days, and we demonstrated this truth with ongoing projects and tested results,” he said. “Look at examples of what is being done at CENPES, at COPPE, in the institutes of UFRJ’s Health Sciences Centre, at USP, at the University of Campinas, in Biochemistry at UFRGS, at the Brain Institute of PUC in Rio Grande do Sul, or the work of the virologists in Pará, among many others. In fact, the meeting showed how ready we are to deliver, from fundamental science, the products it knows how to generate. The engineers were emphatic in their examples. The myopia of so many who, from up on high, fail to see what is happening down here in terra brasilis is astonishing,” added the coordinator of the 2015 Reunião Magna.

    Attracting young people

    This year the Reunião Magna attracted, besides experienced scientists and young research talents, a sizeable group of “laypeople”, according to Professor Moura Neto. “Over the three days of the meeting there were accounts of the experiences of the most seasoned researchers, but we all know that a researcher, a scientist, is not made overnight. Young people must be attracted, enthused and mentored. Yet the audience was not made up only of young people from universities or of the most experienced scientists; there was a contingent of laypeople who certainly found in the Reunião Magna a source of knowledge and a hope that the country can be improved. It would be interesting, naturally, if government leaders saw this too,” Moura Neto added.

    On the participation of the foreign guests and their perception of how scientific production is developing in Brazil, the coordinator of the 2015 Reunião Magna said that they already collaborate with Brazilian teams. “If they maintain these collaborations, it is because they know the excellent quality of what we do here. They said so themselves – for example the French mathematician Étienne Ghys, who, incidentally, did part of his training in Rio de Janeiro, at IMPA,” he commented.

    According to Moura Neto, the Nobel laureate Jules Hoffmann praised, in his corridor conversations at the ABC during the event, the satisfaction he takes in his collaborations with teams in São Paulo. The coordinator of the Reunião Magna also noted that the scientist Ada Yonath, who, he said, delighted everyone with a spectacular lecture, expressed interest in collaborating with Brazilian groups. “If they want these collaborations, it is because they see us as equals,” he concluded.

    Suzana Liskauskas / Jornal da Ciência

    Experts Warn of “Cataclysmic” Changes as Planetary Temperatures Rise (Truthout)

    Monday, 27 April 2015 00:00 By Dahr Jamail, Truthout | Report 

    Two unprecedentedly high temperatures were recorded in Antarctica, providing an ominous sign of accelerating climate change as one of the readings came in at just more than 63 degrees Fahrenheit. (Photo: Iceberg via Shutterstock)

    Climate Disruption Dispatches: This month’s anthropogenic climate disruption (ACD) dispatch begins with the fact that recently released National Oceanic and Atmospheric Administration data show that this March was, by far, the hottest planetary March ever recorded, and the hottest January to March period on record as well.

    We are watching unprecedented melting of glaciers across the planet, increasingly high temperature records and epic-level droughts that are now becoming the new normal: Planetary distress signals are increasing in volume.

    One of these took place recently in Antarctica, of all places, where two unprecedentedly high temperatures were recorded, providing an ominous sign of accelerating ACD as one of the readings came in at just over 63 degrees Fahrenheit.

    A fascinating recent report shows that approximately 12 million people living in coastal areas will be displaced during the next 85 years, with areas along the Eastern Seaboard of the United States seeing some of the most dramatic impacts.

    In the US, another report shows that the Navajo Nation is literally dying of thirst, with one of the nation’s leaders flatly sounding the alarm by stating, “We’re going to be out of water.”

    A study just published in Geophysical Research Letters bolsters the case that a period of much faster ACD is imminent, if it hasn’t already begun.

    On that note, leading climate researchers recently said there is a possibility that the world will see a 6-degree Celsius temperature increase by 2100, which would lead to “cataclysmic changes” and “unimaginable consequences for human civilization.”

    With these developments in mind, let us take a look at recent developments across the planet since the last dispatch.

    Earth

    Signs of ACD’s impact across this sector of the planet are once again plentiful, and the fact that the Amazon is suffering is always a very loud alarm buzzer, given that every year the world’s largest rainforest cycles through 18 billion tons of carbon when its 6 million square kilometers of trees breathe in carbon dioxide and then release it back into the atmosphere when they die. This is twice the amount of carbon that fossil fuel burning emits in an entire year. A recent report shows that while the Amazon is continuing to absorb more carbon than it is releasing, a tipping point is coming, and likely soon, as deforestation, drought and fires there continue to remove precious trees at a frightful rate. With 1.5 acres of rainforest lost every single second, somewhere around the world, the situation in the Amazon does not bode well for our future.

    In the United States, in Harvard Forest, located 70 miles west of the university’s campus in Cambridge, Massachusetts, hemlock trees are dying at an alarming rate. Harvard Forest is a case study, as it is part of a network of 60 forests around the world called the Center for Tropical Forest Science-Forest Global Earth Observatories, where they are being studied for their response to ACD and other anthropogenic issues. Kristina Anderson-Teixeira, an ecologist with the network, said its forests are “being impacted by a number of different global change factors. We do expect more of this, be it pests or pathogens or droughts or heat waves or thawing permafrost.”

    Another report from April revealed that Russia has been losing an amount of forest the size of Switzerland (16,600 square miles of tree cover) every year, for three years running.

    Terrestrial animals continue to struggle to survive in many areas. It should come as no surprise that in the Arctic, a recent study shows that the theory that polar bears will be able to adapt to ice-free seas in the summer by eating on land has been debunked. Without ice in the summer, polar bears will starve and die off.

    Another study shows that ACD is threatening mountain goats, due to the warming that is occurring even at the higher elevations where the goats live, as the rate of warming there is two to three times faster than the rest of the planet. According to the study, due to the warming, the goats’ future is now uncertain.

    In California, sea lion strandings have already reached more than 2,250 for this year alone, which is a record. The worsening phenomenon is being blamed on warming seas that are disrupting the food supply of marine mammals.

    Across the United States, hunters are seeing their traditions being changed by ACD. “I could point you to a million different forums online where hunters are complaining about the season and how hunting is terrible,” said one hunter in a recent report. “At the end of the day, it’s changing weather patterns. Winters around here are not as cold as they used to be.”

    A March report from a researcher in Rhode Island showed that the growth and molting rates of juvenile lobsters are decreasing “significantly” due to oceans becoming increasingly acidic from ACD. This makes the animals more vulnerable to predation, thus leading to fewer adult lobsters and an overall rapidly declining population.

    Air

    There have been a few major developments recently in this sector of our analysis.

    Interestingly, some of the more commonly used anesthetics are apparently accumulating in the planet’s atmosphere, thus contributing to warming of the climate, according to a report in April. It is a small amount, mind you, but the volume is increasing.

    Bad news on the mitigation front comes in the form of a study that revealed that ongoing urban sprawl and auto exhaust are hampering cities’ best efforts toward lowering carbon dioxide emissions. If people continue to drive as much as they are, and development continues apace, the push to build more dense housing, better transit systems and more bike lanes in urban centers will be for naught.

    Speaking of lack of mitigation, the US Environmental Protection Agency recently announced that US greenhouse gas pollution increased 2 percent over the previous year in 2013.

    Drought-plagued California gets more bad news in this sector, as recently released data show that the state continues to have its warmest year ever recorded, with statewide temperatures coming in nearly 2 degrees Fahrenheit warmer than the previous record, which was set in 2014. The state is quite literally baking.

    Another study showed that the frozen soil (permafrost) of the planet’s northern polar regions, which holds billions of tons of organic carbon, is melting, and that the melting is being sped up by ACD, hence releasing even more carbon into our already carbon dioxide-supersaturated atmosphere.

    Lastly in this section, those who believe in technological fixes for our predicament received some bad news in April, which came in the form of a report that shows that any attempts to geoengineer the climate are likely to result in “different” climate disruption, rather than an elimination of the problem. The most popular proposed idea of solar radiation management that would utilize stratospheric sulfate aerosols to dim the sun has been proven to be, well, destructive. Using a variety of climate models, Ken Caldeira from the Carnegie Institution for Science in Stanford, California, has investigated the likely consequences of such geoengineering on agriculture across the globe.

    According to a report on the matter:

    His research showed that while dimming could rapidly decrease global temperatures, high carbon dioxide levels would be expected to persist, and it is the balance between temperature, carbon dioxide, and sunlight that affects plant growth and agriculture. Exploring the regional effects, he finds that a stratospherically dimmed world would show increased plant productivity in the tropics, but lessened plant growth across the northerly latitudes of America, Europe and Asia. It is easy to see how there might be geopolitical shifts associated with changes in regional food production across the globe. “It’s probably the poor tropics that stand to benefit and the rich north that stands to lose,” said Prof Caldeira.

    Hence, given that the results would be detrimental to the “rich north,” which by far and away has pumped more carbon dioxide into the atmosphere than the “poor tropics,” the results of geoengineering would indeed be karmic.

    Water

    In the United States, California’s epic drought continues to lead in the water sector of analysis.

    For the first time in California’s history, mandatory water use reductions have been imposed on residents after a winter of record-low snowfalls, and hence a record-low snowpack. “People should realize we are in a new era,” Gov. Jerry Brown said at a news conference there in April, standing on a patch of brown and green grass that would normally be thick with snow that time of year. “The idea of your nice little green lawn getting watered every day, those days are past.”

    Climate scientists also recently announced, disconcertingly, that California’s record-breaking drought is merely a preview of future ACD-generated megadroughts.

    Shortly after Brown announced the mandatory water restrictions for his state, another study was released showing that California will also be facing more extreme heat waves, along with rising seas, caused by increasingly intense impacts from ACD. According to the study, the average number of days with temperatures reaching 95 degrees will double or even triple by the end of this century. Simultaneously, at least $19 billion worth of coastal property will literally disappear as sea levels continue to rise.

    Experts also announced in April that in “drought-era” California, “every day” should now be considered “fire season.” NASA Jet Propulsion Laboratory climatologist Bill Patzert said of California, “We are in an incendiary situation.”

    California’s state climatologist, Michael Anderson, issued a very stark warning in April when he said the state faces Dust Bowl-like conditions, as he compared the water crisis in California to the legendary US Dust Bowl. “You’re looking on numbers that are right on par with what was the Dust Bowl,” he said.

    As aforementioned, this year’s dry, warm winter has left the entire western United States snowpack at record-low levels. Given that this is a critical source of fresh surface water for the entire region, this will only exacerbate the already critical water shortages that are plaguing the region.

    One ramification of this can be seen in how the once-powerful Rio Grande has been reduced to a mere trickle hundreds of miles short of the Gulf of Mexico, the destination of its 1,900-mile journey, thanks to the increasing impacts of ACD. Farmers and residents who rely on it for water are in deep trouble.

    And it’s not just California and the US Southwest that are dealing with major water shortages. The Government Accountability Office recently released a report showing that 40 out of the 50 US states will face a water shortage within the next 10 years.

    Meanwhile, up in Alaska, the state’s iconic Iditarod sled dog race saw mushers’ dogs dragging their sleds across large swaths of mud, spanning more than 100 miles in some areas, because warmer temperatures melted the snow and ice that used to cover the course. “I love the challenge, being able to overcome anything on the trail,” said four-time winner Martin Buser of the new conditions. “But if this is a new normal, I’m not sure I can sustain it.”

    In this writer’s backyard, glaciers are melting away at dramatic rates in Olympic National Park. Pictures tell the story, which was also addressed in detail recently at a talk given at the park by University of Washington research professor Michelle Koutnik, who was part of a team monitoring the park’s Blue Glacier. By way of example, an entire section of the lower Blue Glacier that existed in 1989 was completely gone by 2008, and melt rates are increasing. A sobering “before and after” look at the photographic evidence should not be missed.

    A recent study gave another grim report on glaciers, this one focusing on Canada where glaciers in British Columbia and Alberta are projected to shrink by at least 70 percent by the end of this century, and of course ACD was noted as the main driving force behind the change. “Most of that is going to go,” one of the researchers said of Canada’s glaciers. “And most seems to be on its way out.”

    A study recently published in the Proceedings of the National Academy of Sciences has found that as the Arctic Ocean warms and loses its sea ice cover, phytoplankton populations will explode. This creates another positive feedback loop for ACD, as it further amplifies warming in a region that is already heating up twice as fast as the rest of the globe.

    On the other end of the water spectrum, rising seas continue to afflict Venice, where the city is seeing dramatic changes. According to a recent report: “In the 1920s, there were about 400 incidents of acqua alta, or high water, when the right mix of tides and winds drives the liquid streets up into homes and shops in the lower parts of the city. By the 1990s, there were 2,400 incidents – and new records are set every year.”

    Fire

    An April report shows that ACD is predicted to bring more fires and less snow to the iconic Yellowstone National Park. These changes will likely fuel catastrophic wildfires, cause declines in mountain snows and threaten the survival of animals and plants, according to the scientists who authored the report. It shows that expected warming over the US West over the next three decades will transform the land in and around Yellowstone from a wetter, mostly forested Rocky Mountain ecosystem into a more open landscape, more akin to the arid US Southwest.

    “Ecological Implications of Climate Change on the Greater Yellowstone Ecosystem,” compiled by more than 20 university and government scientists, said that such dry conditions in that area have not been seen for the last 10,000 years, and extremely destructive wildfires like the one in 1988 that burned thousands of acres of the park are going to become more common, while years without major fires will become rare.

    Denial and Reality

    The climate disruption deniers have been barking loudly over the last month, which should be expected as irrefutable evidence of ACD continues in an avalanche.

    Following Florida’s lead, Wisconsin officially became the next state to censor its employees’ work regarding climate disruption. Wisconsin has banned its employees from working on ACD, after Florida banned the use of the terms “climate change” and “global warming.”

    Perhaps this is what played a role in inspiring acclaimed astrophysicist Neil deGrasse Tyson to proclaim that politicians denying science is “the beginning of the end of an informed democracy.”

    Facing a loss of high-profile corporate sponsors, the American Legislative Exchange Council (ALEC), now tired of being accused of ACD denial, has threatened action against activist groups that accuse it of denying ACD. This “action” could come in the form of lawsuits.

    The Yale Project on Climate Change Communication released very interesting county-by-county maps of the United States, which show the various levels of ACD denial across the country and are worth examining.

    Not to be outdone by fellow Republican ACD-denying presidential candidates, Marco Rubio voluntarily donned the dunce cap by stating that scientists have not determined what percentage of ACD is due to human activities compared to natural climate variability, and added brilliantly, “climate is always changing.”

    This year has seen us cross yet another milestone in the Arctic – this one being that sea ice covering the top of the world reached the lowest maximum extent yet observed during the winter. This means, ominously, that in just the last four years Arctic sea ice has seen a new low both for its seasonal winter peak (2015) and for its summer minimum (2012). While most sane people would see this as a gut-wrenching fact to have to process emotionally, Robert Molnar, the CEO of the Sailing the Arctic Race, is busily planning an “extreme yacht race” for the summer and fall of 2017 there. “The more ice that’s being melted, the more free water is there for us to be sailing,” he said.

    In stark contrast, US Secretary of State John Kerry is visiting the Arctic amid concerns over the melting ice, and some of the mainstream media, in this case The Washington Post, are running op-eds claiming that ACD deniers are actually now in retreat due to their own outlandish comments.

    In a historic move, even oil giant BP’s shareholders voted overwhelmingly to support a resolution that would force the company to disclose some of its ACD-related risks.

    Also on the reality front, recently released analysis shows that densely populated Asian islands and countries like Hong Kong, Japan, Taiwan and the Philippines are likely to face even more intense climatic events in the future.

    Another report, this one titled “An Era of Extreme Weather” by the Center for American Progress, shows that major weather events across the United States in 2014 cost an estimated $19 billion and caused at least 65 human fatalities. The report also shows that over the last four years, extreme weather events in the US caused 1,286 fatalities and $227 billion in economic losses spanning 44 states.

    US President Barack Obama formally submitted to the UN a commitment to reduce US greenhouse gas emissions by up to 28 percent below 2005 levels by 2025. Critics believe this is far too little, too late, but at least it is a move in the right direction.

    In an interesting twist of fate, while many Florida Republican lawmakers are busily denying ACD, other Florida Republicans are busy working to protect their state’s coastal areas from rising seas resulting from advancing ACD.

    Lastly in this month’s dispatch, a recently published study shows that acidic oceans helped fuel the largest mass extinction event in the history of the planet, which wiped out approximately 90 percent of all life on earth.

    The carbon released that was one of the primary drivers of that extinction event was found to have been released at a similar rate to modern emissions. Dr. Matthew Clarkson, one of the authors of the study, commented: “Scientists have long suspected that an ocean acidification event occurred during the greatest mass extinction of all time, but direct evidence has been lacking until now. This is a worrying finding, considering that we can already see an increase in ocean acidity today that is the result of human carbon emissions.”

    Copyright, Truthout. May not be reprinted without permission

    University offering free online course to demolish climate denial (The Guardian)

    The University of Queensland’s course examines the science of climate science denial

    David Attenborough signs his new book ‘Life in the Air’ at the Natural History Museum in London. Attenborough is among the big names interviewed in Denial101x. Photograph: Sarah Lee

    Starting 28 April, 2015, the University of Queensland is offering a free Massive Open Online Course (MOOC) aimed at “Making Sense of Climate Science Denial”.

     Denial101x summary.

    The course coordinator is John Cook, University of Queensland Global Change Institute climate communication fellow, and founder of the climate science myth-debunking website Skeptical Science. Cook’s research has primarily focused on the psychology of climate science denial. As he explains,

    97% of climate scientists agree that humans are causing global warming; however, less than half of Australians are aware of humanity’s role in climate change, while half of the US Senate has voted that humans aren’t causing global warming. This free course explains why there is such a huge gap between the scientific community and the public. Our course looks at what’s driving climate science denial and the most common myths about climate change. 

    The course includes climate science and myth debunking lectures by the international team of volunteer scientific contributors to Skeptical Science, including myself, and interviews with many of the world’s leading climate science and psychology experts. Making Sense of Climate Science Denial is a seven-week program featuring interviews with 75 scientific experts, including Sir David Attenborough, Katharine Hayhoe, Richard Alley, Michael Mann, and Naomi Oreskes.

    The course incorporates lessons in both climate science and psychology to explain the most common climate myths and to detail how to respond to them. Research has shown that myth debunking is most effective when people understand why the myth originated in the first place. For example, cherry picking (focusing on a small bit of convenient data and ignoring the rest) is one of the most common fallacies behind climate science myths.

    The lectures in the University of Queensland MOOC not only explain the science, but also the fallacies underpinning each myth. This is a unique and important feature of this course, because understanding their origins effectively inoculates people against myths.

    Thousands of students from more than 130 countries have already enrolled in Making Sense of Climate Science Denial. The goal is for the students to come out of the course with a stronger understanding of climate science, myth debunking, and the psychology of science denial that’s become so pervasive and dangerous in today’s world.

    Study reveals the power of energy released by the hands (RAC)

    Energy released by the hands can cure ailments, says USP study

    25/11/2011 – 08:58 . Gazeta de Ribeirão

    Missionary Marta Brisa applies the Johrei technique to Ana Paula Politi
    (Photo: Lucas Mamede/Gazeta de Ribeirão)

    A study recently carried out by USP (Universidade de São Paulo), together with Unifesp (Universidade Federal de São Paulo), claims to prove that the energy released by the hands has the power to cure any kind of malaise. The work was prompted by hands-on techniques already familiar in society, such as Johrei, used by the Igreja Messiânica do Brasil and similar to practices of religions such as Spiritism, which performs the so-called “passe”.

    The research grew out of work begun in 2000 as the master’s project of researcher Ricardo Monezi at USP’s Faculty of Medicine. He set out to investigate the possible effects of the practice of laying on of hands. “This interest came from personal experience: Reiki (the technique) had already helped me, as a teenager, to come out of a depressive crisis,” said Monezi, who is now a researcher at Unifesp.

    According to the scientist, his master’s research investigated the effects of laying on of hands in mice, in which it was possible to observe a notable gain in the potential of defence cells against tumour-forming cells. “Now, in my doctorate, which is being completed at Unifesp, we are studying not only the physiological effects but also the psychological ones,” he added.

    The study’s conclusion that the laying on of hands releases energy capable of producing well-being was reached even though current science still lacks a precise account of these effects. “Science calls these energies ‘subtle energies’, and also considers that the space in which they operate lies close to low-level electromagnetic frequencies,” he explained.

    The sensations produced by the practices Monezi analysed were a reduced perception of tension and stress and of symptoms related to anxiety and depression. “The interesting thing is that this type of laying on of hands offers a feeling of relaxation and fullness, as well as providing more energy and vitality.”

    In the master’s study, 60 rats were used, while the doctorate assessed 44 elderly people with complaints of stress.
    The doctoral work was completed in the first half of this year, but Unifesp is about to begin new investigations into the effects of Reiki and similar practices starting in April of next year.

    Preternatural machines (AEON)

    Robots came to Europe before the dawn of the mechanical age. To a medieval world, they were indistinguishable from magic

    E R Truitt is a medieval historian at Bryn Mawr College in Pennsylvania. Her book, Medieval Robots: Mechanism, Magic, Nature, and Art, is out in June.

    Edited by Ed Lake

    In 807 the Abbasid caliph in Baghdad, Harun al-Rashid, sent Charlemagne a gift the like of which had never been seen in the Christian empire: a brass water clock. It chimed the hours by dropping small metal balls into a bowl. Instead of a numbered dial, the clock displayed the time with 12 mechanical horsemen that popped out of small windows, rather like an Advent calendar. It was a thing of beauty and ingenuity, and the Frankish chronicler who recorded the gift marvelled at how it had been ‘wondrously wrought by mechanical art’. But given the earliness of the date, what’s not clear is quite what he might have meant by that.

    Certain technologies are so characteristic of their historical milieux that they serve as a kind of shorthand. The arresting title credit sequence to the TV series Game of Thrones (2011-) proclaims the show’s medieval setting with an assortment of clockpunk gears, waterwheels, winches and pulleys. In fact, despite the existence of working models such as Harun al-Rashid’s gift, it was another 500 years before similar contraptions started to emerge in Europe. That was at the turn of the 14th century, towards the end of the medieval period – the very era, in fact, whose political machinations reportedly inspired the plot of Game of Thrones.

    When mechanical clockwork finally took off, it spread fast. In the first decades of the 14th century, it became so ubiquitous that, in 1324, the treasurer of Lincoln Cathedral offered a substantial donation to build a new clock, to address the embarrassing problem that ‘the cathedral was destitute of what other cathedrals, churches, and convents almost everywhere in the world are generally known to possess’. It’s tempting, then, to see the rise of the mechanical clock as a kind of overnight success.

    But technological ages rarely have neat boundaries. Throughout the Latin Middle Ages we find references to many apparent anachronisms, many confounding examples of mechanical art. Musical fountains. Robotic servants. Mechanical beasts and artificial songbirds. Most were designed and built beyond the boundaries of Latin Christendom, in the cosmopolitan courts of Baghdad, Damascus, Constantinople and Karakorum. Such automata came to medieval Europe as gifts from foreign rulers, or were reported in texts by travellers to these faraway places.

    In the mid-10th century, for instance, the Italian diplomat Liudprand of Cremona described the ceremonial throne room in the Byzantine emperor’s palace in Constantinople. In a building adjacent to the Great Palace complex, Emperor Constantine VII received foreign guests while seated on a throne flanked by golden lions that ‘gave a dreadful roar with open mouth and quivering tongue’ and switched their tails back and forth. Next to the throne stood a life-sized golden tree, on whose branches perched dozens of gilt birds, each singing the song of its particular species. When Liudprand performed the customary prostration before the emperor, the throne rose up to the ceiling, potentate still perched on top. At length, the emperor returned to earth in a different robe, having effected a costume change during his journey into the rafters.

    The throne and its automata disappeared long ago, but Liudprand’s account echoes a description of the same marvel that appears in a Byzantine manual of courtly etiquette, written – by the Byzantine emperor himself, no less – at around the same time. The contrast between the two accounts is telling. The Byzantine one is preoccupied with how the special effects slotted into certain rigid courtly rituals. It was during the formal introduction of an ambassador, the manual explains, that ‘the lions begin to roar, and the birds on the throne and likewise those in the trees begin to sing harmoniously, and the animals on the throne stand upright on their bases’. A nice refinement of royal protocol. Liudprand, however, marvelled at the spectacle. He hazarded a guess that a machine similar to a winepress might account for the rising throne; as for the birds and lions, he admitted: ‘I could not imagine how it was done.’

    Other Latin Christians, confronted with similarly exotic wonders, were more forthcoming with theories. Engineers in the West might have lacked the knowledge to copy these complex machines or invent new ones, but thanks to gifts such as Harun al-Rashid’s clock and travel accounts such as Liudprand’s, different kinds of automata became known throughout the Christian world. In time, scholars and philosophers used their own scientific ideas to account for them. Their framework did not rely on a thorough understanding of mechanics. How could it? The kind of mechanical knowledge that had flourished since antiquity in the East had been lost to Europe following the decline of the western Roman Empire.

    Instead, they talked about what they knew: the hidden powers of Nature, the fundamental sympathies between celestial bodies and earthly things, and the certainty that demons existed and intervened in human affairs. Arthur C Clarke’s dictum that any sufficiently advanced technology is indistinguishable from magic was rarely more apposite. Yet the very blurriness of that boundary made it fertile territory for the medieval Christian mind. In time, the mechanical age might have disenchanted the world – but its eventual victory was much slower than the clock craze might suggest. And in the meantime, there were centuries of magical machines.

    In the medieval Latin world, Nature could – and often did – act predictably. But some phenomena were sufficiently weird and rare that they could not be considered of a piece with the rest of the natural world. They therefore were classified as preternatural: literally, praeter naturalis or ‘beyond nature’.

    What might fall into this category? Pretty much any freak occurrence or deviation from the ordinary course of things: a two-headed sheep, for example. Then again, some phenomena qualified as preternatural because their causes were not readily apparent and were thus difficult to know. Take certain hidden – but essential – characteristics of objects, such as the supposedly fire-retardant skin of the salamander, or the way that certain gems were thought to detect or counteract poison. Magnets were, of course, a clear case of the preternatural at work.

    If the manifestations of the preternatural were various, so were its causes. Nature herself might be responsible – just because she often behaved predictably did not mean that she was required to do so – but so, equally, might demons and angels. People of great ability and learning could use their knowledge, acquired from ancient texts, to predict preternatural events such as eclipses. Or they might harness the secret properties of plants or natural laws to bring about certain desired outcomes. Magic was largely a matter of manipulating this preternatural domain: summoning demons, interpreting the stars, and preparing a physic could all fall under the same capacious heading.

    All of which is to say, there were several possible explanations for the technological marvels that were arriving from the east and south. Robert of Clari, a French knight during the disastrous Fourth Crusade of 1204, described copper statues on the Hippodrome that moved ‘by enchantment’. Several decades later, Franciscan missionaries to the Mongol Empire reported on the lifelike artificial birds at the Khan’s palace and speculated that demons might be the cause (though they didn’t rule out superior engineering as an alternative theory).

    Does a talking statue owe its powers to celestial influence or demonic intervention?

    Moving, speaking statues might also be the result of a particular alignment of planets. While he taught at the cathedral school in Reims, Gerbert of Aurillac, later Pope Sylvester II (999-1003), introduced tools for celestial observation (the armillary sphere and the star sphere) and calculation (the abacus and Arabic numerals) to the educated elites of northern Europe. His reputation for learning was so great that, more than 100 years after his death, he was also credited with making a talking head that foretold the future. According to some accounts, he accomplished this through demonic magic, which he had learnt alongside the legitimate subjects of science and mathematics; according to others, he used his superior knowledge of planetary motion to cast the head at the precise moment of celestial conjunction so that it would reveal the future. (No doubt he did his calculations with an armillary sphere.)

    Because the category of the preternatural encompassed so many objects and phenomena, and because there were competing, rationalised explanations for preternatural things, it could be difficult to discern the correct cause. Does a talking statue owe its powers to celestial influence or demonic intervention? According to one legend, Albert the Great – a 13th-century German theologian, university professor, bishop, and saint – used his knowledge to make a prophetic robot. One of Albert’s brothers in the Dominican Order went to visit him in his cell, knocked on the door, and was told to enter. When the friar went inside he saw that it was not Brother Albert who had answered his knock, but a strange, life-like android. Thinking that the creature must be some kind of demon, the monk promptly destroyed it, only to be scolded for his rashness by a weary and frustrated Albert, who explained that he had been able to create his robot because of a very rare planetary conjunction that happened only once every 30,000 years.

    In legend, fiction and philosophy, writers offered explanations for the moving statues, artificial animals and musical figures that they knew were part of the world beyond Latin Christendom. Like us, they used technology to evoke particular places or cultures. The golden tree with artificial singing birds that confounded Liudprand on his visit to Constantinople appears to have been a fairly common type of automaton: it appears in the palaces of Samarra and Baghdad and, later, in the courts of central India. In the early 13th century, the sultan of Damascus sent a metal tree with mechanical songbirds as a gift to the Holy Roman Emperor Frederick II. But this same object also took root in the Western imagination: we find writers of fiction in medieval Europe including golden trees with eerily lifelike artificial birds in many descriptions of courts in Babylon and India.

    In one romance from the early 13th century, sorcerers use gemstones with hidden powers combined with necromancy to make the birds hop and chirp. In another, from the late 12th century, the king harnesses the winds to make the golden branches sway and the gilt birds sing. There were several different species of birds represented on the king’s fabulous tree, each with its own birdsong, so exact that real birds flocked to the tree in hopes of finding a mate. ‘Thus the blackbirds, skylarks, jaybirds, starlings, nightingales, finches, orioles and others which flocked to the park in high spirits on hearing the beautiful birdsong, were quite unhappy if they did not find their partner!’

    Of course, the Latin West did not retain its innocence of mechanical explanations forever. Three centuries after Gerbert taught his students how to understand the heavens with an armillary sphere, the enthusiasm for mechanical clocks began to sweep northern Europe. These giant timepieces could model the cosmos, chime the hour, predict eclipses and represent the totality of human history, from the fall of humankind in the Garden of Eden to the birth and death of Jesus, and his promised return.

    Astronomical instruments, like astrolabes and armillary spheres, oriented the viewer in the cosmos by showing the phases of the moon, the signs of the zodiac and the movements of the planets. Carillons, programmed with melodies, audibly marked the passage of time. Large moving figures of people, weighted with Christian symbolism, appear as monks, Jesus, the Virgin Mary. They offered a master narrative that fused past, present and future (including salvation). The monumental clocks of the late medieval period employed cutting-edge technology to represent secular and sacred chronology in one single timeline.

    Secular powers were no slower to embrace the new technologies. Like their counterparts in distant capitals, European rulers incorporated mechanical marvels into their courtly pageantry. The day before his official coronation in Westminster Abbey in 1377, Richard II of England was ‘crowned’ by a golden mechanical angel – made by the goldsmiths’ guild – during his coronation pageant in Cheapside.

    And yet, although medieval Europeans had figured out how to build the same kinds of complex automata that people in other places had been designing and constructing for centuries, they did not stop believing in preternatural causes. They merely added ‘mechanical’ to the list of possible explanations. Just as one person’s ecstatic visions might equally be attributed to divine inspiration or diabolical trickery, a talking or moving statue could be ascribed to artisanal or engineering know-how, the science of the stars, or demonic art. Certainly the London goldsmiths in 1377 were in no doubt about how the marvellous angel worked. But because a range of possible causes could animate automata, reactions to them in this late medieval period tended to depend heavily on the perspective of the individual.

    At a coronation feast for the queen at the court of Ferdinand I of Aragon in 1414, theatrical machinery – of the kind used in religious Mystery Plays – was used for part of the entertainment. A mechanical device called a cloud, used for the arrival of any celestial being (gods, angels and the like), swept down from the ceiling. The figure of Death, probably also mechanical, appeared above the audience and claimed a courtier and jester named Borra for his own. Other guests at the feast had been forewarned, but nobody told Borra. A chronicler reported on this marvel with dry exactitude:
    Death threw down a rope, they [fellow guests] tied it around Borra, and Death hanged him. You would not believe the racket that he made, weeping and expressing his terror, and he urinated into his underclothes, and urine fell on the heads of the people below. He was quite convinced he was being carried off to Hell. The king marvelled at this and was greatly amused.

    Such theatrical tricks sound a little gimcrack to us, but if the very stage machinery might partake of uncanny forces, no wonder Borra was afraid.

    Nevertheless, as mechanical technology spread throughout Europe, mechanical explanations of automata (and machines in general) gradually prevailed over magical alternatives. By the end of the 17th century, the realm of the preternatural had largely vanished. Technological marvels were understood to operate within the boundaries of natural laws rather than at the margins of them. Nature went from being a powerful, even capricious entity to an abstract noun denoted with a lower-case ‘n’: predictable, regular, and subject to unvarying law, like the movements of a mechanical clock.

    This new mechanistic world-view prevailed for centuries. But the preternatural lingered, in hidden and surprising ways. In the 19th century, scientists and artists offered a vision of the natural world that was alive with hidden powers and sympathies. Machines such as the galvanometer – to measure electricity – placed scientists in communication with invisible forces. Perhaps the very spark of life was electrical.

Even today, we find traces of belief in the preternatural, though it is found more often in conjunction with natural, rather than artificial, phenomena: the idea that one can balance an egg on end more easily at the vernal equinox, for example, or a belief in ley lines and other Earth mysteries. Yet our ongoing fascination with machines that escape our control or bridge the human-machine divide, played out countless times in books and on screen, suggests that a touch of that old medieval wonder still adheres to the mechanical realm.

    30 March 2015

    Climate change: Embed the social sciences in climate policy (Nature)

    David Victor

    01 April 2015

    David G. Victor calls for the IPCC process to be extended to include insights into controversial social and behavioural issues.

    Illustration by David Parkins

    The Intergovernmental Panel on Climate Change (IPCC) is becoming irrelevant to climate policy. By seeking consensus and avoiding controversy, the organization is suffering from the streetlight effect — focusing ever more attention on a well-lit pool of the brightest climate science. But the insights that matter are out in the darkness, far from the places that the natural sciences alone can illuminate.

With the ink barely dry on the IPCC’s latest reports, scientists and governments are planning reforms for the next big assessment (refs 1, 2). Streamlining the review and writing processes could, indeed, make the IPCC more nimble and relevant. But decisions made at February’s IPCC meeting in Nairobi showed that governments have little appetite for change.

    The basic report-making process and timing will remain intact. Minor adjustments such as greater coverage of cross-cutting topics and more administration may make the IPCC slower. Similar soul searching, disagreement, indecision and trivial procedural tweaks have followed each of the five IPCC assessments over the past 25 years3.

    This time needs to be different. The IPCC must overhaul how it engages with the social sciences in particular (see go.nature.com/vp7zgm). Fields such as sociology, political science and anthropology are central to understanding how people and societies comprehend and respond to environmental changes, and are pivotal in making effective policies to cut emissions and collaborate across the globe.

    The IPCC has engaged only a narrow slice of social-sciences disciplines. Just one branch — economics — has had a major voice in the assessment process. In Working Group III, which assesses climate-change mitigation and policy, nearly two-thirds of 35 coordinating lead authors hailed from the field, and from resource economics in particular. The other social sciences were mostly absent. There was one political scientist: me. Among the few bright spots in that report compared with earlier ones is greater coverage of behavioural economics and risk analysis. In Working Group II, which assesses impacts and adaptation, less than one-third of the 64 coordinating lead authors were social scientists, and about half of those were economists.

    Bringing the broader social sciences into the IPCC will be difficult, but it is achievable with a strategy that reflects how the fields are organized and which policy-relevant questions these disciplines know well. It will require big reforms in the IPCC, and the panel will have to relinquish part of the assessment process to other organizations that are less prone to paralysis in the face of controversy.

    Tunnel vision

    The IPCC walks a wavering line between science, which requires independence, and diplomacy, which demands responsiveness to government preference. Although scientists supply and hone the material for reports, governments have a say in all stages of assessment: they adopt the outline for each chapter, review drafts and approve the final reports.

    “Insights such as which policies work (or fail) in practice are skirted.”

    Such tight oversight creates incentives for scientists to stick to the agreed scope and strip out controversial topics. These pressures are especially acute in the social sciences because governments want to control statements about social behaviour, which implicate policy. This domain covers questions such as which countries will bear the costs of climate change; schemes for allocating the burden of cutting emissions; the design of international agreements; how voters respond to information about climate policy; and whether countries will go to war over climate-related stress. The social sciences can help to provide answers to these questions, key for effective climate policy. In practice, few of these insights are explored much by the IPCC.

    The narrowness of what governments will allow the IPCC to publish is particularly evident in the summary for policy-makers produced at the end of each assessment. Governments approve this document line-by-line with consensus. Disagreements range from those over how to phrase concepts such as a ‘global commons’ that requires collective action to those about whole graphs, which might present data in ways that some governments find inconvenient.

    For example, during the approval of the summary from Working Group III last April, a small group of nations vetoed graphs that showed countries’ emissions grouped according to economic growth. Although this format is good science — economic growth is the main driver of emissions — it is politically toxic because it could imply that some countries that are developing rapidly need to do more to control emissions4.

    Context dependent

The big problem with the IPCC’s output is not the widely levelled charge that it has become too policy prescriptive or is captured by special interests5. Its main affliction is pabulum — a surfeit of bland statements that have no practical value for policy. Abstract, global numbers from stylized, replicable models get approved because they do not implicate any country or action. Insights such as which policies work (or fail) in practice are skirted. Caveats are buried or mangled.

    Readers of the Working Group III summary for policy-makers might learn, for instance, that annual economic growth might decrease by just 0.06 percentage points by 2050 if governments were to adopt policies that cut emissions in line with the widely discussed goal of 2 °C above pre-industrial levels6. They would have to wade through dense tables to realize that only a fraction of the models say that the goal is achievable, and through the main report to learn that the small cost arises only under simplified assumptions that are far from messy reality.
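
To see what that headline number implies, here is a rough back-of-the-envelope sketch (not from the report; the 2% baseline growth rate is an assumed, illustrative figure): compounding a 0.06-percentage-point-per-year reduction from 2015 to 2050 leaves consumption only a couple of percent below where it would otherwise have been.

```python
# Illustrative only: what a 0.06-percentage-point cut in annual growth
# compounds to by 2050, under an assumed 2% baseline growth rate.
baseline_rate = 0.02                      # assumption, not from the report
mitigated_rate = baseline_rate - 0.0006   # 0.06 percentage points lower
years = 2050 - 2015

baseline_level = (1 + baseline_rate) ** years
mitigated_level = (1 + mitigated_rate) ** years

gap = 1 - mitigated_level / baseline_level
print(f"Consumption in 2050 ends up about {gap:.1%} below the baseline path.")
# Prints roughly 2.0% -- which is why the summary's buried caveats matter so much.
```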

[Figure omitted; source: Ref. 6]

    That said, the social sciences are equally culpable. Because societies are complex and are in many ways harder to study than cells in a petri dish, the intellectual paradigms across most of the social sciences are weak. Beyond a few exceptions — such as mainstream economics — the major debates in social science are between paradigms rather than within them.

    Consider the role of international law. Some social scientists treat law like a contract; others believe that it works mainly through social pressures. The first set would advise policy-makers to word climate deals precisely — to include targets and timetables for emissions cuts — and to apply mechanisms to ensure that countries honour their agreements. The second group would favour bold legal norms with clear focal points — striving for zero net emissions, for example7. Each approach could be useful in the right context.

Multiple competing paradigms make it hard to organize social-science knowledge or to determine which questions and methods are legitimate. Moreover, the incentives within the social sciences discourage focusing on particular substantive topics such as climate change — especially when they require interdisciplinary collaboration. In political science, for example, research on political mobilization, administrative control and international cooperation, among other specialities, is relevant. Yet no leading political-science department has a tenured professor who works mainly on climate change8.

    The paradigm problem need not be paralysing. Social scientists should articulate why different intellectual perspectives and contexts lead to different conclusions. Leading researchers in each area can map out disagreement points and their relevance.

    Climate scientists and policy-makers should talk more about how disputes are rooted in different values and assumptions — such as about whether government institutions are capable of directing mitigation. Such disputes help to explain why there are so many disagreements in climate policy, even in areas in which the facts seem clear9.

    Unfortunately, the current IPCC report structure discourages that kind of candour about assumptions, values and paradigms. It focuses on known knowns and known unknowns rather than on deeper and wider uncertainties. The bias is revealed in how the organization uses official language to describe findings — half of the statements in the Working Group III summary were given a ‘high confidence’ rating (see ‘Confidence bias’).

    Wider vista

    Building the social sciences into the IPCC and the climate-change debate more generally is feasible over the next assessment cycle, which starts in October and runs to 2022, with efforts on the following three fronts.

    First, the IPCC must ask questions that social scientists can answer. If the panel looks to the social-sciences literature on climate change, it will find little. But if it engages the fields on their own terms it will find a wealth of relevant knowledge — for example, about how societies organize, how individuals and groups perceive threats and respond to catastrophic stresses, and how collective action works best.

    Dieter Telemans/Panos

    The solar-powered Barefoot College in Rajasthan, India, trains rural villagers in how to install, build and repair solar technologies.

    As soon as the new IPCC leadership is chosen later this year, the team should invite major social-sciences societies such as the American Political Science Association, the American and European societies of international law, the American Sociological Association and the Society for Risk Analysis to propose relevant topics that they can assess and questions they can answer. Multidisciplinary scientific organizations in diverse countries — such as the Royal Society in London and the Third World Academy of Sciences — would round out the picture, because social-sciences societies tend to be national and heavily US-based.

    These questions should guide how the IPCC scopes its next reports. The agency should also ask such societies to organize what they know about climate by discipline — how sociology examines issues related to the topic, for example — and feed that into the assessment.

    Second, the IPCC must become a more attractive place for social-science and humanities scholars who are not usually involved in the climate field and might find IPCC involvement daunting. The IPCC process is dominated by insiders who move from assessment to assessment and are tolerant of the crushing rounds of review and layers of oversight that consume hundreds of hours and require travel to the corners of the globe. Practically nothing else in science service has such a high ratio of input to output. The IPCC must use volunteers’ time more efficiently.

    Third, all parties must recognize that a consensus process cannot handle controversial topics such as how best to design international agreements or how to govern the use of geoengineering technologies. For these, a parallel process will be needed to address the most controversial policy-relevant questions.

    This supporting process should begin with a small list of the most important questions that the IPCC cannot handle on its own. A network of science academies or foundations sympathetic to the UN’s mission could organize short reports — drawing from IPCC assessments and other literature — and manage a review process that is truly independent of government meddling. Oversight from prominent social scientists, including those drawn from the IPCC process, could give the effort credibility as well as the right links to the IPCC itself.

    The list of topics to cover in this parallel mechanism includes how to group countries in international agreements — beyond the crude kettling adopted in 1992 that split the world into industrialized nations and the rest. The list also includes which kinds of policies have had the biggest impact on emissions, and how different concepts of justice and ethics could guide new international agreements that balance the burdens of mitigation and adaptation. There will also need to be a sober re-assessment of policy goals when it becomes clear that stopping warming at 2 °C is no longer feasible10.

    The IPCC has proved to be important — it is the most legitimate body that assesses the climate-related sciences. But it is too narrow and must not monopolize climate assessment. Helping the organization to reform itself while moving contentious work into other forums is long overdue.

    Nature 520, 27–29 (02 April 2015), doi:10.1038/520027a

    References

    1. IPCC. Future Work of the IPCC: Chairman’s Vision Paper on the Future of the IPCC (IPCC, 2015).
    2. IPCC. Future Work of the IPCC: Consideration of the Recommendations by the Task Group on Future Work of the IPCC (IPCC, 2015).
3. Committee to Review the Intergovernmental Panel on Climate Change. Climate Change Assessments: Review of the Processes and Procedures of the IPCC (InterAcademy Council, 2010).
4. Victor, D. G., Gerlagh, R. & Baiocchi, G. Science 345, 34–36 (2014).
5. Hulme, M. et al. Nature 463, 730–732 (2010).
6. IPCC. Summary for Policymakers. In Climate Change 2014: Mitigation of Climate Change. Contribution of Working Group III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (eds Edenhofer, O. et al.) (Cambridge Univ. Press, 2014).
7. Hafner-Burton, E. M., Victor, D. G. & Lupu, Y. Am. J. Intl Law 106, 47–97 (2012).
8. Keohane, R. O. PS: Political Sci. & Politics 48, 19–26 (2015).
    9. Hulme, M. Why We Disagree About Climate Change: Understanding Controversy, Inaction and Opportunity (Cambridge Univ. Press, 2009).
10. Victor, D. G. & Kennel, C. F. Nature 514, 30–31 (2014).

    Anthropocene: The human age (Nature)

    Momentum is building to establish a new geological epoch that recognizes humanity’s impact on the planet. But there is fierce debate behind the scenes.

    Richard Monastersky

    11 March 2015

    Illustration by Jessica Fortner

    Almost all the dinosaurs have vanished from the National Museum of Natural History in Washington DC. The fossil hall is now mostly empty and painted in deep shadows as palaeobiologist Scott Wing wanders through the cavernous room.

    Wing is part of a team carrying out a radical, US$45-million redesign of the exhibition space, which is part of the Smithsonian Institution. And when it opens again in 2019, the hall will do more than revisit Earth’s distant past. Alongside the typical displays of Tyrannosaurus rex and Triceratops, there will be a new section that forces visitors to consider the species that is currently dominating the planet.

    “We want to help people imagine their role in the world, which is maybe more important than many of them realize,” says Wing.

    This provocative exhibit will focus on the Anthropocene — the slice of Earth’s history during which people have become a major geological force. Through mining activities alone, humans move more sediment than all the world’s rivers combined. Homo sapiens has also warmed the planet, raised sea levels, eroded the ozone layer and acidified the oceans.

    Given the magnitude of these changes, many researchers propose that the Anthropocene represents a new division of geological time. The concept has gained traction, especially in the past few years — and not just among geoscientists. The word has been invoked by archaeologists, historians and even gender-studies researchers; several museums around the world have exhibited art inspired by the Anthropocene; and the media have heartily adopted the idea. “Welcome to the Anthropocene,” The Economist announced in 2011.

    The greeting was a tad premature. Although the term is trending, the Anthropocene is still an amorphous notion — an unofficial name that has yet to be accepted as part of the geological timescale. That may change soon. A committee of researchers is currently hashing out whether to codify the Anthropocene as a formal geological unit, and when to define its starting point.

    But critics worry that important arguments against the proposal have been drowned out by popular enthusiasm, driven in part by environmentally minded researchers who want to highlight how destructive humans have become. Some supporters of the Anthropocene idea have even been likened to zealots. “There’s a similarity to certain religious groups who are extremely keen on their religion — to the extent that they think everybody who doesn’t practise their religion is some kind of barbarian,” says one geologist who asked not to be named.

    The debate has shone a spotlight on the typically unnoticed process by which geologists carve up Earth’s 4.5 billion years of history. Normally, decisions about the geological timescale are made solely on the basis of stratigraphy — the evidence contained in layers of rock, ocean sediments, ice cores and other geological deposits. But the issue of the Anthropocene “is an order of magnitude more complicated than the stratigraphy”, says Jan Zalasiewicz, a geologist at the University of Leicester, UK, and the chair of the Anthropocene Working Group that is evaluating the issue for the International Commission on Stratigraphy (ICS).

    Written in stone

    For geoscientists, the timescale of Earth’s history rivals the periodic table in terms of scientific importance. It has taken centuries of painstaking stratigraphic work — matching up major rock units around the world and placing them in order of formation — to provide an organizing scaffold that supports all studies of the planet’s past. “The geologic timescale, in my view, is one of the great achievements of humanity,” says Michael Walker, a Quaternary scientist at the University of Wales Trinity St David in Lampeter, UK.

    Walker’s work sits at the top of the timescale. He led a group that helped to define the most recent unit of geological time, the Holocene epoch, which began about 11,700 years ago.

[Graphic omitted. Sources: Dams/Water/Fertilizer, IGBP; Fallout, Ref. 5; Map, E. C. Ellis Phil. Trans. R. Soc. A 369, 1010–1035 (2011); Methane, Ref. 4]

    The decision to formalize the Holocene in 2008 was one of the most recent major actions by the ICS, which oversees the timescale. The commission has segmented Earth’s history into a series of nested blocks, much like the years, months and days of a calendar. In geological time, the 66 million years since the death of the dinosaurs is known as the Cenozoic era. Within that, the Quaternary period occupies the past 2.58 million years — during which Earth has cycled in and out of a few dozen ice ages. The vast bulk of the Quaternary consists of the Pleistocene epoch, with the Holocene occupying the thin sliver of time since the end of the last ice age.
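
As a rough aid to the nesting described above, the hierarchy can be written out like boxes within boxes (illustrative only; the ages are the approximate "years before present" figures quoted in this article):

```python
# Illustrative sketch of the nested geological time blocks described above.
geological_time = {
    "Cenozoic era": {                        # since the death of the dinosaurs
        "starts_years_ago": 66_000_000,
        "Quaternary period": {
            "starts_years_ago": 2_580_000,
            "Pleistocene epoch": {"starts_years_ago": 2_580_000, "ends_years_ago": 11_700},
            "Holocene epoch": {"starts_years_ago": 11_700, "ends_years_ago": 0},
        },
    },
}
```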

    When Walker and his group defined the beginning of the Holocene, they had to pick a spot on the planet that had a signal to mark that boundary. Most geological units are identified by a specific change recorded in rocks — often the first appearance of a ubiquitous fossil. But the Holocene is so young, geologically speaking, that it permits an unusual level of precision. Walker and his colleagues selected a climatic change — the end of the last ice age’s final cold snap — and identified a chemical signature of that warming at a depth of 1,492.45 metres in a core of ice drilled near the centre of Greenland1. A similar fingerprint of warming can be seen in lake and marine sediments around the world, allowing geologists to precisely identify the start of the Holocene elsewhere.

    “The geologic timescale, in my view, is one of the great achievements of humanity.”

Even as the ICS was finalizing its decision on the start of the Holocene, discussion was already building about whether it was time to end that epoch and replace it with the Anthropocene. This idea has a long history. In the mid-nineteenth century, several geologists sought to recognize the growing power of humankind by referring to the present as the ‘anthropozoic era’, and others have since made similar proposals, sometimes with different names. The idea has gained traction only in the past few years, however, in part because of rapid changes in the environment, as well as the influence of Paul Crutzen, a chemist at the Max Planck Institute for Chemistry in Mainz, Germany.

    Crutzen has first-hand experience of how human actions are altering the planet. In the 1970s and 1980s, he made major discoveries about the ozone layer and how pollution from humans could damage it — work that eventually earned him a share of a Nobel prize. In 2000, he and Eugene Stoermer of the University of Michigan in Ann Arbor argued that the global population has gained so much influence over planetary processes that the current geological epoch should be called the Anthropocene2. As an atmospheric chemist, Crutzen was not part of the community that adjudicates changes to the geological timescale. But the idea inspired many geologists, particularly Zalasiewicz and other members of the Geological Society of London. In 2008, they wrote a position paper urging their community to consider the idea3.

    Those authors had the power to make things happen. Zalasiewicz happened to be a member of the Quaternary subcommission of the ICS, the body that would be responsible for officially considering the suggestion. One of his co-authors, geologist Phil Gibbard of the University of Cambridge, UK, chaired the subcommission at the time.

    Although sceptical of the idea, Gibbard says, “I could see it was important, something we should not be turning our backs on.” The next year, he tasked Zalasiewicz with forming the Anthropocene Working Group to look into the matter.

    A new beginning

    Since then, the working group has been busy. It has published two large reports (“They would each hurt you if they dropped on your toe,” says Zalasiewicz) and dozens of other papers.

    The group has several issues to tackle: whether it makes sense to establish the Anthropocene as a formal part of the geological timescale; when to start it; and what status it should have in the hierarchy of the geological time — if it is adopted.

    When Crutzen proposed the term Anthropocene, he gave it the suffix appropriate for an epoch and argued for a starting date in the late eighteenth century, at the beginning of the Industrial Revolution. Between then and the start of the new millennium, he noted, humans had chewed a hole in the ozone layer over Antarctica, doubled the amount of methane in the atmosphere and driven up carbon dioxide concentrations by 30%, to a level not seen in 400,000 years.

When the Anthropocene Working Group started investigating, it compiled a much longer list of the changes wrought by humans. Agriculture, construction and the damming of rivers are stripping away sediment at least ten times as fast as the natural forces of erosion. Along some coastlines, the flood of nutrients from fertilizers has created oxygen-poor ‘dead zones’, and the extra CO2 from fossil-fuel burning has acidified the surface waters of the ocean by 0.1 pH units. The fingerprint of humans is clear in global temperatures, the rate of species extinctions and the loss of Arctic ice.

    The group, which includes Crutzen, initially leaned towards his idea of choosing the Industrial Revolution as the beginning of the Anthropocene. But other options were on the table.

    Some researchers have argued for a starting time that coincides with an expansion of agriculture and livestock cultivation more than 5,000 years ago4, or a surge in mining more than 3,000 years ago (see ‘Humans at the helm’). But neither the Industrial Revolution nor those earlier changes have left unambiguous geological signals of human activity that are synchronous around the globe (see ‘Landscape architecture’).

    This week in Nature, two researchers propose that a potential marker for the start of the Anthropocene could be a noticeable drop in atmospheric CO2 concentrations between 1570 and 1620, which is recorded in ice cores (see page 171). They link this change to the deaths of some 50 million indigenous people in the Americas, triggered by the arrival of Europeans. In the aftermath, forests took over 65 million hectares of abandoned agricultural fields — a surge of regrowth that reduced global CO2.

Landscape architecture

[Graphic omitted: a model of land use, based on human-population estimates, suggesting that people modified substantial parts of the continents even thousands of years ago. Panels map land used intensively by humans at 8,000 years before present, 1,000 years before present, and the present. Source: E. C. Ellis Phil. Trans. R. Soc. A 369, 1010–1035 (2011).]

    In the working group, Zalasiewicz and others have been talking increasingly about another option — using the geological marks left by the atomic age. Between 1945 and 1963, when the Limited Nuclear Test Ban Treaty took effect, nations conducted some 500 above-ground nuclear blasts. Debris from those explosions circled the globe and created an identifiable layer of radioactive elements in sediments. At the same time, humans were making geological impressions in a number of other ways — all part of what has been called the Great Acceleration of the modern world. Plastics started flooding the environment, along with aluminium, artificial fertilizers, concrete and leaded petrol, all of which have left signals in the sedimentary record.

    In January, the majority of the 37-person working group offered its first tentative conclusion. Zalasiewicz and 25 other members reported5 that the geological markers available from the mid-twentieth century make this time “stratigraphically optimal” for picking the start of the Anthropocene, whether or not it is formally defined. Zalasiewicz calls it “a candidate for the least-worst boundary”.

    The group even proposed a precise date: 16 July 1945, the day of the first atomic-bomb blast. Geologists thousands of years in the future would be able to identify the boundary by looking in the sediments for the signature of long-lived plutonium from mid-century bomb blasts or many of the other global markers from that time.

    A many-layered debate

    The push to formalize the Anthropocene upsets some stratigraphers. In 2012, a commentary published by the Geological Society of America6 asked: “Is the Anthropocene an issue of stratigraphy or pop culture?” Some complain that the working group has generated a stream of publicity in support of the concept. “I’m frustrated because any time they do anything, there are newspaper articles,” says Stan Finney, a stratigraphic palaeontologist at California State University in Long Beach and the chair of the ICS, which would eventually vote on any proposal put forward by the working group. “What you see here is, it’s become a political statement. That’s what so many people want.”

    Finney laid out some of his concerns in a paper7 published in 2013. One major question is whether there really are significant records of the Anthropocene in global stratigraphy. In the deep sea, he notes, the layer of sediments representing the past 70 years would be thinner than 1 millimetre. An even larger issue, he says, is whether it is appropriate to name something that exists mainly in the present and the future as part of the geological timescale.

    “It’s become a political statement. That’s what so many people want.”

    Some researchers argue that it is too soon to make a decision — it will take centuries or longer to know what lasting impact humans are having on the planet. One member of the working group, Erle Ellis, a geographer at the University of Maryland, Baltimore County, says that he raised the idea of holding off with fellow members of the group. “We should set a time, perhaps 1,000 years from now, in which we would officially investigate this,” he says. “Making a decision before that would be premature.”

    That does not seem likely, given that the working group plans to present initial recommendations by 2016.

    Some members with different views from the majority have dropped out of the discussion. Walker and others contend that human activities have already been recognized in the geological timescale: the only difference between the current warm period, the Holocene, and all the interglacial times during the Pleistocene is the presence of human societies in the modern one. “You’ve played the human card in defining the Holocene. It’s very difficult to play the human card again,” he says.

    Walker resigned from the group a year ago, when it became clear that he had little to add. He has nothing but respect for its members, he says, but he has heard concern that the Anthropocene movement is picking up speed. “There’s a sense in some quarters that this is something of a juggernaut,” he says. “Within the geologic community, particularly within the stratigraphic community, there is a sense of disquiet.”

Zalasiewicz takes pains to make it clear that the working group has not yet reached any firm conclusions. “We need to discuss the utility of the Anthropocene. If one is to formalize it, who would that help, and to whom it might be a nuisance?” he says. “There is lots of work still to do.”

    Any proposal that the group did make would still need to pass a series of hurdles. First, it would need to receive a supermajority — 60% support — in a vote by members of the Quaternary subcommission. Then it would need to reach the same margin in a second vote by the leadership of the full ICS, which includes chairs from groups that study the major time blocks. Finally, the executive committee of the International Union of Geological Sciences must approve the request.

    At each step, proposals are often sent back for revision, and they sometimes die altogether. It is an inherently conservative process, says Martin Head, a marine stratigrapher at Brock University in St Catharines, Canada, and the current head of the Quaternary subcommission. “You are messing around with a timescale that is used by millions of people around the world. So if you’re making changes, they have to be made on the basis of something for which there is overwhelming support.”

    Some voting members of the Quaternary subcommission have told Nature that they have not been persuaded by the arguments raised so far in favour of the Anthropocene. Gibbard, a friend of Zalasiewicz’s, says that defining this new epoch will not help most Quaternary geologists, especially those working in the Holocene, because they tend not to study material from the past few decades or centuries. But, he adds: “I don’t want to be the person who ruins the party, because a lot of useful stuff is coming out as a consequence of people thinking about this in a systematic way.”

    If a proposal does not pass, researchers could continue to use the name Anthropocene on an informal basis, in much the same way as archaeological terms such as the Neolithic era and the Bronze Age are used today. Regardless of the outcome, the Anthropocene has already taken on a life of its own. Three Anthropocene journals have started up in the past two years, and the number of papers on the topic is rising sharply, with more than 200 published in 2014.

    By 2019, when the new fossil hall opens at the Smithsonian’s natural history museum, it will probably be clear whether the Anthropocene exhibition depicts an official time unit or not. Wing, a member of the working group, says that he does not want the stratigraphic debate to overshadow the bigger issues. “There is certainly a broader point about human effects on Earth systems, which is way more important and also more scientifically interesting.”

    As he walks through the closed palaeontology hall, he points out how much work has yet to be done to refashion the exhibits and modernize the museum, which opened more than a century ago. A hundred years is a heartbeat to a geologist. But in that span, the human population has more than tripled. Wing wants museum visitors to think, however briefly, about the planetary power that people now wield, and how that fits into the context of Earth’s history. “If you look back from 10 million years in the future,” he says, “you’ll be able to see what we were doing today.”

    Nature 519, 144–147 (12 March 2015), doi:10.1038/519144a

    References

1. Walker, M. et al. J. Quat. Sci. 24, 3–17 (2009).
2. Crutzen, P. J. & Stoermer, E. F. IGBP Newsletter 41, 17–18 (2000).
3. Zalasiewicz, J. et al. GSA Today 18(2), 4–8 (2008).
4. Ruddiman, W. F. Ann. Rev. Earth Planet. Sci. 41, 45–68 (2013).
5. Zalasiewicz, J. et al. Quatern. Int. http://dx.doi.org/10.1016/j.quaint.2014.11.045 (2015).
6. Autin, W. J. & Holbrook, J. M. GSA Today 22(7), 60–61 (2012).
7. Finney, S. C. Geol. Soc. Spec. Publ. 395, 23–28 (2013).

    Sociology & Its Discontents (Synthetic Zero)

     

“Does the discipline of Sociology still have a role to play in the 21st century? To examine where we are at with Sociology in 2015, Philip Dodd is joined by three leading practitioners, the LSE’s Richard Sennett, Frank Furedi from the University of Kent, and Monika Krause at Goldsmiths, as well as the journalist and author, Peter Oborne”

    AUDIO

    I think we can safely leave sociology to the last century without any meaningful loss to our abilities to understand and reform as needed, anyone disagree?


    Welcome to Global Warming’s Terrifying New Era (Slate)

    By Eric Holthaus

    19 March 2015


    Storm damage in Port Vila, Vanuatu. Photo by UNICEF via Getty Images

On Wednesday, the National Oceanic and Atmospheric Administration announced that Earth’s global temperature for February was among the hottest ever measured. So far, 2015 is tracking above record-warm 2014—which, when combined with the newly resurgent El Niño, means we’re on pace for another hottest year in history.

    In addition to the just-completed warmest winter on record globally (despite the brutal cold and record snow in the eastern U.S.), new data on Thursday from the National Snow and Ice Data Center show that this year’s peak Arctic sea ice reached its lowest ever maximum extent, thanks to “an unusual configuration of the jet stream” that greatly warmed the Pacific Ocean near Alaska.

    But here’s the most upsetting news. It’s been exactly 30 years since the last time the world was briefly cooler than its 20th-century average. Every single month since February 1985 has been hotter than the long-term average—that’s 360 consecutive months.

    More than just being a round number, the 30-year streak has deeper significance. In climatology, a continuous 30-year stretch of data is traditionally what’s used to define what’s “normal” for a given location. In a very real way, we can now say that for our given location—the planet Earth—global warming is now “normal.” Forget debating—our climate has officially changed.
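
For readers who want to check a claim like this against the published record, here is a minimal sketch (illustrative only; it assumes you have already parsed NOAA's monthly global anomaly series, relative to the 20th-century average, into a chronologically ordered list):

```python
# Minimal sketch: length of the current run of months warmer than the
# 20th-century average, given a chronological list of monthly anomalies.
def warm_streak(anomalies):
    streak = 0
    for value in reversed(anomalies):   # walk backwards from the latest month
        if value > 0.0:                 # above the 20th-century baseline
            streak += 1
        else:
            break
    return streak

# Toy input for illustration; with the real NOAA series through early 2015,
# the article's claim corresponds to a streak of 360 months (30 years).
print(warm_streak([-0.05, 0.12, 0.20, 0.31]))   # -> 3
```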

    This 30-year streak should change the way we think and talk about this issue. We’ve entered a new era in which global warming is a defining characteristic and a fundamental driver of what it means to be an inhabitant of planet Earth. We should treat it that way. For those who care about the climate, that may mean de-emphasizing statistics and science and beginning to talk more confidently about the moral implications of continuing on our current path.

    Since disasters disproportionately impact the poor, climate change is increasingly an important economic and social justice issue. The pope will visit the United States later this year as part of a broader campaign by the Vatican to directly influence the outcome of this year’s global climate negotiations in Paris—recent polling data show his message may be resonating, especially with political conservatives and nonscience types. Two-thirds of Americans now believe that world leaders are morally obligated to take steps to reduce carbon.

    Scientists and journalists have debated the connection between extreme weather and global warming for years, but what’s happening now is different. Since weather impacts virtually every facet of our lives (at least in a small way), and since climate change is affecting weather at every point in the globe every day (at least in a small way), that makes it at the same time incredibly difficult to study and incredibly important. Formal attribution studies that attempt to scientifically tease out whether global warming “caused” individual events are shortsighted and miss the point. It’s time for a change in tack. The better question to ask is: How do we as a civilization collectively tackle the weather extremes we already face?

    In the aftermath of the nearly unprecedented power and destructive force of Cyclone Pam’s landfall in the remote Pacific island nation of Vanuatu—where survivors were forced to drink saltwater—emerges perhaps the best recent example I’ve seen of a government acknowledging this changed climate in a scientifically sound way:

    Cyclone Pam is a consequence of climate change since all weather is affected by the planet’s now considerably warmer climate. The spate of extreme storms over the past decade—of which Pam is the latest—is entirely consistent in science with the hottest ever decade on record.

    The statement was from the government of the Philippines, the previous country to suffer a direct strike by a Category 5 cyclone—Haiyan in 2013. As chair of the Climate Vulnerable Forum negotiating bloc, the Philippines also called for a strengthening of ambition in the run-up to this year’s global climate agreement in Paris.

    The cost of disasters of all types is rising around the globe as population and wealth increase and storms become more fierce. This week in Japan, 187 countries agreed on a comprehensive plan to reduce loss of life from disasters as well as their financial impact. However, the disaster deal is nonbinding and won’t provide support to the most vulnerable countries.

    Combining weather statistics and photos of devastated tropical islands with discussions of political and economic winners and losers is increasingly necessary as climate change enters a new era. We’re no longer describing the problem. We’re telling the story of how humanity reacts to this new normal.

    As the Guardian’s Alan Rusbridger, in an editorial kickoff of his newspaper’s newly heightened focus on climate, said, “the mainstream argument has moved on.” What’s coming next isn’t certain, but it’s likely to be much more visceral and real than steadily upward sloping lines on a graph.

    Received wisdom about mental illness challenged by new report (Science Daily)

    Date: March 11, 2015

    Source: British Psychological Society

    Summary: A new report challenges received wisdom about the nature of mental illness and has led to widespread media coverage and debate in the UK. Many people believe that schizophrenia is a frightening brain disease that makes people unpredictable and potentially violent, and can only be controlled by medication. However the UK has been at the forefront of research into the psychology of psychosis conducted over the last twenty years, and which reveals that this view is false.


    21st March 2015 will see the US launch of the British Psychological Society’s Division of Clinical Psychology’s ground-breaking report ‘Understanding Psychosis and Schizophrenia’.

    The report, which will be launched at 9am at the Cooper Union, Manhattan, NYC by invitation of the International Society for Psychological and Social approaches to Psychosis (ISPS), challenges received wisdom about the nature of mental illness and has led to widespread media coverage and debate in the UK.

    Many people believe that schizophrenia is a frightening brain disease that makes people unpredictable and potentially violent, and can only be controlled by medication. However the UK has been at the forefront of research into the psychology of psychosis conducted over the last twenty years, and which reveals that this view is false.

    Rather:

    • The problems we think of as ‘psychosis’ — hearing voices, believing things that others find strange, or appearing out of touch with reality — can be understood in the same way as other psychological problems such as anxiety or shyness.
    • They are often a reaction to trauma or adversity of some kind which impacts on the way we experience and interpret the world.
    • They rarely lead to violence.
    • No-one can tell for sure what has caused a particular person’s problems. The only way is to sit down with them and try and work it out.
    • Services should not insist that people see themselves as ill. Some prefer to think of their problems as, for example, an aspect of their personality which sometimes gets them into trouble but which they would not want to be without.
    • We need to invest much more in prevention by attending to inequality and child maltreatment.

    Concentrating resources only on treating existing problems is like mopping the floor while the tap is still running.

The report is entitled ‘Understanding psychosis and schizophrenia: why people sometimes hear voices, believe things that others find strange, or appear out of touch with reality, and what can help’. It has been written by a group of eminent clinical psychologists drawn from eight UK universities and the UK National Health Service, together with people who have themselves experienced psychosis. It provides an accessible overview of the current state of knowledge, and its conclusions have profound implications both for the way we understand ‘mental illness’ and for the future of mental health services.

    The report’s editor, Consultant Clinical Psychologist Anne Cooke from the Salomons Centre for Applied Psychology, Canterbury Christ Church University, said: “The finding that psychosis can be understood and treated in the same way as other psychological problems such as anxiety is one of the most important of recent years, and services need to change accordingly.

    In the past we have often seen drugs as the most important form of treatment. Whilst they have a place, we now need to concentrate on helping each person to make sense of their experiences and find the support that works for them. My dream is that our report will contribute to a sea change in attitudes so that rather than facing prejudice, fear and discrimination, people who experience psychosis will find those around them accepting, open-minded and willing to help.”

    Dr Geraldine Strathdee, NHS England’s National Clinical Director for Mental Health, said: “I am a passionate advocate of supporting people to develop an understanding of the events and difficulties that led them to mental health services.

    That is the first step to getting back in control, and this important report will be a vital resource both for them and for those of us who design and deliver services. The British Psychological Society are a great force for change right at the grass roots of frontline services, in both acute care and long term conditions, and are at the forefront of innovations that integrate physical and psychological care in primary care, community and acute hospital settings.”

    Rt Hon Norman Lamb, UK Minister of State for Care and Support, said: “I strongly welcome the publication of this report. The Government is committed to the provision of psychological therapies, and has recently announced that, for the first time, maximum waiting times will be introduced for NHS mental health services, including for Early Intervention in Psychosis.

    We have also committed substantial resources to support the provision of psychological care for people with a range of mental health problems, including psychosis. I am delighted, therefore, to add my voice in recommending this report, which explains in everyday language the psychological science of why people sometimes hear voices, believe things other people find strange, or appear out of touch with reality. I am particularly pleased that it is the product of a partnership between expert psychologists in universities and NHS Trusts, and experts by experience — people who have themselves experienced psychosis. It helps us to understand such experiences better, to empathise with those who are distressed by them and to appreciate why the Government has made the psychological care of mental health problems a priority.”

    Professor Jamie Hacker-Hughes, President Elect of the British Psychological Society, said: “This report will be remembered as a milestone in psychological health.”

    Jacqui Dillon, Chair of the UK Hearing Voices Network, said: “This report is an example of the amazing things that are possible when professionals and people with personal experience work together. Both the report’s content and the collaborative process by which it has been written are wonderful examples of the importance and power of moving beyond ‘them and us’ thinking in mental health.”

    Beth Murphy, Head of Information at the UK mental health charity Mind, said: “We welcome this report, which highlights the range of ways in which we can understand experiences such as hearing voices. Any one of us can experience problems with our mental health, whether we are diagnosed or not.

    People describe and relate to their own experiences in very different ways and it’s important that services can accommodate the complex and varied range of experiences that people have. This can only be done by offering the widest possible range of treatments and therapies and by treating the person as whole, rather than as a set of symptoms.”

    How science is seen in São Paulo (Fapesp)

    March 16, 2015

    Agência FAPESP – A survey conducted by Datafolha found that scientist is the third most admired profession among the general public (61%), after teacher (77%) and physician (70%). Another highlight: although 88% consider investment in science and technology very important, 70% find the country’s current investment in the sector insufficient, and 86% believe the government should fund scientific research even when it brings no immediate benefits.

    Among researchers, better financial resources and credibility are considered the main factors in choosing FAPESP as the funding agency for their studies.

    The figures come from Datafolha surveys of three groups in the State of São Paulo: the general population, scientists, and opinion leaders.

    The general-population survey was conducted in 138 cities in the State of São Paulo. A total of 3,217 interviews were carried out with men and women aged 16 or older, from all social classes. The quantitative survey used face-to-face interviews based on a structured questionnaire lasting about 25 minutes.

    Of the respondents, 63% said they had some interest in science and technology and 26% a strong interest. The share with a strong interest in “Science and Technology” (26%) was higher than for “Economy and Business” (24%), “Fashion” (14%), “Politics” (12%) and “Celebrity news” (7%). The topics of greatest interest were “Medicine and Health” (51%), “Food and Consumption” (45%), “Environment and Ecology” (39%), “Religion” (38%), “Sports” (32%) and “Film, Art and Culture” (30%).

    Respondents said they get frequent information about science and technology mainly from TV (31%), the internet (24%) and conversations with friends (21%), followed by newspapers (18%) and magazines (10%).

    For 39%, scientific research in the country lags behind, and 51% agreed with the statement that, when making decisions, politicians should give more weight to scientific evidence than to public opinion.

    For FAPESP’s president, Celso Lafer, “the Datafolha survey shows the importance the population attributes to science and the respect it has for scientists. Second, it makes clear the perception that it falls to the State to support scientific research, even when it may not bring immediate benefits, and that the private sector can also increase its investment in the area,” he said.

    While the population values science and scientific activity, the survey reveals that its unfamiliarity with research institutions is considerable: according to the Datafolha data, 77% could not name a single institution in the sector, not even a university. When presented with names of institutions, 26% said they had heard of FAPESP, but of these, 65% could not say what the Foundation does.

    Scientific and technological knowledge was considered “very useful”, mainly for “taking care of one’s health and preventing disease” (70%), “understanding the world” (51%) and “preserving the surroundings of my home and the environment” (47%).

    “The high priority the population gives to supporting research and the value it places on the scientific profession echo the sentiment seen in other countries and encourage São Paulo’s scientific community to pursue ever more and better results of scientific, social and economic impact. The survey also highlights the need for institutions to work harder at demonstrating their results and associating their names with them,” said Carlos Henrique de Brito Cruz, FAPESP’s scientific director.

    The researchers’ view

    The Datafolha survey of FAPESP-funded researchers was based on 505 interviews with men and women in the State of São Paulo.

    The government was cited as the country’s main funder of scientific research, and respondents argued that companies should increase their investment. For 67% of respondents the country is “intermediate” in scientific research and, for 80%, its investment is insufficient.

    “Better financial resources” and “credibility” are the main factors in choosing FAPESP, according to the survey.

    “The public most directly involved recognizes FAPESP’s contribution and highlights its credibility. In short, the data confirm São Paulo taxpayers’ support for FAPESP’s activities,” said Lafer.

    Practically all respondents (99%) believe that scientific research contributes to the country’s growth and defend the independence of scientists.

    Of the respondents, 60% considered that the country stands out strongly in agriculture and livestock farming, while only 6% think it stands out strongly in technology development.

    As for satisfaction with scientific development in their own field, 55% said they were satisfied and 44% dissatisfied; 1% did not answer. Of those satisfied, 31% cited “international recognition or prominence” as the main reason and 29% “advances and development in their research area”.

    Most consider the scientific profession unattractive to young people because of low salaries and limited prestige, and 58% consider a vocation for knowledge to be scientists’ main motivation.

    FAPESP supports the researchers interviewed through PhD fellowships (36%), postdoctoral fellowships (30%), Regular Research Grants (26%), master’s fellowships (26%), undergraduate research fellowships (22%), Thematic Project grants (5%), the Innovative Research in Small Businesses program, PIPE (3%), the Young Investigator program (2%) and others (6%).

    In all, 85% had received research support from another institution, mainly the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq).

    Opinion leaders

    Datafolha also surveyed opinion leaders. Thirty in-depth interviews were conducted: 15 with journalists and 15 with secondary-school teachers from public and private schools, all in the State of São Paulo.

    The study found that both journalists and teachers habitually seek out information on science and technology, especially on the internet. While teachers tend to read publications specific to their own field, journalists read a variety of media outlets.

    Most agree that the language of articles about science and technology is easier to follow nowadays, as is access to scientific information. According to the respondents, science teaching in schools needs to improve, and there is a lack of incentives and training for both teachers and students.

    Satisfaction with scientific research in Brazil was rated as middling. Respondents cited lack of investment and a weak research tradition as negative points. On the other hand, they think Brazil produces great scientists, but that these scientists often end up working abroad.

    All respondents consider the current volume of investment in the area insufficient. In their view, more investment is needed, from both government and the private sector, to improve scientific research, improve quality of life and secure the progress the country needs.

    Among the journalists interviewed, FAPESP was the best-known research funding institution. Respondents who know FAPESP have a positive image of it, as a serious institution. All participants are in favor of the existence of public institutions that support scientific research in the country.

    The results of the Datafolha surveys are available at: