Tag archive: Technological mediation

Science, Journalism, and the Hype Cycle: My piece in tomorrow’s Wall Street Journal (Discover Magazine)

I think one of the biggest struggles a science writer faces is how to accurately describe the promise of new research. If we start promising that a preliminary experiment is going to lead to a cure for cancer, we are treating our readers cruelly–especially the readers who have cancer. On the other hand, scoffing at everything is not a sensible alternative, because sometimes preliminary experiments really do lead to great advances. In the 1950s, scientists discovered that bacteria can slice up virus DNA to avoid getting sick. That discovery led, some 30 years later, to biotechnology–to an industry that enabled, among other things, bacteria to produce human insulin.

This challenge was very much on my mind as I recently read two books, which I review in tomorrow’s Wall Street Journal. One is on gene therapy–a treatment that inspired wild expectations in the 1990s, then crashed, and now is coming back. The other is on epigenetics, a field that seems to me to be in the early stages of the hype cycle. You can read the essay in full here. [see post below]

March 9th, 2012 5:33 PM by Carl Zimmer

Hope, Hype and Genetic Breakthroughs (Wall Street Journal)

By CARL ZIMMER

I talk to scientists for a living, and one of my most memorable conversations took place a couple of years ago with an engineer who put electrodes in bird brains. The electrodes were implanted into the song-generating region of the brain, and he could control them with a wireless remote. When he pressed a button, a bird singing in a cage across the lab would fall silent. Press again, and it would resume its song.

I could instantly see a future in which this technology brought happiness to millions of people. Imagine a girl blind from birth. You could implant a future version of these wireless electrodes in the back of her brain and then feed it images from a video camera.

As a journalist, I tried to get the engineer to explore what seemed to me to be the inevitable benefits of his research. To his great credit, he wouldn’t. He wasn’t even sure his design would ever see the inside of a human skull. There were just too many ways for it to go wrong. He wanted to be very sure that I understood that and that I wouldn’t claim otherwise. “False hope,” he warned me, “is a sinful thing.”

Stephen Voss. Gene therapy allowed this once-blind dog to see again.

Over the past two centuries, medical research has yielded some awesome treatments: smallpox wiped out with vaccines, deadly bacteria thwarted by antibiotics, face transplants. But when we look back across history, we forget the many years of failure and struggle behind each of these advances.

This foreshortened view distorts our expectations for research taking place today. We want to believe that every successful experiment means that another grand victory is weeks away. Big stories appear in the press about the next big thing. And then, as the years pass, the next big thing often fails to materialize. We are left with false hope, and the next big thing gets a reputation as the next big lie.

In 1995, a business analyst named Jackie Fenn captured this intellectual whiplash in a simple graph. Again and again, she had seen new advances burst on the scene and generate ridiculous excitement. Eventually they would reach what she dubbed the Peak of Inflated Expectations. Unable to satisfy their promise fast enough, many of them plunged into the Trough of Disillusionment. Their fall didn’t necessarily mean that these technologies were failures. The successful ones slowly emerged again and climbed the Slope of Enlightenment.
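
Fenn’s curve is qualitative, but its shape is easy to sketch. Here is a toy model in Python (my own illustrative parameterization, not anything from Fenn or Gartner) that treats a technology’s visibility as a short burst of hype, a Gaussian, laid on top of slow real adoption, a logistic curve; the peak, trough and slope fall out naturally:

import math

def visibility(t, hype=1.0, t_peak=2.0, width=1.0, value=1.2, t_mature=8.0, rate=1.0):
    # Toy hype-cycle curve: a Gaussian burst of hype plus a logistic
    # curve of real value. All parameters are illustrative, not empirical.
    spike = hype * math.exp(-((t - t_peak) ** 2) / (2 * width ** 2))
    adoption = value / (1 + math.exp(-rate * (t - t_mature)))
    return spike + adoption

# Peak of Inflated Expectations near t=2, Trough of Disillusionment
# around t=5, Slope of Enlightenment as the logistic term takes over.
for t in range(13):
    print(t, round(visibility(t), 2))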

When Ms. Fenn drew the Hype Cycle, she had in mind dot-com-bubble technologies like cellphones and broadband. Yet it’s a good model for medical advances too. I could point to many examples of the medical hype cycle, but it’s hard to think of a better one than the subject of Ricki Lewis’s well-researched new book, “The Forever Fix”: gene therapy.

The concept of gene therapy is beguilingly simple. Many devastating disorders are the result of mutant genes. The disease phenylketonuria, for example, is caused by a mutation to a gene involved in breaking down a molecule called phenylalanine. The phenylalanine builds up in the bloodstream, causing brain damage. One solution is to eat a low-phenylalanine diet for your entire life. A much more appealing alternative would be to somehow fix the broken gene, restoring a person’s metabolism to normal.

In “The Forever Fix,” Ms. Lewis chronicles gene therapy’s climb toward the Peak of Inflated Expectations over the course of the 1990s. A geneticist and the author of a widely used textbook, she demonstrates a mastery of the history, even if her narrative sometimes meanders and becomes burdened by clichés. She explains how scientists learned how to identify the particular genes behind genetic disorders. They figured out how to load genes into viruses and then to use those viruses to insert the genes into human cells.

Stephen Voss. Alisha Bacoccini is tested on her ability to read letters at the University of Pennsylvania Hospital in Philadelphia on June 23, 2008. Bacoccini is undergoing an experimental gene-therapy trial to improve her sight.

By 1999, scientists had enjoyed some promising successes treating people—removing white blood cells from leukemia patients, for example, inserting working genes, and then returning the cells to their bodies. Gene therapy seemed as if it was on the verge of becoming standard medical practice. “Within the next decade, there will be an exponential increase in the use of gene therapy,” Helen M. Blau, the then-director of the gene-therapy technology program at Stanford University, told Business Week.

Within a few weeks of Ms. Blau’s promise, however, gene therapy started falling straight into the Trough. An 18-year-old man named Jesse Gelsinger, who suffered from a metabolic disorder, had enrolled in a gene-therapy trial. University of Pennsylvania scientists loaded a virus with a working version of the gene for an enzyme he needed and injected it into his body. The virus triggered an overwhelming reaction from his immune system, and within four days Gelsinger was dead.

Gene therapy nearly came to a halt after his death. An investigation revealed errors and oversights in the design of Gelsinger’s trial. The breathless articles disappeared. Fortunately, research did not stop altogether. Scientists developed new ways of delivering genes without triggering fatal side effects. And they directed their efforts at one part of the body in particular: the eye. The eye is so delicate that inflammation could destroy it. As a result, it has evolved physical barriers that keep the body’s regular immune cells out, as well as a separate battalion of immune cells that are more cautious in their handling of infection.

It occurred to a number of gene-therapy researchers that they could try to treat genetic vision disorders with a very low risk of triggering horrendous side effects of the sort that had claimed Gelsinger’s life. If they injected genes into the eye, they would be unlikely to produce a devastating immune reaction, and any harmful effects would not be able to spread to the rest of the body.

Their hunch paid off. In 2009 scientists reported their first success with gene therapy for a congenital disorder. They treated a rare form of blindness known as Leber’s congenital amaurosis. Children who were once blind can now see.

As “The Forever Fix” shows, gene therapy is now starting its climb up the Slope of Enlightenment. Hundreds of clinical trials are under way to see if gene therapy can treat other diseases, both in and beyond the eye. It still costs a million dollars a patient, but that cost is likely to fall. It’s not yet clear how many other diseases gene therapy will help or how much it will help them, but it is clearly not a false hope.

Gene therapy produced so much excitement because it appealed to the popular idea that genes are software for our bodies. The metaphor only goes so far, though. DNA does not float in isolation. It is intricately wound around spool-like proteins called histones. It is studded with caps made of carbon and hydrogen atoms, known as methyl groups. This coiling and capping of DNA allows individual genes to be turned on and off during our lifetimes.

The study of this extra layer of control on our genes is known as epigenetics. In “The Epigenetics Revolution,” molecular biologist Nessa Carey offers an enlightening introduction to what scientists have learned in the past decade about those caps and coils. While she delves into a fair amount of biological detail, she writes clearly and compellingly. As Ms. Carey explains, we depend for our very existence as functioning humans on epigenetics. We begin life as blobs of undifferentiated cells, but epigenetic changes allow some cells to become neurons, others muscle cells and so on.

Epigenetics also plays an important role in many diseases. In cancer cells, genes that are normally only active in embryos can reawaken after decades of slumber. A number of brain disorders, such as autism and schizophrenia, appear to involve the faulty epigenetic programming of genes in neurons.

Scientists got their first inklings about epigenetics decades ago, but in the past few years the field has become hot. In 2008 the National Institutes of Health pledged $190 million to map the epigenetic “marks” on the human genome. New biotech start-ups are trying to carry epigenetic discoveries into the doctor’s office. The FDA has approved cancer drugs that alter the pattern of caps on tumor-cell DNA. Some studies on mice hint that it may be possible to treat depression by taking a pill that adjusts the coils of DNA in neurons.

People seem to be getting giddy about the power of epigenetics in the same way they got giddy about gene therapy in the 1990s. No longer is our destiny written in our DNA: It can be completely overwritten with epigenetics. The excitement is moving far ahead of what the science warrants—or can ever deliver. Last June, an article on the Huffington Post eagerly seized on epigenetics, woefully mangling two biological facts: one, that experiences can alter the epigenetic patterns in the brain; and two, that sometimes epigenetic patterns can be passed down from parents to offspring. The article made a ridiculous leap to claim that we can use meditation to change our own brains and the brains of our children—and thereby alter the course of evolution: “We can jump-start evolution and leverage it on our own terms. We can literally rewire our brains toward greater compassion and cooperation.” You couldn’t ask for a better sign that epigenetics is climbing the Peak of Inflated Expectations at top speed.

The title “The Epigenetics Revolution” unfortunately adds to this unmoored excitement, but in Ms. Carey’s defense, the book itself is careful and measured. Still, epigenetics will probably be plunging soon into the Trough of Disillusionment. It will take years to see whether we can really improve our health with epigenetics or whether this hope will prove to be a false one.

The Forever Fix

By Ricki Lewis. St. Martin’s, 323 pages, $25.99

The Epigenetics Revolution

By Nessa Carey. Columbia, 339 pages, $26.95

—Mr. Zimmer’s books include “A Planet of Viruses” and “Evolution: Making Sense of Life,” co-authored with Doug Emlen, to be published in July.

The QWERTY Effect: The Keyboards Are Changing Our Language! (The Atlantic)

MAR 8 2012, 1:30 PM ET

Could the layout of letters on a keyboard be shaping how we feel about certain words?

It’s long been thought that how a word sounds — its very phonemes — can be related in some ways to what that word means. But language is no longer solely oral. Much of our word production happens not in our throats and mouths but on our keyboards. Could that process shape a word’s meaning as well?

That’s the contention of an intriguing new paper by linguists Kyle Jasmin and Daniel Casasanto. They argue that because of the QWERTY keyboard’s asymmetrical shape (more letters on the left than the right), words dominated by right-side letters “acquire more positive valences” — that is to say, they become more likable. Their argument is that because it’s easier for your fingers to find the correct letters when typing right-side-dominated words, the words subtly gain favor in your mind.
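
The quantity at the heart of the paper is easy to compute. Below is a minimal Python sketch, assuming the standard split of QWERTY letters between the two hands; the authors’ “right-side advantage” is defined along these lines, but this is an illustration, not their code:

# Letters typed by each hand on a standard QWERTY layout.
LEFT = set("qwertasdfgzxcvb")   # 15 letters
RIGHT = set("yuiophjklnm")      # 11 letters

def right_side_advantage(word):
    # Right-hand letters minus left-hand letters; per Jasmin and Casasanto,
    # higher values should predict slightly more positive ratings.
    letters = [c for c in word.lower() if c.isalpha()]
    return sum(c in RIGHT for c in letters) - sum(c in LEFT for c in letters)

print(right_side_advantage("lollipop"))  # 8: every letter is right-handed
print(right_side_advantage("sweater"))   # -7: every letter is left-handed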

As Dave Mosher of Wired explains:

In their first experiment, the researchers analyzed 1,000-word indexes from English, Spanish and Dutch, comparing their perceived positivity with their location on the QWERTY keyboard. The effect was slight but significant: Right-sided words scored more positively than left-sided words.

With newer words, the correlation was stronger. When the researchers analyzed words coined after the QWERTY keyboard’s invention, they found that right-sided words had more positive associations than left-sided words.

In another experiment, 800 typists recruited through Amazon.com’s Mechanical Turk service rated whether made-up words felt positive or negative. A QWERTY effect also emerged in those words.

Jasmin cautioned that words’ literal meanings almost certainly outweigh their QWERTY-inflected associations, and said the study only shows a correlation rather than clear cause-and-effect. Also, while a typist’s left- or right-handedness didn’t seem to matter, Jasmin said there’s not yet enough data to be certain.

Jasmin and Casasanto leave open the question of whether the effect may also be the result of subtle cultural preferences for things on the right-hand side. Additionally, they say, “There is about a 90 percent chance that the QWERTY inventor was right-handed,” so it’s possible that biases he carried may have subconsciously placed more likable sounds on the right. However, they say, “such implicit associations would be based on the peculiar roles these letters play in English words or sounds. The finding of similar QWERTY effects across languages suggests that, even if English-based [biases] influenced QWERTY’s design, QWERTY has now ‘infected’ typers of other languages with similar associations.”

Could Many Universities Follow Borders Bookstores Into Oblivion? (The Chronicle of Higher Education)

March 7, 2012, 7:44 pm
By Marc Parry

Atlanta — Higher education’s spin on the Silicon Valley garage. That was the vision laid out in September, when the Georgia Institute of Technology announced a new lab for disruptive ideas, the Center for 21st Century Universities. During a visit to Atlanta last week, I checked in to see how things were going, sitting down with Richard A. DeMillo, the center’s director and Georgia Tech’s former dean of computing, and Paul M.A. Baker, the center’s associate director. We talked about challenges and opportunities facing colleges at a time of economic pain and technological change—among them the chance that many universities might follow Borders Bookstores into oblivion.

Q. You recently wrote that universities are “bystanders” at the revolution happening around them, even as they think they’re at the center of it. How so?

Mr. DeMillo: It’s the same idea as the news industry. Local newspapers survived most of the last century on profits from classified ads. And what happened? Craigslist drove profits out of classified ads for local newspapers. If you think that it’s all revolving around you, and you’re going to be able to impose your value system on this train that’s leaving the station, that’s going to lead you to one set of decisions. Think of Carnegie Mellon, with its “Four Courses, Millions of Users” idea [which became the Open Learning Initiative], or Yale with the humanities courses, thinking that what the market really wants is universal access to these four courses at the highest quality. And really what the market is doing is something completely different. The higher-education market is reinventing what a university is, what a course is, what a student is, what the value is. I don’t know why anyone would think that the online revolution is about reproducing the classroom experience.

Q. So what is the revolution about?

Mr. DeMillo: You don’t know where events are going to take higher education. But if you want to be an important institution 20 years from now, you have to position yourself so that you can adapt to whatever those technology changes are. Whenever you have this kind of technological change, where there’s a large incumbency, the incumbents are inherently at a disadvantage. And we’re the incumbents.

Q. What are some of the most important changes happening now?

Mr. DeMillo: What you’re seeing, for example, is technology enabling a single master teacher to reach students on an individualized basis on a scale that is unprecedented. So when Sebastian Thrun offers his Intro to Robotics course and gets 150,000 students—that’s a big deal.

Why is it a big deal? Well, because people who want to learn robotics want to learn from the master. And there’s something about the medium that he uses that makes that connection intimate. It’s not the same kind of connection that you get by pointing a camera at the front of the room and letting someone write on a whiteboard. These guys have figured out how to design a way of explaining the material that connects with people at scale. So Stanford all of a sudden becomes a place with a network of stakeholders that’s several orders of magnitude larger than it was 10 years ago. Every one of those students in India who wants to connect to Stanford—to connect to a mentor—now has a way to do so, bypassing their local institutions. Every institution that can’t offer a robotics course now has a way of offering a robotics course.

I think what you see happening now with the massive open courses is going to fundamentally change the business models. It’s going to put the notion of value front and center. Why would I want a credential from this university? Why would I want to pay tuition to this university? It really ups the stakes.

Mr. Baker: There used to be something called Borders, you may remember. Think of Borders, the bookstore, and “X, Y, Z University,” the bookstore. If you’ve got Amazon as an analogue for these massively open courses, there is still a model where people actually go into bookstores because sometimes they want to touch, or they like hanging out, or there’s other value offered by that. What it means is that the university needs to rethink what it’s doing, how it’s doing it.

And how it innovates in a way of surviving in the face of this. If I can do the Amazon equivalent of this open course, why should I come here? Well, maybe you shouldn’t. And that’s a client that is lost.

Mr. DeMillo: All you have to do is add up the amount of money spent on courses. Just take an introduction to computer science. Add up the amount of money that’s spent nationwide on introductory programming courses. It’s a big number, I’ll bet. What is the value received for that spend? If, in fact, there’s a large student population that can be served by a higher-quality course, what’s the argument for spending all that money on 6,000 introduction to programming courses?
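
That “big number” is easy to sanity-check with a back-of-the-envelope estimate. The inputs below are invented, order-of-magnitude guesses for illustration, not figures from the interview:

# Fermi estimate of national spending on introductory programming courses.
# Every input is a hypothetical, order-of-magnitude assumption.
institutions = 4_000         # rough count of institutions offering the course
students_per_year = 200      # assumed intro-programming enrollment at each
tuition_per_course = 1_500   # assumed per-course tuition in dollars

total = institutions * students_per_year * tuition_per_course
print(f"${total:,}")  # $1,200,000,000: on the order of a billion dollars a year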

Q. You really think that many universities could go the way of Borders?

Mr. DeMillo: Yeah. Well, you can see it already. We lost, in this university system, four institutions this year.

Mr. Baker: The University System of Georgia merged four institutions into other ones that were geographically within 50 miles. The programs essentially were replicated. And in an environment in which you’ve got reduced resources, you can’t afford to have essentially identical programs 50 miles apart.

Q. So what sort of learning landscape do you think might emerge?

Mr. DeMillo: One thing that you might see is highly tuned curricula, students being able to select from a range of things that they want to learn and a range of mentors that they want to interact with, whether you think of it as hacking degrees or pulling assessments from a menu of different universities. What does that mean for the individual university? It means that a university has to figure out where its true value sits in that landscape.

Mr. Baker: Another thing we’re looking at is development of a value index to try to calculate, to be vulgar, the return on investment. Our idea is to try to figure out ways of determining what constitutes value for a student, based on four or five personas. So for, let’s say, a mom returning at 50 who wants an education—she’s going to value certain things differently than a 17-year-old rocket scientist coming to Tech who wants to get through in three years and knows exactly what she wants to do.

Mr. DeMillo: Jeff Selingo wrote a column about this, having one place to go to figure out the economic value of a degree from a university. It’s a great idea, but why focus only on the paycheck as an economic value? There are lots of indicators of value. Do students from this university go to graduate school in disproportionately large numbers? Do they get fellowships? Are they people who stay in their profession for a long period of time? You start to build up a picture of what students tell you, of what alumni tell you, was the value of that education. Can we pull these metrics together and then say something interesting about our institution and by extension others?
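
Mechanically, such a value index could be a persona-weighted average of normalized outcome metrics. The sketch below is entirely hypothetical (invented personas, metrics and weights, not the center’s actual design):

# Hypothetical value index: each persona weights outcome metrics differently.
# Metric scores are one institution's values, normalized to the 0-1 range.
METRICS = {"salary": 0.7, "grad_school_rate": 0.9,
           "fellowships": 0.6, "stays_in_profession": 0.8}

PERSONA_WEIGHTS = {
    "returning_student":   {"salary": 0.5, "grad_school_rate": 0.1,
                            "fellowships": 0.1, "stays_in_profession": 0.3},
    "fast_track_engineer": {"salary": 0.4, "grad_school_rate": 0.3,
                            "fellowships": 0.2, "stays_in_profession": 0.1},
}

def value_index(scores, weights):
    # Weighted average of metric scores for one persona.
    return sum(weights[m] * scores[m] for m in weights)

for persona, w in PERSONA_WEIGHTS.items():
    print(persona, round(value_index(METRICS, w), 2))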

Q. What other projects is your center working on right now?

Mr. DeMillo: The Khan Academy—small bursts of knowledge that may or may not be included in a curriculum—was a really interesting idea.

Can students generate this kind of material in a way that’s useful for other students? That’s the genesis of our TechBurst competition [in which students create short videos that explain a single topic].

It turns out there’s a lot of interest on the part of the students at Georgia Tech in teaching what they know to their peers. The interesting part of the project is the unexpected things that you get. We had a discussion yesterday about mistakes. This is student-generated stuff, so is it right? Not all the time. Which causes great angst on the part of traditionalists, because now we have a Georgia Tech TechBurst video that has errors in it. If these were instructional videos that we were marketing, that would be a very big deal. But they’re not. They’re the start of a thread of conversation among students. There’s one on gerrymandering. So it’s a political-science video, it’s cutely produced, but in some sense it’s not exactly right. And so what you would expect is now other students will come along and annotate that video, and say, well, that’s not exactly what gerrymandering is. And you’ll start to see this students-teaching-students peer-tutoring process taking place in real time.

Q. What about the massive open online course Georgia Tech will run in the fall?

Mr. DeMillo: The idea of a massive open course is something that people normally apply to introductory courses. What happens when you look at a massive open advanced seminar? A seminar room with 10,000 students, 50,000 students—what does that even mean? We’ve got some people here that have been blogging for quite a while about advanced topics. In fact, one of the blogs—Gödel’s Lost Letter, by Professor Dick Lipton of Georgia Tech and Ken Regan of the University at Buffalo—is about advanced computer theory, so it’s a very mathematical blog. It’s in the top 0.1 percent of WordPress blogs. A typical day is 5,000 to 10,000 page views. A hot day is 100,000. The question is: can we take this blogging format and turn it into an online seminar?

Q. How would that work?

Mr. DeMillo: The blog is essentially an expression of a master teacher’s understanding of a field to people that want to learn about it. We think that there are some very simple layers that can be built under the existing blogging format that can essentially turn it into a massive open online seminar. It’s also a way of conducting scientific research. When you think about what happens in this blog, it celebrates the process of scientific discovery. I’ll just give you one example. Last year about this time some industrial scientist claimed that he had solved one of the outstanding problems in this area. In the normal course of events, the scientist would have written up the paper, would have sent it to a conference. It would have been refereed. Nine months later the paper would have been presented at the conference. People would have talked about it. It would have been written up to submit to a journal. Refereeing would have taken a couple of years for that. Well, the paper got submitted to Lipton’s blog. It just caused a flurry of activity. So thousands and thousands of scientists flocked to this paper, and essentially speeded up the refereeing of the paper, shortening the time from five years to a couple of weeks. It turns out that people came to believe that the claim was not valid, and the paper was incorrect. But what an education for future research students. You get to see the process of scientific discovery in action.

This is an interesting bookend to the idea of a massive open course. Because the people that are thinking about the massive open online courses for introductory material have a set of considerations. Students are at different levels of achievement. Assessment is very important. The credentialing process is dictated by whether or not you want credit. If you go to the other end of the curriculum, and say, well, what happens when we try to do these advanced courses at scale, credentialing is completely different. Assessment is completely different. You can’t rely on the same automation that you could in the introductory courses. Social networks become extremely important if you’re going to do this stuff at scale, because one professor can’t deal with 100,000 readers. He has to have a network of trusted people who would be able to answer questions. The anticipation is that a whole new set of problems would come up with these kinds of courses.

This conversation has been edited and shortened.

The Importance Of Mistakes (NPR)

February 28, 2012
by ADAM FRANK

Alberto Pizzoli/AFP/Getty Images. It takes a lot of cabling to make the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) run at the Gran Sasso National Laboratory (LNGS) in Italy.

How do people handle the discovery of their own mistake? Some folks might shrug it off. Some folks might minimize its effect. Some folks might even step in with a lie. Most people, we hope, would admit the mistake. But how often do we expect them to announce it to the world from a hilltop? How often do we expect them to tell us — in the clearest language possible — that they screwed up, providing every detail possible about the nature of the mistake?

That’s exactly what’s required in science. As embarrassing as it might seem to most people, admitting a mistake is really the essence of scientific heroism.

Which brings us, first, to faster-than-light neutrinos and then to climate science.

Last week rumors began to circulate that the (potential) discovery of neutrinos traveling faster than the speed of light may get swept into the dustbin of scientific history. The news (rumors really) first circulated via Science Insider.

“According to sources familiar with the experiment, the 60 nanoseconds discrepancy appears to come from a bad connection between a fiber optic cable that connects to the GPS receiver used to correct the timing of the neutrinos’ flight and an electronic card in a computer.”

Oops.

The story goes on to say that once the cable was tightened the Einstein-busting result disappeared. While “sources familiar with the experiment” might not seem enough to start singing funeral dirges (who was the source, Deep Neutrino?), CERN released its own statement that points in a similar direction. No one can say for sure yet, but it appears that the faster-than-light hoopla is likely to go away.
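
For a sense of scale, here is a quick back-of-the-envelope calculation in Python. The 730 km CERN-to-Gran Sasso baseline is the commonly cited round figure; treat the outputs as rough:

C = 299_792_458.0        # speed of light, m/s
DT = 60e-9               # OPERA's reported timing discrepancy, seconds

print(C * DT)            # ~18 m: how far light travels in 60 nanoseconds

BASELINE = 730_000.0     # approximate CERN-Gran Sasso distance, meters
flight_time = BASELINE / C
print(flight_time)       # ~0.0024 s: about 2.4 milliseconds of flight time
print(DT / flight_time)  # ~2.5e-5: a 25-parts-per-million discrepancy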

So what are we to make of this? A loose cable seems pretty lame on the face of it. “Dude, everybody with a cable box and a 32-inch flat screen knows you got to check the cable!”

There is no doubt that, as mistakes go, researchers running the neutrino experiments would rather have something a bit sexier to offer if their result is disproved. (How about tiny corrections due to seismic effects?) Still, I’m betting the OPERA experiment had a heck of a lot more cables than your TV, so perhaps we should be more understanding.

More importantly, no matter how it happens, making mistakes is exactly what scientists are supposed to do. “Our whole problem is to make mistakes as fast as possible,” John Wheeler once said.

What makes science so powerful is not just the admission of mistakes but also the detailing of mistakes. While the OPERA group might now wish they had waited a bit longer to make their announcement, there is no shame in the mistake in and of itself. If they step into the spotlight and tell the world what happened, then they deserve to be counted as heroes just as much as if they’d broken Einstein’s theory.

And that is where we can see the connection to climate, evolution and all the other fronts in the ever-expanding war on science. Last week at the AAAS meeting in Vancouver, Nina Fedoroff, a distinguished agricultural scientist and president of that body, made a bold and frightening statement (especially for someone in such a position of authority). Fedoroff told her audience, as The Guardian reported:

“‘We are sliding back into a dark era,’ she said. ‘And there seems little we can do about it. I am profoundly depressed at just how difficult it has become merely to get a realistic conversation started on issues such as climate change or genetically modified organisms.'”

See video: http://bcove.me/ajmi39pd

The spectacle of watching politicians fall over each other to distance themselves from research validated by armies of scientists is more than depressing. Our current understanding of climate, for example, represents the work of thousands of human beings all working to make mistakes as fast as possible, all working to root out error as fast as possible. There is no difference between what happens in climate science or evolutionary biology and any other branch of science.

Honest people asking the best of themselves push forward in their own fields. They watch their work and that of their colleagues closely, always looking for mistakes, cracks in reasoning, subtle flaws in logic. When they are found, the process is set in motion: critique, defend, critique, root out. When science deniers trot out the same tired talking points, talking points with no scientific validity, they ignore (or fail to understand) their argument’s lack of credibility.

Eventually, science always finds its mistakes. Eventually we find some kind of truth, unless, of course, mistakes are forced on us from outside of science. That, however, is an error of another kind entirely.

Occupy the classroom? (Valor Econômico)

JC e-mail 4408, December 19, 2011.

Article by Dani Rodrik published in today’s (the 19th) Valor Econômico.

In early November, a group of students walked out of a well-known introductory economics course at Harvard, “Economics 10,” taught by my colleague Greg Mankiw. Their complaint: the course propagates conservative ideology disguised as economic science and helps to perpetuate social inequality.

The students are part of a growing chorus of protest against modern economics as it is taught in the world’s leading academic institutions. Economics has always had its critics, of course, but the financial crisis and its aftermath have given them new ammunition, seeming to validate long-standing accusations about the profession’s unrealistic assumptions, its reification of markets, and its disregard for social concerns.

Mankiw, for his part, found the protesting students “ill-informed.” Economics does not have an ideology, he retorted. Citing John Maynard Keynes, he pointed out that economics is a method that helps people think more clearly and reach correct answers, with no predetermined policy conclusions.

Indeed, while the skepticism of those who have not been immersed in years of advanced study in economics is understandable, the coursework in a typical economics doctoral program produces a bewildering variety of policy prescriptions depending on the specific context. Some of the frameworks economists use to analyze the world favor free markets; others do not. In fact, much of economic analysis is devoted to understanding how government intervention can improve economic performance. And non-economic motivations and socially cooperative behavior are an increasing part of what economists study.

As the late, great international economist Carlos Diaz-Alejandro once put it, “by now any bright graduate student, by choosing his assumptions […] carefully, can produce a consistent model yielding just about any policy recommendation he favored at the start.” And that was in the 1970s! An apprentice economist no longer needs to be particularly bright to produce unorthodox policy conclusions.

Still, economists have to put up with accusations of ideological blinders because they are their own worst enemies when it comes to applying their theories to the real world. Instead of communicating the full arsenal of perspectives their discipline offers, they display excessive confidence in particular solutions – often those that best fit their own ideologies.

Consider the global financial crisis. Macroeconomics and finance did not lack the tools needed to understand how the crisis arose and unfolded. Indeed, the academic literature is full of models of financial bubbles, asymmetric information, distorted incentives, self-fulfilling crises and systemic risk. In the years leading up to the crisis, however, many economists played down the lessons of these models in favor of those emphasizing the efficiency and self-correcting power of markets – which, in policy terms, resulted in inadequate government oversight of financial markets.

In my book “The Globalization Paradox,” I imagine the following experiment: a journalist calls an economics professor and asks whether a free-trade agreement with country X or Y would be a good idea. We can be almost certain that the economist, like the vast majority of the profession, will be enthusiastic in his support of free trade.

Now suppose the reporter does not identify himself and says he is a student in the professor’s advanced graduate seminar on international trade theory. He asks the same question: Is free trade good? I doubt the answer will come as quickly and succinctly. In fact, the professor is likely to be stumped by the question. “What do you mean by ‘good’?” he will ask. “And ‘good’ for whom?”

The professor will then launch into a long and wearying exegesis, culminating in a heavily hedged statement: “So if the long list of conditions I have just described is satisfied, and assuming we can tax the beneficiaries to compensate the losers, freer trade has the potential to improve everyone’s well-being.” On an inspired day, the professor might even add that the effect of free trade on the economy’s growth rate is unclear and would depend on an entirely different set of requirements.

The direct, unconditional claim about the benefits of free trade has now been transformed into a statement adorned with all manner of ifs and buts. Strangely, the knowledge that the professor willingly and proudly transmits to his advanced students is deemed unsuitable (or dangerous) for the general public.

The teaching of economics at the undergraduate level suffers from the same problem. In our zeal to display the crown jewels of the profession in immaculate form – market efficiency, the invisible hand, comparative advantage – we skip over the complications and nuances of the real world, well known as they are within the discipline. It is as if introductory physics courses assumed a world without gravity, because everything would then be so much simpler.

Applied appropriately, with a healthy dose of common sense, economics would have prepared us for the financial crisis and pointed us in the right direction for fixing its causes. But the economics we need is of the “seminar room” variety, not the one-size-fits-all kind. We need an economics that recognizes its limitations and knows that the right message depends on the context.

Neglecting the diversity of intellectual orientations within their own discipline does not make economists better analysts of the real world. Nor does it make them more popular. (Portuguese translation by Sabino Ahumada)

Dani Rodrik is a professor of political economy at Harvard University and the author of “The Globalization Paradox: Democracy and the Future of the World Economy.”

New climate emails leaked ahead of talks (CBS)

November 22, 2011 2:15 PM

The Climatic Research Unit at the University of East Anglia in Norwich, England. (AP)  

LONDON – The British university whose leaked emails caused a global climate science controversy in 2009 says it has discovered a potentially much larger data breach.

University of East Anglia spokesman Simon Dunford said that while academics had not yet had the chance to examine the roughly 5,000 emails apparently dumped into the public domain Tuesday, a small sample examined by the university “appears to be genuine.”

The university said in a statement that the emails did not appear to be the result of a new hack or leak. Instead, the statement said that the emails appeared to have been stolen two years ago and held back until now “to cause maximum disruption” to the U.N. climate talks taking place next week in Durban, South Africa.

If that is confirmed, the timing and nature of the leak would follow the pattern set by the so-called “Climategate” emails, which caught prominent scientists stonewalling critics and discussing ways to keep opponents’ research out of peer-reviewed journals.

Those hostile to mainstream climate science claimed the exchanges proved that the threat of global warming was being hyped, and their publication helped destabilize the failed U.N. climate talks in Copenhagen, Denmark, which followed several weeks later.

Although several reviews have since vindicated the researchers’ science, some of their practices – in particular efforts to hide data from critics – have come under strong criticism.

The content of the new batch of emails couldn’t be immediately verified – The Associated Press has not yet been able to secure a copy – but climate skeptic websites carried what they said were excerpts.

Although their context couldn’t be determined, the excerpts appeared to show climate scientists talking in conspiratorial tones about ways to promote their agenda and freeze out those they disagree with. There are several mentions of “the cause” and discussions of ways to shield emails from freedom of information requests.

Penn State University Prof. Michael Mann – a prominent player in the earlier controversy whose name also appears in the latest leak – described the latest leak as “a truly pathetic episode,” blaming agents of the fossil fuel industry for “smear, innuendo, criminal hacking of websites, and leaking out-of-context snippets of personal emails.”

He said the real story in the emails was “an attempt to dig out 2-year-old turkey from Thanksgiving ’09. That’s how desperate climate change deniers have become.”

Bob Ward, with the London School of Economics’ Grantham Research Institute on Climate Change, said in an email that he wasn’t surprised by the leak.

“The selective presentation of old email messages is clearly designed to mislead the public and politicians about the strength of the evidence for man-made climate change,” he said. “But the fact remains that there is very strong evidence that most of the indisputable warming of the Earth over the past half century is due to the burning of fossil fuels and other human activities.”

The source of the latest leaked emails was unclear. The perpetrator of the original hack has yet to be unmasked, although British police have said their investigation is still active.


Wind farms are worth a Belo Monte (Valor Econômico)

JC e-mail 4397, December 2, 2011.

Investment in wind power across Brazil will total R$30 billion by 2014, enough to erect 280 wind farms capable of generating more than 7,200 megawatts (MW) of energy – half of it for effective consumption. The numbers are comparable to those of the Belo Monte hydroelectric plant, the dam that has drawn criticism even from Globo television stars.

What cannot be compared between Belo Monte and wind farms is the broad acceptance that wind projects have won among environmentalists, who consider wind one of the cleanest forms of energy generation in the world. Riding this wave, traditional hydroelectric generators have begun investing heavily in the segment in order to become “renewable.”

The two most striking cases this year were Renova, which received a capital injection from Cemig, through Light, and CPFL Energia. The latter invested billions of reais in asset purchases and also bet on a merger with Ersa, owned by the Pátria bank, creating CPFL Renováveis. The company currently has 210 MW of wind capacity in operation and is building farms that will total 550 MW, most of them in the town of Parazinho, north of Natal, in Rio Grande do Norte.

The winds of Rio Grande do Norte are so promising that by 2014 the state alone will host a third of all the country’s wind investment, building 83 farms capable of generating 2,300 MW. According to the state’s development secretary, Benito Gama, environmental licenses for 62 new farms in the region have been granted for the federal government’s next energy auction, which takes place this month. “In some towns, putting up the wind towers already generates more jobs than the city government itself,” the secretary says.

In Parazinho, CPFL’s construction work has generated 700 direct jobs in all. The company is installing 98 towers in the Santa Clara farms, whose energy was sold in the federal government’s first auction, in 2009. “For Santa Clara alone we leased 2,200 hectares of land from large farmers,” says João Martin, director of operations at CPFL Renováveis.

CPFL’s towers and turbines are supplied by Wobben and manufactured on the company’s own construction site. The towers are all concrete-finished, unlike those now arriving in the Caetité region of Bahia to supply Renova.

GE is the main supplier in Bahia. Its towers are made of steel and are all transported from Pernambuco to Caetité. Renova is currently erecting 180 towers in the region, which will generate just under 300 MW. The full project, however, will reach 1,100 MW, of which 400 MW has been sold to Light. Renova’s vice president of operations and one of the company’s founders, Renato Amaral, says the partnership with Light was strategic precisely so the company could sell energy on the free market. Prices in the regulated market have fallen sharply and competition keeps getting tougher, with more and more foreign groups arriving in Brazil. Wind energy that sold under Proinfa for more than R$200 per megawatt-hour – at uncorrected prices from five years ago – fetched R$100 in the last auction, held in the middle of this year.

Next Buddha Will Be A Collective (p2pfoundation.net)

Religious and spiritual expression is always embedded in societal structures. If social structures are moving towards the form of distributed networks, what kind of evolution of spiritual expression can we expect? In this essay, we will first describe the general societal changes that we see emerging, and expect to become more prevalent in the future, then examine to what degree these changes will have an impact on individual and collective spiritual expression. The reader has to bear with us in the first general part, which explains the peer to peer dynamic, in order to understand its application to spirituality, which is the subject of the second part of the essay. Finally, in the third and final part, we will discuss a few concrete examples.

Read it here.

SUMMARY OF THE 34TH SESSION OF THE INTERGOVERNMENTAL PANEL ON CLIMATE CHANGE (Earth Negotiations Bulletin)

Volume 12 Number 522 – Monday, 21 November 2011

The 34th session of the Intergovernmental Panel on Climate Change (IPCC) was held from 18-19 November 2011 in Kampala, Uganda. The session was attended by more than two hundred participants, including representatives from governments, the United Nations, and intergovernmental and observer organizations. Participants focused primarily on the workstreams resulting from the consideration of the InterAcademy Council (IAC) Review of the IPCC processes and procedures, namely those on: procedures, conflict of interest policy, and communications strategy.

The Panel adopted the revised Procedures for the Preparation, Review, Acceptance, Adoption, Approval and Publication of IPCC Reports, as well as the Implementation Procedures and Disclosure Form for the Conflict of Interest Policy. The Panel also formally accepted the Summary for Policy Makers (SPM) of the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX), approved by WGs I and II at their joint meeting from 14-17 November 2011. Delegates also addressed issues such as the programme and budget, matters related to other international bodies, and progress reports.

A BRIEF HISTORY OF THE IPCC

The IPCC was established in 1988 by the World Meteorological Organization (WMO) and the UN Environment Programme (UNEP). Its purpose is to assess scientific, technical and socio-economic information relevant to understanding the risks associated with human-induced climate change, its potential impacts, and options for adaptation and mitigation. The IPCC does not undertake new research, nor does it monitor climate-related data, but it conducts assessments on the basis of published and peer-reviewed scientific and technical literature.

The IPCC has three Working Groups (WGs): WGI addresses the scientific aspects of the climate system and climate change; WGII addresses the vulnerability of socio-economic and natural systems to climate change, impacts of climate change and adaptation options; and WGIII addresses options for limiting greenhouse gas emissions and mitigating climate change. Each WG has two Co-Chairs and six Vice-Chairs, except WGIII, which for the Fifth Assessment cycle has three Co-Chairs. The Co-Chairs guide the WGs in fulfilling the mandates given to them by the Panel and are assisted in this task by Technical Support Units (TSUs).

The IPCC also has a Task Force on National Greenhouse Gas Inventories (TFI). TFI oversees the IPCC National Greenhouse Gas Inventories Programme, which aims to develop and refine an internationally agreed methodology and software for the calculation and reporting of national greenhouse gas emissions and removals, and to encourage the use of this methodology by parties to the United Nations Framework Convention on Climate Change (UNFCCC). The Task Group on Data and Scenario Support for Impact and Climate Analysis (TGICA) is an entity set up to address WG needs for data, especially WGII and WGIII. The TGICA facilitates distribution and application of climate change related data and scenarios, and oversees a Data Distribution Centre, which provides data sets, scenarios of climate change and other environmental and socio-economic conditions, and other materials.

The IPCC Bureau is elected by the Panel for the duration of the preparation of an IPCC assessment report (approximately six years). Its role is to assist the IPCC Chair in planning, coordinating and monitoring the work of the IPCC. The Bureau is composed of climate change experts representing all regions. Currently, the Bureau comprises 31 members: the Chair of the IPCC, the Co-Chairs of the three WGs and the Bureau of the TFI (TFB), the IPCC Vice-Chairs, and the Vice-Chairs of the three WGs. The IPCC Secretariat is located in Geneva, Switzerland, and is hosted by the WMO.

IPCC PRODUCTS: Since its inception, the IPCC has prepared a series of comprehensive assessments, special reports and technical papers that provide scientific information on climate change to the international community and are subject to extensive review by experts and governments.

The IPCC has so far undertaken four comprehensive assessments of climate change, each credited with playing a key role in advancing negotiations under the UNFCCC: the First Assessment Report was completed in 1990; the Second Assessment Report in 1995; the Third Assessment Report in 2001; and the Fourth Assessment Report (AR4) in 2007. At its 28th session in 2008, the IPCC decided to undertake a Fifth Assessment Report (AR5) to be completed in 2014.

The latest Assessment Reports are structured into three volumes, one for each WG. Each volume comprises an SPM, a Technical Summary and an underlying assessment report. All assessment sections of the reports undergo a thorough review process, which takes place in three stages: a first review by experts; a second review by experts and governments; and a third review by governments. Each SPM is approved line-by-line by the respective WG. The Assessment Report also includes a Synthesis Report (SYR), highlighting the most relevant aspects of the three WG reports, and an SPM of the SYR, which is approved line-by-line by the Panel. More than 450 lead authors, 800 contributing authors, 2,500 expert reviewers and 130 governments participated in the elaboration of the AR4.

In addition to the comprehensive assessments, the IPCC produces special reports, methodology reports and technical papers, focusing on specific issues related to climate change. Special reports prepared by the IPCC include: Aviation and the Global Atmosphere (1999); Land Use, Land-use Change and Forestry (2000); Methodological and Technical Issues in Technology Transfer (2000); Safeguarding the Ozone Layer and the Global Climate System (2005); Carbon Dioxide Capture and Storage (2005); Renewable Energy Sources and Climate Change Mitigation (SRREN) (2011); and, most recently, the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) (2011). Technical papers have been prepared on Climate Change and Biodiversity (2002) and on Climate Change and Water (2008), among others.

The IPCC also produces methodology reports or guidelines to assist countries in reporting on greenhouse gases. The IPCC Guidelines for National Greenhouse Gas Inventories were first released in 1994 and a revised set was completed in 1996. Additional Good Practice Guidance reports were approved by the Panel in 2000 and 2003. The latest version, the IPCC Guidelines on National Greenhouse Gas Inventories, was approved by the Panel in 2006.

For all this work and its efforts to “build up and disseminate greater knowledge about manmade climate change, and to lay the foundations that are needed to counteract such change,” the IPCC was awarded the Nobel Peace Prize, jointly with former US Vice President Al Gore, in December 2007.

IPCC-28: This session was held from 9-10 April 2008, in Budapest, Hungary, with discussions centering on the future of the IPCC, including key aspects of its work programme such as WG structure, main type and timing of future reports, and the future structure of the IPCC Bureau and the TFB. At this session, the IPCC agreed to prepare the AR5 and to retain the current structure of its WGs. In order to enable significant use of new scenarios in the AR5, the Panel requested the Bureau to ensure delivery of the WGI report by early 2013 and completion of the other WG reports and the SYR at the earliest feasible date in 2014. The Panel also agreed to prepare the SRREN Report, to be completed by 2010. Earth Negotiations Bulletin coverage of IPCC 28 can be found at: http://www.iisd.ca/climate/ipcc28

IPCC-29: This session, which commemorated the IPCC’s 20th anniversary, was held from 31 August to 4 September 2008, in Geneva, Switzerland. At this time, the Panel elected the new IPCC Bureau and the TFB, and re-elected Rajendra Pachauri (India) as IPCC Chair. The Panel also continued its discussions on the future of the IPCC and agreed to create a scholarship fund for young climate change scientists from developing countries with the funds from the Nobel Peace Prize. It also asked the Bureau to consider a scoping meeting on the SREX, which took place from 23-26 March 2009 in Oslo, Norway. Earth Negotiations Bulletin coverage of IPCC-29 can be found at: http://www.iisd.ca/climate/ipcc29

IPCC-30: This session was held from 21-23 April 2009 in Antalya, Turkey. At the meeting, the Panel focused mainly on the near-term future of the IPCC and provided guidance for an AR5 scoping meeting, which was held in Venice, Italy, from 13-17 July 2009. The Panel also gathered climate change experts to propose the chapter outlines of WG contributions to the AR5. Earth Negotiations Bulletin coverage of IPCC 30 can be found at: http://www.iisd.ca/climate/ipcc30

IPCC-31: This session was held from 26-29 October 2009 in Bali, Indonesia. Discussions focused on approval of the proposed AR5 chapter outlines developed by participants at the Venice scoping meeting. The Panel also considered progress on the implementation of decisions taken at IPCC 30 regarding the involvement of scientists from developing countries and countries with economies in transition, use of electronic technologies, and the longer-term future of the IPCC. Earth Negotiations Bulletin coverage of IPCC 31 can be found at: http://www.iisd.ca/climate/ipcc31

INTERACADEMY COUNCIL REVIEW: In response to public criticism of the IPCC related to inaccuracies in the AR4 and the Panel’s response, as well as questions about the integrity of some of its members, UN Secretary-General Ban Ki-moon and IPCC Chair Rajendra Pachauri requested the IAC to conduct an independent review of the IPCC processes and procedures and to present recommendations to strengthen the IPCC and ensure the on-going quality of its reports. The IAC presented its results in a report in August 2010. The IAC Review makes recommendations regarding: management structure; a communications strategy, including a plan to respond to crises; transparency, including criteria for selecting participants and the type of scientific and technical information to be assessed; and consistency in how the WGs characterize uncertainty.

IPCC-32: This session, held from 11-14 October 2010 in Busan, Republic of Korea, addressed the recommendations of the IAC Review. The Panel adopted a number of decisions in response to the IAC Review, including on the treatment of grey literature and uncertainty, and on a process to address errors in previous reports. To address recommendations that required further examination, the Panel established task groups on processes and procedures, communications, conflict of interest policy, and management and governance. The Panel also accepted a revised outline for the AR5 SYR. Earth Negotiations Bulletin coverage of IPCC 32 can be found at: http://www.iisd.ca/climate/ipcc32

SRREN: The eleventh session of WGIII met from 5-8 May 2011 in Abu Dhabi, United Arab Emirates, and approved the Special Report on Renewable Energy Sources and Climate Change Mitigation (SRREN) and its SPM. Discussions focused, among others, on chapters addressing sustainable development, biomass and policy. Key findings of the SRREN include that the technical potential for renewable energies is substantially higher than projected future energy demand, and that renewable energies play a crucial role in all mitigation scenarios.

IPCC-33: The session, held from 10-13 May 2011 in Abu Dhabi, United Arab Emirates, focused primarily on follow-up actions to the IAC Review of the IPCC processes and procedures. The Panel decided to establish an Executive Committee, adopted a Conflict of Interest Policy, and introduced several changes to the rules of procedure. The Panel also endorsed the actions of WGIII in relation to SRREN and its SPM and considered progress on the preparation of the AR5. Earth Negotiations Bulletin coverage of IPCC 33 can be found at: http://www.iisd.ca/vol12/enb12500e.html

SREX: The first joint session of IPCC WGs I and II, which took place from 14-17 November 2011 in Kampala, Uganda, accepted the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) and approved its SPM. The SREX addressed the interaction of climatic, environmental and human factors leading to adverse impacts of climate extremes and disasters, options for managing the risks posed by impacts and disasters, and the important role that non-climatic factors play in determining impacts.

IPCC-34 REPORT

IPCC Chair Rajendra Pachauri opened the 34th session of the Intergovernmental Panel on Climate Change on Friday, 18 November 2011, highlighting ongoing work related to the Fifth Assessment Report (AR5) and progress in the implementation of the InterAcademy Council (IAC) recommendations. He also referred to the communications strategy and the need to ensure policy relevance and reach out to policymakers. Pachauri said it was critically important that the results of the Special Report on Renewable Energy Sources and Climate Change Mitigation (SRREN) and the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX) be presented to the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) in Durban, South Africa. He emphasized the significance of the meeting being held in Africa, given the findings related to climate change impacts and development challenges in the region, and thanked Uganda for hosting the meeting and Norway for its support.

Norwegian Ambassador Thorbjørn Gaustadsæther highlighted that the SREX is an important tool for understanding, taking action, and making decisions on managing the risks of extreme events and disasters. He noted that extreme weather events and their negative impacts are apparent everywhere, including in Uganda, where fishermen on Lake Victoria experience reduced catches, as well as in his native Norway, which experiences dramatic flooding, shrinking Arctic ice and other events. He said the SREX would be presented to governments at the Durban UNFCCC meeting and would provide a good basis for them to take action. He thanked the Ugandan government for its hospitality and said Norway was pleased to have contributed to the organization of the meeting.

Peter Gilruth, on behalf of UNEP Executive Director Achim Steiner, stressed the potential of the SREX, including as a foundation on which the disaster risk reduction and the climate change communities can build stronger bridges, and as a basis for environment and development work. He noted various UNEP initiatives and assessment reports, including the Programme of Research on Climate Change Vulnerability, Impacts and Adaptation, the fifth Global Environmental Outlook and the Emissions Gap Assessment, and invited delegates to participate in the “Eye on Earth” summit in December to build partnerships on knowledge sharing.

Florin Vladu, on behalf of Christiana Figueres, Executive Secretary of the UNFCCC, updated the plenary on developments in the negotiating process, highlighting the achievements of the Cancun Agreements in establishing an institutional infrastructure, but noting a failure to address the future of the Kyoto Protocol and a mitigation framework. Vladu said that in Durban countries face a challenge to find a viable way forward, but expressed hope that the conference will help build confidence in post-2012 climate finance through clarity on long-term finance and making the Green Climate Fund operational. Vladu highlighted that the UNFCCC process has benefited from an active research dialogue with the IPCC, most recently in the form of a presentation on the SRREN at the Subsidiary Body for Scientific and Technological Advice (SBSTA) session in June 2011. He also noted the special role of the IPCC in the UNFCCC review, scheduled to commence in 2013, of the adequacy of the goal of limiting the average global temperature increase to below 2 degrees Celsius and of overall progress towards achieving this goal. On SREX, he said the report would contribute to the work of SBSTA, as well as to the Adaptation Framework and the work programme on loss and damage, once those become operational.

Noting that this has been a transformative year for the IPCC, Jeremiah Lengoasa, on behalf of World Meteorological Organization (WMO) Secretary-General Michel Jarraud, reaffirmed support for the work of the Panel and emphasized the importance of the IPCC’s work and procedures remaining relevant and timely. He welcomed the AR5 preparations moving ahead as scheduled and stressed that the AR5 will provide a strong basis for decision-making, including in relation to water resources, agriculture and food security. He also highlighted the role of the WMO Global Framework for Climate Services, to be launched in the near future, to further assist in decision-making.

Maria Mutagamba, Minister for Water and Environment, Uganda, expressed warm greetings from the people of Uganda and welcomed delegates to the country traditionally known as the Pearl of Africa. She said that it is with great pride that Uganda continues to participate actively in the work of the IPCC and hosts this meeting, and thanked Norway, which co-funded the session. She said that Uganda has already started experiencing extreme weather events attributed to climate change such as severe droughts, floods and increased frequency of landslides. Highlighting the inevitability of climate change, she noted that her country has adaptation policies in place. On mitigation, she underlined Uganda’s early efforts under the Clean Development Mechanism. She further noted the need to strengthen national meteorological and hydrological services in developing countries and thus expressed support for the WMO Global Framework for Climate Services. She also suggested the IPCC continue to consider the role of indigenous knowledge in areas where peer-reviewed literature is unavailable or insufficient as well as issues of technology transfer to developing countries and dissemination of information.

The Panel then observed a minute of silence to mark the untimely passing of Mama Konate, UNFCCC SBSTA Chair and IPCC colleague.

APPROVAL OF THE DRAFT REPORT OF THE 33RD SESSION

The draft report of IPCC-33 (IPCC-XXXIV/Doc. 2, Rev.1) was adopted on Friday morning with a minor editorial amendment. Belgium noted the lack of reference in the meeting minutes to the Expert Meeting on Geoengineering and to the participation of media representatives at that meeting.

SPECIAL REPORT ON EXTREME EVENTS AND DISASTERS

This issue (IPCC-XXXIV/Doc. 21) was taken up by the plenary on Friday morning. The IPCC plenary formally accepted the actions taken at the Joint Session of Working Groups I and II on the SREX, including approving its Summary for Policy Makers (SPM). Underscoring the importance and usefulness of the SREX, Austria said that, among other things, this landmark report introduces terminology that can be understood by both the risk management and the climate change communities, identifies a range of practices and options to reduce risk, and provides clarity on which sectors, groups and areas are most vulnerable, making it of tremendous use for taking appropriate action.

PREPARATION OF THE FIFTH ASSESSMENT REPORT (AR5)

The item (IPCC-XXXIV/Doc. 5) was presented to the plenary on Friday afternoon. Chair Pachauri recalled that the Panel had issued a clear mandate to start very early with the AR5 Synthesis Report (SYR), and Leo Meyer, Head of the SYR Technical Support Unit (TSU), reported on process and management issues related to the SYR (IPCC-XXXIV/Doc. 5). Meyer noted, inter alia: the inclusion of the IPCC Vice-Chairs on the SYR writing team since they have responsibilities related to cross-cutting issues; the possibility of a workshop on UNFCCC Article 2, which could feed into the UNFCCC review of the adequacy of the Convention’s ultimate goal; and the suggestion to reduce the time of eight weeks allowed for government comments on the final draft of the SPM to six weeks given the compressed timeline of the SYR.

On the time frame, the US suggested, and the Panel agreed to, seven weeks for government comments instead of the proposed six.

With regard to a possible workshop on UNFCCC Article 2, Chair Pachauri suggested inviting general comments by governments. Emphasizing the importance of the IPCC retaining distance from the policy process, the US, supported by New Zealand, Canada, Saudi Arabia and others, opposed the suggestion. Saudi Arabia underscored that the issue of Article 2 is very sensitive. The Panel agreed to have the Bureau consider the matter at its next meeting.

REVIEW OF THE IPCC PROCESSES AND PROCEDURES

CONFLICT OF INTEREST POLICY: This issue (IPCC-XXXIV/Doc. 8, Rev. 1) was first addressed in the plenary on Friday and then in several meetings of a contact group co-chaired by Andrej Kranjc (Slovenia) and Jongikhaya Witi (South Africa), with Samuel Duffett (UK) as Rapporteur. The workstream on the Conflict of Interest (COI) Policy arose in response to the recommendations made in the IAC Review to develop and adopt a rigorous COI Policy. At IPCC-33 delegates adopted a COI Policy and extended the mandate of the Task Group on COI in order to develop proposals for annexes to the COI Policy covering Implementation Procedures and the Disclosure Form.

Contact group discussions focused on the draft Implementation Procedures prepared by the Task Group. During the group’s first meeting, Co-Chair Kranjc noted that the Task Group had held four teleconferences between sessions and that the WGs already have experience applying the COI Policy on an interim basis. Rapporteur Duffett then explained the proposed decision-making process on COI, noting there would be different procedures for Bureaux members and non-Bureaux members.

The discussions centered on several issues, including: which body determines whether an individual has a COI; the role of the COI Expert Advisory Group; which body is responsible for the final decision in cases of COI; cases of tolerance of COI for non-Bureaux members; and principles for considering COI issues.

On a body to determine whether an individual has a COI, the proposal of the Task Group was to form a special committee comprised of representatives from each of the six WMO regional groups. Some participants noted that implementation of COI policies is a relatively simple and technical procedure and in most cases there is no COI, so it would be an additional burden to establish a new committee and conduct elections for its members. In this regard, they suggested making use of existing bodies and assigning this function to the Executive Committee. They also suggested that the Executive Committee members would be the ones most interested in maintaining the integrity of the IPCC. Others expressed concern about Bureaux members who are part of the Executive Committee making decisions on their own COI. A compromise was reached on establishing a COI Committee composed of voting members of the Executive Committee and representatives of WMO and UNEP, with a recusal clause.

Delegates also developed principles for considering COI issues, introducing those in relation to exploring options for resolution of COI and an appeals procedure. The group added a provision requiring members of bodies involved in considering COI issues to recuse themselves from a discussion on their own COI.

The Task Group proposed that the Expert Advisory Group, which would be composed of three representatives from WMO and UNEP, review the COI forms of Bureaux nominees. However, some expressed concern about this approach, and a change was introduced so that the COI Committee consults the Expert Advisory Group only when it deems this necessary.

Further discussion took place on which body would be responsible for a final decision on COI. An opinion was expressed that all final decisions should be made in plenary; however, others raised concerns about maintaining the confidentiality of personal information in that case. The contact group elaborated on an appeals procedure, assigning a function to the IPCC Bureau to review a COI determination on request by the individual in question.

On COI in relation to non-Bureaux members, several participants supported some flexibility, since there are too few experts in some areas and those experts are often involved with industries or organizations. Delegates developed the relevant procedures on the tolerance of COI in such cases.

In the final plenary, the Panel adopted the Implementation Procedures and Disclosure Form for the COI Policy with minor editorial corrections. Chair Pachauri said COI was clearly one of the trickiest and most complex issues to address in relation to the IAC Review.

The US expressed its satisfaction with an “excellent” outcome on COI, in particular regarding the creation of a body that will implement the COI Policy effectively and very soon, composed of those with a strong interest in ensuring the integrity of its outcomes.

Canada noted that the contact group discussions were exceedingly positive and that the Implementation Procedures for the COI Policy will provide an effective process to promote transparency. The Netherlands underlined the enormous importance of the documents on COI for the transparency and integrity of the Panel, and its acceptance by the outside world. Thanking all members of the Task Group, Australia congratulated the plenary on a “groundbreaking” COI mechanism for many international organizations, both in substance and in the procedure of how it was developed.

Secretary Christ asked the plenary how the set of documents on COI should be integrated into IPCC regulations and suggested a paragraph be added that states these documents constitute an appendix to the Principles Governing the IPCC Work. To this, the US replied that more consideration is needed before the documents are elevated to the level of principles and suggested leaving them as standalone documents. The Panel agreed to the suggestion.

Final Decision: In its decision, the Panel, inter alia:

adopts the COI Implementation Procedures and decides that the Procedures will apply to individuals who are subject to the COI Policy;
decides to establish a COI Committee comprising all elected members of the Executive Committee and two additional members with appropriate legal expertise from UNEP and WMO, appointed by those organizations;
decides to establish an Expert Advisory Group on COI and invites the Secretary-General of WMO and the Executive Director of UNEP to select members of the COI Expert Advisory Group and to facilitate the establishment of the COI committee as soon as possible;
notes that the WG and Task Force Bureaux have adopted interim arrangements for dealing with COI issues and that those arrangements are broadly consistent with the COI Policy;
decides that, to ensure a smooth transition, the existing interim arrangements will continue to operate with respect to individuals who are not Bureaux members, until the Executive Committee decides that the Implementation Procedures apply to those individuals;
requests IPCC and TFI Bureaux members to submit a COI Form to the Secretariat within three months;
decides to receive a report on the operation of the COI Expert Advisory Group and the COI Committee within twelve months of their establishment and to review their operations, as appropriate, within twelve months after the next Bureaux election(s); and
notes that the COI Committee will develop its own methods of working and will apply those on an interim basis pending approval by the Panel, and decides that the COI Committee should submit its methods of working to the Panel within twelve months of its establishment.
Implementation Procedures: The Procedures address the following:

The overall purpose of the Implementation Procedures is to ensure that COIs are identified, communicated to the relevant parties and managed so as to avoid any adverse impact on IPCC balance, products and processes, and also to protect the individual, the IPCC and the public interest.
In their scope, the Implementation Procedures apply to all COIs and all individuals defined in the COI Policy, and compliance with the COI Policy and the Procedures is mandatory.
The Implementation Procedures further set out the review process on COI for IPCC and Task Force Bureaux members prior to and after their appointment. According to this process, the COI Disclosure Forms for all nominees should be submitted to the Secretariat to be reviewed by a COI Committee. The COI Committee may request advice from the Expert Advisory Group on COI. If the COI Committee determines that a nominee has a COI that cannot be resolved, the individual will not be eligible for election to the Bureau.
The Implementation Procedures also outline the review process for Coordinating Lead Authors, Lead Authors, Review Editors and TSUs prior to and after their appointment. In this case, Disclosure Forms are submitted to relevant TSUs and reviewed by WG or Task Force Bureaux. The document defines exceptional circumstances in which a COI in relation to non-Bureaux members may be tolerated, that is when an individual can provide a unique contribution and when a COI can be managed. Such cases should be disclosed. The document also outlines the process to deal with a COI after the appointment of non-Bureaux members, including updating information, review and an appeal procedure.
The Implementation Procedures set out principles for considering COI issues that are applied to all bodies involved in advising on and deciding COI issues. In this regard, they require those bodies to consult the relevant individual regarding potential COIs and explore the resolution options as well as provide for an appeal procedure. The document also requires members of the bodies involved in consideration of COI issues to recuse themselves when being a subject of consideration.
The Implementation Procedures further contain provisions on the processing and storage of submitted information to ensure its confidentiality.
The document further sets out the composition and functions of the COI Committee and Expert Advisory Group on COI.
Annex B to the Implementation Procedures also contains a COI Disclosure Form.
PROCEDURES: This issue (IPCC-XXXIV/Doc. 9, Add. 1) was first introduced in the plenary on Friday and then taken up by a contact group co-chaired by Eduardo Calvo (Peru) and Øyvind Christophersen (Norway), with Arthur Petersen (Netherlands) as Rapporteur. Work centered on the finalization of revisions to Appendix A to the Principles Governing IPCC Work: Procedures for the Preparation, Review, Acceptance, Adoption, Approval and Publication of IPCC Reports, which started at IPCC-32. The Panel adopted the revised Procedures Appendix in plenary on Saturday, completing the work of the Task Group on Procedures.

Discussions in the contact group centered on the production and treatment of guidance material, the selection of participants to IPCC workshops and expert meetings, matters related to the transparency, quality and efficiency of the review process, anonymous expert review, and SPM approval sessions.

On guidance material, Belgium and others called for stating that guidance material needs to be taken into account in the preparation of the reports, in addition to stating what guidance material is, while others cautioned against excessively normative language. The group agreed to leave the text as is.

On the selection of participants to IPCC workshops and expert meetings, the group addressed text related to the distinction between these two types of meetings.

On matters related to the transparency, quality and efficiency of the review process, the group considered the Revised Guidance Note on the Role of Review Editors (IPCC-XXXIV/Doc. 9, Add.1) prepared by the WG and TFI Bureaux. The group also addressed the current practice of expanding the number of Review Editors per chapter. After some discussion, the group agreed that there was a need to limit the number of Review Editors to four per chapter.

On text related to open invitations for expert reviewers, recommendations were made that WG/TFB Co-Chairs circulate Second Order Draft Reports, in addition to First Order Drafts, for review. In relation to inviting as wide a group of experts as possible, Review Editors were added to the list of experts who may be nominated. Text was also added on notifying Government Focal Points when this process starts.

On anonymous expert review, the group discussed the need to ensure appropriate flexibility and agreed to add text clarifying that the procedures do not require the WGs and the TFI to use either anonymous or named expert reviews. In order to document past experience with anonymous expert reviews by WGIII and the TFI during the AR4, the group agreed to include the Note by the Task Group on Procedures on IPCC Anonymous Expert Review: Past experiences and arguments in favor or against (Appendix 3 of IPCC-XXXIV/Doc. 9) in an annex to the Report of IPCC-34.

On the process for the SPM approval, the group addressed text on the process for sending government comments to the Second Order Draft prior to the plenary approval session of the SPM, bringing the procedures in line with current practice.

During the final plenary, Austria noted that, although important progress was made, there is a need to further strengthen the Procedures, in particular related to the calibrated uncertainty language of assessments, to increase transparency and traceability of the decisions of authors so these can be understood in the future. He also proposed further addressing the management and working rules for the writing teams so they are the same across WGs. With regard to calibrated language, New Zealand drew attention to the existing Guidance Paper on Uncertainties and cautioned against having the Panel decide on this, stressing that this should be the province of the WGs.

The European Union (EU) asked for clarification on whether participating organizations are also considered in the round of comments by governments for SPM approval. Co-Chair Christophersen responded that this was not brought up or considered by the group. The EU noted that it would be useful to introduce this in the future given the EU’s particular character. Australia proposed, and the Panel agreed, to record the EU’s concern in the minutes of the meeting along with Austria’s suggestion.

Final Decision: The decision on Procedures addresses the following:

On the IPCC guidance material, the Panel decides that guidance material is a category of IPCC supporting material aimed to guide and assist in the preparation of IPCC reports and Technical Papers. The Panel also clarifies who is responsible and who may commission guidance material.
On selection of participants to IPCC Workshops and Expert Meetings, the Panel elaborates on the distinction between these two types of meetings, including their composition, and establishes that the WG/TFI Bureaux or the IPCC Chair will report to the IPCC Bureau and Panel on the process of selection of participants, including a description of how the selection criteria have been applied.
On matters related to transparency, quality and efficiency of the review process, the IPCC welcomes the revised Guidance Note on Review Editors and finds that the recommendations of the IAC on the Review Editors have been taken adequately into account. The Panel also encourages the implementation of this revised Guidance Note in the AR5 and invites the WG Co-Chairs to monitor progress in their WG progress reports. In addition, the Panel decides that to provide a balanced and complete assessment of current information, each WG/TFI Bureau should normally select two to four Review Editors per chapter and per technical summary of each Report. Furthermore, it decides that the WG/TFI Bureaux shall seek the participation of reviewers encompassing the range of scientific, technical and socio-economic views, expertise, and geographical representation, and shall actively undertake to promote and invite as wide a range of experts as possible.
On anonymous expert review, the Panel decides: not to amend the IPCC Procedures; not to preclude a different approach in the future; and to include the Note by the Task Group on Procedures on IPCC Anonymous Expert Review: Past experiences and arguments in favor or against (Appendix 3 of IPCC-XXXIV/Doc. 9) in an annex to the Report of IPCC-34.
On the process for the SPM approval, the Panel specifies the process for governments submitting written comments prior to the plenary approval session.
GOVERNANCE AND MANAGEMENT: This item (IPCC-XXXIV/Doc. 19) was taken up in the opening plenary on Friday. IPCC Chair Pachauri explained that both Co-Chairs of the Task Group on Governance and Management, David Warrilow (UK) and Taha Zatari (Saudi Arabia), were unable to come to Kampala, and that Task Group Co-Chair Warrilow had suggested postponing the consideration of the matter until IPCC-35 and had proposed holding IPCC-35 in the middle of 2012 rather than in the second half of the year. The UK explained that this would provide for a prompt response to the IAC recommendations and would allow moving forward with the AR5. The UK also proposed that, if holding an earlier session were not possible, two sessions could be held next year instead of one. Several countries highlighted that an earlier meeting should not coincide with preparatory meetings for the United Nations Conference on Sustainable Development (Rio+20) or with the Conference itself.

Delegates agreed to postpone the consideration of the item until IPCC-35.

COMMUNICATIONS STRATEGY: This item (IPCC-XXXIV/Doc. 20) was addressed in plenary on Friday. Secretary Christ recalled that IPCC-33 agreed on guidance on a communications strategy and requested the Secretariat to elaborate the strategy according to that guidance. She noted delays in hiring a senior communications specialist, who will not be on board for several months, and in this context explained that the Secretariat asked its long-term consultant, Charlie Methven, to help prepare the draft communications strategy in order to respond to the plenary’s request.

Methven then elaborated on the main points of the proposed strategy. Highlighting the unique challenges the IPCC faces, he underlined that the future communications system should be a resource rather than a typical corporate structure. At the same time, he said, it should provide a central communication function and a stronger link between various elements of the IPCC, including the WGs and their TSUs. Noting the already existing ad hoc support on communications across WGs, Methven said these practices should be incorporated to make for a more accountable and coherent structure. He also mentioned that the proposed strategy is achievable within the current level of funding.

Chair Pachauri then requested guidance from the plenary on major pillars of the draft strategy.

Many, including New Zealand, the US, Austria and Japan, expressed deep concern about the delay in hiring a senior communications specialist, who should be involved in the development of the strategy. Chair Pachauri explained that the hiring process is conducted according to WMO procedures, but that an individual had been selected and the discussion now concerns a compensation package. He noted that this person cannot start immediately after accepting the offer, and that the selected candidate is not yet sufficiently familiar with the IPCC process to contribute actively to its communications strategy.

Referring to the unique nature of the IPCC, the US highlighted the important role of WG Co-Chairs in communicating relevant products and said that the proposed communications structure should not be independent of the WGs. He highlighted in this regard that a senior communications specialist should be facilitative in nature and expressed concern that the Executive Committee had had no interaction with candidates for this role. Pachauri explained it was difficult to engage all members of the Executive Committee and that some of them were involved in developing the draft communications strategy.

Austria suggested preparing a letter from the Panel to WMO highlighting the urgency of hiring a communications specialist for the IPCC. He also suggested there should be a role for governments in the communications strategy, especially when it comes to regional matters. Switzerland underlined the importance of scientific integrity in the communication of the IPCC’s work, which often means “sticking literally to what has been said.” Australia proposed that the strategy be forward-looking and contain a clear set of communications objectives: what to communicate, to whom and how. Several delegates suggested the document be forwarded to the full Executive Committee and Bureau for discussion.

Pachauri concluded that the draft communications strategy would now be discussed by a small group comprising representatives of the WGs, TFI, Secretariat and consultant Methven before being forwarded to the Executive Committee, Bureau and eventually the plenary.

In the final plenary on Saturday, Belgium recalled its proposal to re-establish a Task Force on Outreach and Communications Strategy, noting that such a Task Force had existed but disappeared when Pachauri became Chair, and to collect written comments by governments to advance the issue. Chair Pachauri supported the proposal and suggested Belgium submit it in written form. On a request for clarification by IPCC Vice-Chair Jean-Pascal van Ypersele, Chair Pachauri confirmed agreement at the Executive Committee meeting to have one of the IPCC Vice-Chairs involved in the group in charge of formulating the communications strategy.

The UK proposed, and the Panel agreed, to circulate the new draft communications strategy for comments and revision before the next session. Chair Pachauri said the Executive Committee will come up with a timetable to do so.

MATTERS RELATED TO UNFCCC AND OTHER INTERNATIONAL BODIES

During the opening plenary session, Chair Pachauri informed the Panel that, in contrast to all previous occasions when the IPCC had addressed the UNFCCC COP in plenary, he had now been asked to present only to SBSTA in Durban. He emphasized that this was an issue of institutions, not of personalities. Many countries expressed their disappointment and underscored the importance of conveying the IPCC’s findings to the COP directly, possibly also at the high-level segment. South Africa noted the concerns expressed regarding the participation of the IPCC at Durban and assured delegates that the matter would receive proper attention from the upcoming COP Presidency.

A drafting group prepared a letter to the UNFCCC, which was distributed to the Panel for approval. The letter, addressed to the UNFCCC Executive Secretary, expressed the Panel’s disappointment and noted the inappropriateness of the decision, underscoring the strategic importance of having the IPCC address the UNFCCC at the COP level as has been the case since the first COP. The letter called for conveying the message to the current and upcoming COP Presidencies. The US, Saudi Arabia and New Zealand called for reflecting on the wisdom of this mode of communication and proposed Chair Pachauri speak again informally to the UNFCCC Executive Secretary on this matter.

On Saturday morning, Chair Pachauri informed the Panel that, after further communication, the UNFCCC Executive Secretary had written to say that she had consulted with the South African delegation and that, although the opening session of UNFCCC COP 17 will be more of a ceremonial nature, the IPCC would be invited to address the COP on Wednesday, 30 November, when it takes up substantive matters.

RULES OF PROCEDURE FOR THE ELECTION OF THE IPCC BUREAU AND ANY TASK FORCE BUREAU

In plenary on Saturday, Secretary Christ invited the Panel to provide guidance on how provisions arising from the review of IPCC processes and procedures at IPCC-33 and 34 are to be reflected in the revision to Appendix C to the Principles Governing IPCC Work: Rules of Procedure for the Election of the IPCC Bureau and Any Task Force Bureau (IPCC-XXXIV/Doc. 7). New Zealand, with Malaysia and Australia, noted that there was no representative from Region V (South-West Pacific) on the WGIII Bureau, and that the revised text leaves open the possibility that someone from Region V is not on the WGIII Bureau. Australia also highlighted that Region V does not have representation on the Executive Committee and said that these issues should be a high priority for IPCC-36. Secretary Christ said that the Secretariat would distribute a text to governments taking into consideration suggestions from IPCC-33 and 34, and would make this a high priority agenda item for IPCC-36.

IPCC PROGRAMME AND BUDGET AND FINANCIAL PROCEDURES FOR THE IPCC

During Friday’s opening plenary session, Secretary Christ gave an overview of issues related to the IPCC Trust Fund Programme and Budget (IPCC-XXXIV/Doc. 3, Rev.1) and the adoption of the revised “Appendix B to the Principles Governing IPCC Work: Financial procedures for the IPCC” (IPCC-XXXIV/Doc. 4, Corr. 1). She noted the need to address the greater cost of the publication and translation of the SRREN and an additional expert meeting on wetlands by TGICA, and urged resolution on the revised Appendix B in order to allow auditing of IPCC accounts.

The Financial Task Team, co-chaired by IPCC Vice-Chair Ismail A.R. El Gizouli (Sudan) and Nicolas Beriot (France), met to address these issues, convening twice on Friday. On Saturday morning, Co-Chair Beriot presented the deliberations of the Task Team to plenary, noting that the meetings had been well attended. He highlighted changes made to Appendix B, including the addition of a paragraph on the Financial Task Team and the revision of a paragraph that grants authority to the Secretariat to adjust allocations in the event that the IPCC Trust Fund is less than the approved budget. On Appendix B, the WMO and EU queried the implications of the IPCC Trust Fund being administered under International Public Sector Accounting Standards. Secretary Christ clarified that the text had been drafted with the WMO legal counsel, and expressed hope that in negotiating future agreements with the EU the various financial requirements would be reconciled.

Co-Chair Beriot highlighted two other Financial Task Team recommendations to the Panel, relating to simplifying language on procedural matters in the revised Appendix B no later than IPCC-37, and to greater flexibility in financing travel arrangements for experts or members of the Bureau from developing countries. The UK and Austria recommended adding a second plenary session next year in order to have enough time to respond to the IAC Review; however, after further discussion, the Panel agreed that a four-day plenary session would be preferable to two two-day plenary sessions because of both time and resource constraints. New Zealand also suggested that teleconferences could be used for preparatory meetings prior to the next IPCC session.

Final Decision: In its decision, the Panel, inter alia:

approves the modified 2011 budget with respect to cost-related increases in the translation and publication of the SRREN;
approves the modified 2012 budget, which includes cost-related increases in the preparation of the 2013 IPCC Guidelines on Wetlands;
approves the revised “Appendix B to the Principles Governing IPCC Work: Financial Procedures for the IPCC” (IPCC-XXXIV/Doc. 4, Corr.1) with modifications, which include adding the Financial Task Team and granting authority to the Secretariat to make adjustments to allocations if there is a budget shortfall;
requests the Secretariat to simplify the language in the revised Appendix B to improve clarity and readability, no later than IPCC-37;
notes the forecast budget for 2013 and the indicative budgets for 2014 and 2015;
urges governments from developed countries to continue providing financial support for travel of experts to IPCC meetings;
requests that countries maintain their contributions in 2011 and 2012, and invites governments that may be able to do so to increase their level of contributions to the IPCC Trust Fund, or to contribute if they have not yet done so; and
endorses the expression of concern regarding the imposition of travel plans and arrangements on some experts or Bureau members from developing countries, with little regard for the particular traveler’s constraints and commitments, and decides that this concern be relayed to the WMO Secretary-General.
PROGRESS REPORTS

AR5, PROGRESS REPORTS OF WGs I, II AND III: The WG Co-Chairs presented on progress since IPCC-33. WGII Co-Chair Vicente Barros (Argentina) highlighted a range of ongoing expert, regional expert and lead author meetings, and Head of WGII TSU Kristie Ebi discussed the draft chapter writing schedule (IPCC-XXXIV/Doc. 10).

Head of WGIII TSU Jan Minx highlighted a range of expert and lead author meetings, and noted changes to the WGIII AR5 schedule and the writing process, which include a review of cross-chapter consistency and a policy to remove inactive authors (IPCC-XXXIV/Doc. 18, Rev.1).

WGI Co-Chair Thomas Stocker discussed a variety of expert meetings, including a Joint Expert Meeting on Geoengineering in Lima, Peru, in June 2011; a second WGI Lead Author meeting held in Brest, France, in July 2011, which engaged primarily with cross-chapter issues; and a third WGI Lead Author meeting to be held in Marrakech, Morocco, in April 2012. Stocker noted that on 16 December 2011 the First Order Draft of the WGI contribution to the AR5 will become available for an eight-week expert review (IPCC-XXXIV/Doc. 14).

TASK GROUP ON DATA AND SCENARIO SUPPORT FOR IMPACT AND CLIMATE ANALYSIS (TGICA): Due to the absence of TGICA representatives at the meeting, Chair Pachauri referred the plenary to the report of the Task Group (IPCC-XXXIV/Doc. 13).

TASK FORCE ON NATIONAL GREENHOUSE GAS INVENTORIES: TFB Co-Chair Thelma Krug (Brazil) reviewed progress on the 2013 Supplement to the 2006 IPCC Guidelines for National Greenhouse Gas Inventories: Wetlands (2013 Wetlands Supplement) work programme (IPCC-XXXIV/Doc. 12), and noted that a recent Lead Author meeting in Japan identified the scope and coverage of each chapter and addressed several cross-cutting and interacting issues. A Zero Order Draft is expected to be ready for the first science meeting next year. Co-Chair Krug also highlighted ongoing expert meetings and the success of an open symposium hosted in Japan on 22 August 2011, which aimed to explain the purpose and achievement of the TFI to the public.

SRREN: Head of WGIII TSU Jan Minx introduced this issue (IPCC-XXXIV/Doc. 17), noting the outreach activities and publication process timeline.

CROSS-CUTTING THEMES: IPCC Vice-Chair Hoesung Lee (Republic of Korea) discussed the coordination of cross-cutting themes for the AR5 SYR, highlighting that a questionnaire has been prepared and will be sent to the WGs to gain input into how the IPCC Vice-Chairs should best facilitate this process.

IPCC SCHOLARSHIP PROGRAMME: Secretary Christ updated the plenary on progress with the IPCC Scholarship Programme (IPCC-XXXIV/Doc. 16), noting that a total of nine students and researchers from developing countries had been awarded scholarships for the period 2011-2012. She said these included a postgraduate student from Uganda, Jamiat Nanteza, who would be working on climate-related disaster management issues. Secretary Christ stressed that the Secretariat does not have sufficient capacity to continue fundraising activities, as there are no specific funds allocated for that work. She said the Secretariat has been in contact with the UN Foundation, which can conduct fundraising in the US, but noted that there would be charges involved.

Chair Pachauri underlined that the Programme had been launched with great success, highlighting the many applications from the least developed countries, and said guidance was needed from the plenary on how to keep the Programme going. He said that, given the number of applications, it would be desirable to award at least 40 to 50 scholarships. The US expressed caution regarding this suggestion, as it might require a big commitment from the IPCC leadership and Secretariat. He noted that this might also influence how the IPCC is perceived as an assessment body and recalled that when the Programme was launched there was no expectation it would become a major workstream. Belgium expressed interest in hearing the opinion of the Programme’s Board of Trustees.

Chair Pachauri suggested that this matter be discussed at the Bureau meeting, which would prepare a paper with a set of options on the further direction of the Programme and ways to reduce the workload burden on the Secretariat, to be presented at the next IPCC session.

TIME AND PLACE OF THE NEXT SESSION

Croatia presented its offer to host the next session in Dubrovnik or elsewhere on the Adriatic Coast at a time to be determined.

Recalling the untimely death of SBSTA Chair Mama Konate, IPCC Vice-Chair van Ypersele called for always scheduling a break between any WG or approval session and a plenary session held back-to-back, so that, insofar as possible, participants’ health and wellbeing are respected.

OTHER BUSINESS AND CLOSING OF THE SESSION

Secretary Christ presented on the outcome of the 16th WMO Congress as it relates to the IPCC. She also noted that WMO had not yet decided on the request made at IPCC-32 that WMO not convert its in-cash contribution into an in-kind contribution.

Also, Secretary Christ drew attention to a notification from UN Headquarters that the Republic of South Sudan was admitted as a new Member State by the UN General Assembly on 14 July 2011, and that the official name of the Libyan Arab Jamahiriya had been changed to Libya (IPCC-XXXIV/INF.2). The Panel agreed to make the necessary amendments to reflect these changes. South Sudan has therefore become a new member of the IPCC, bringing the total number of its members to 195 countries.

In his final remarks, Chair Pachauri thanked the government and people of Uganda for their hospitality and excellent organization of the meeting. The session closed at 4:45 pm with a dance performance celebrating Africa by Francis Hayes, conference officer, and local organizers.

A BRIEF ANALYSIS OF IPCC-34

THE CHALLENGE OF CHANGE

It was just a little over a year ago, in October 2010 in Busan, Republic of Korea, that Sir Peter Williams, Vice-President of The Royal Society, UK, presented the major findings and recommendations of the InterAcademy Council (IAC) review of the IPCC processes and procedures. The review was called for by UN Secretary-General Ban Ki-moon and IPCC Chair Rajendra Pachauri to address major criticisms of the IPCC’s work arising from the discovery of a small number of serious factual errors in the Fourth Assessment Report, allegations of conflicts of interest among those involved in the assessment, and the Panel’s failure to respond adequately to these charges. The IAC report contained recommendations on reforming the IPCC’s management and governance, communications strategy, and processes and procedures.

Since then, the IPCC has been busy addressing these recommendations, enacting changes that it hopes will make it more solid and better able to weather intense public scrutiny and attacks by climate change skeptics. At the same time, the IPCC has had to focus on its work on the Fifth Assessment Report (AR5), the cornerstone of its activities. With the IPCC midway through the AR5 cycle, these changes stand to have an impact on the AR5, and it is a useful moment to begin assessing how far the decisions taken so far have led to substantive changes in the IPCC. This brief analysis addresses these questions.

IMPLEMENTING CHANGE

IPCC-34 came at a time when the most difficult decisions in response to the IAC review had already been taken or were well advanced. A variety of organizational, procedural, governance and policy changes were made prior to the Kampala meeting. These include the establishment of an Executive Committee to provide management oversight and address emerging issues on behalf of the Panel between sessions; limiting the terms of office for key Bureau positions; the development of a conflict of interest policy; and increasing transparency in procedures, including clarifying the selection of authors and of participants at expert meetings. Other critical issues that have been tackled include a clear policy for correcting errors, a strengthening of the review process, and improved guidance for authors, including on the evaluation of evidence and the consistent treatment of uncertainty.

This session in Kampala concentrated on completing revisions to the Procedures for the IPCC reports. As a result, the Panel finalized its work on the production and treatment of guidance material, the selection of participants to IPCC workshops and expert meetings, matters related to the transparency, quality and efficiency of the review process, anonymous expert review, and approval sessions for Summaries for Policy Makers.

Perhaps most notably, at this session the IPCC agreed on the Implementation Procedures for the Conflict of Interest Policy, which had been developed at IPCC-33. The agreement was a source of much satisfaction among participants, who feel that the decision taken here allows for prompt implementation and adequate oversight by those most interested in maintaining the integrity of the IPCC, that is, the Panel’s Executive Committee. Importantly, implementation of the new comprehensive Conflict of Interest Policy will contribute to increased transparency of the IPCC process, just what the Panel needs to ensure the credibility of its findings.

To the dismay of many, however, the development and implementation of a comprehensive communications strategy remains incomplete. The IPCC has long acknowledged that its outreach and communication are critically deficient, and attempts to address this had been initiated in the past, such as the first IPCC communications strategy in 2005-2006, which included the recruitment of a communications officer. The IAC review reinforced this criticism, finding that communication was a major weakness, and recommended the development of a communications strategy, including guidelines on who should speak on behalf of the IPCC. More than a year later, however, the IPCC still has no strategy in place and has not appointed a senior communications officer. In Kampala, the draft communications strategy was met with wide discontent. Many felt a senior communications professional should have been involved in its preparation, and others were concerned that the draft had not been discussed by the Executive Committee prior to its presentation to the Panel. With both the strategy and the appointment delayed, the lack of progress on communications elicited much frustration among participants in Kampala and in the broader climate change community, and remains a critical gap in the IPCC’s response to the IAC review.

ASSESSING THE QUALITY OF CHANGE

Although it is too early to judge the transformational extent of the changes introduced in the IPCC as a result of the IAC review, it is useful to note some signs of the effects of these changes.

The most evident and welcome changes relate to increased transparency in the IPCC processes and procedures. There is more transparency and consistency across the different stages of the assessment process, including the preparation, review, and endorsement of IPCC reports. There is a policy in place to address real or potential conflicts of interest among all participants. There is also a better understanding of how the Panel is run, including its management structure and the roles and responsibilities within it. All of these are critically important.

Changes affecting the quality of management and governance are, however, more difficult to see and assess. Having good rules is the start, but adherence and practice are what make the difference. The fact that the Executive Committee was not consulted or involved in the recruitment of the senior communications professional came as a surprise to many.

One question was how the changes resulting from the IAC review would affect progress on the AR5. In many ways, the IAC review came at a convenient time for the IPCC, which had just completed the Fourth Assessment Report, with the bulk of the work concentrated in the Working Groups (WGs) as they initiated the AR5. In fact, many of the changes implemented had already been initiated by the WGs, including a conflict of interest policy, guidance on the treatment of uncertainties and other procedural guidance. Even the Executive Committee is a formalization of the previous Executive Team. As to the deliverables, the approval within the space of six months of two timely Special Reports, on Renewable Energy Sources and Climate Change Mitigation and on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX), is evidence that the IAC review has not distracted the IPCC from its core business.

As one participant noted, the IAC review was meant to elicit not a revolution but an evolution. The significance of the IPCC reforms will only become apparent as new challenges arise. Whether the reforms the IPCC has already undertaken will actually make the Panel stronger in the face of increased public scrutiny remains to be seen.

Unfortunately, the lack of a comprehensive communications strategy stands in the way of making the Panel’s reforms and its work evident to the outside world. Communicating the complex science of climate extremes and impacts as presented in the SREX could already have benefited from such a strategy. That is why most participants see rapid progress on a communications strategy as vital to the successful implementation of the IPCC changes. While progress on the AR5 is going well, the impact of the IPCC’s findings, and consequently its relevance, will be significantly influenced by how its work is communicated to the outside world.

UPCOMING MEETINGS

Joint 9th Meeting of the Vienna Convention COP and 23rd Montreal Protocol MOP: The 23rd session of the Meeting of the Parties to the Montreal Protocol on Substances that Deplete the Ozone Layer (MOP 23) and ninth meeting of the Conference of the Parties to the Vienna Convention for the Protection of the Ozone Layer (COP 9) are taking place in Bali. dates: 21-25 November 2011 location: Bali, Indonesia contact: Ozone Secretariat phone: +254-20-762-3851 fax: +254-20-762-4691 email: ozoneinfo@unep.org www: http://ozone.unep.org

UNFCCC COP 17 and COP/MOP 7: The 17th session of the UNFCCC Conference of the Parties (COP 17) and the 7th session of the Meeting of the Parties (MOP 7) to the Kyoto Protocol will take place in Durban, South Africa. The 35th session of the Subsidiary Body for Implementation (SBI), the 35th session of the Subsidiary Body for Scientific and Technological Advice (SBSTA), the Ad Hoc Working Group on Further Commitments for Annex I Parties under the Kyoto Protocol (AWG-KP), and the Ad Hoc Working Group on Long-term Cooperative Action under the Convention (AWG-LCA) will also meet. dates: 28 November – 9 December 2011 location: Durban, South Africa contact: UNFCCC Secretariat phone: +49-228-815-1000 fax: +49-228-815-1999 email: secretariat@unfccc.int www: http://unfccc.int/ and http://www.cop17durban.com

Eye on Earth Summit: The Eye on Earth Summit: Pursuing a Vision is being organized under the theme “Dynamic system to keep the world environmental situation under review.” This event will launch the global environmental information network (EIN) strengthening initiative and address major policy and technical issues. dates: 12-15 December 2011 location: Abu Dhabi, United Arab Emirates contact: Marije Heurter, Eye on Earth Event Coordinator phone: +971-2-693-4516 email: Marije.heurter@ead.ae or Eoecommunity@ead.ae www: http://www.eyeonearthsummit.org/

Fifth World Future Energy Summit: The fifth World Future Energy Summit will take place from 16-19 January 2012, in Abu Dhabi, United Arab Emirates. The Summit will concentrate on energy innovation in policy implementation, technology development, finance and investment approaches, and existing and upcoming projects. The Summit will seek to set the scene for future energy discussions in 2012 with leading international speakers from government, industry, academia and finance, who will share insights, expertise and cutting-edge advances in technology. dates: 16-19 January 2012 location: Abu Dhabi, United Arab Emirates contact: Naji El Haddad phone: +971-2-409-0499 email: naji.haddad@reedexpo.ae www: http://www.worldfutureenergysummit.com/

IPCC WGIII AR5 Second Expert Meeting on Scenarios: Scenarios have a key role in the WGIII contribution to the AR5 as an integrative element. Authors from all relevant chapters will meet to coordinate and integrate the scenario activities across chapters. dates: 17-18 March 2012 location: Wellington, New Zealand contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

UN Conference on Sustainable Development: The UNCSD (or Rio+20) will mark the 20th anniversary of the UN Conference on Environment and Development, which convened in Rio de Janeiro, Brazil. dates: 20-22 June 2012 location: Rio de Janeiro, Brazil contact: UNCSD Secretariat email: uncsd2012@un.org www: http://www.uncsd2012.org/

IPCC WGIII AR5 Expert Meeting for Businesses and NGOs: Based on the good experience gained during the SRREN, WGIII will organize an Expert Meeting for Businesses and NGOs. The meeting aims to gather structured input from these communities for consideration by the AR5 authors. The meeting will take place during the Expert Review Period (22 June – 20 August 2012). date: to be determined location: to be determined contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

IPCC 35th Session: The 35th session of the IPCC will consider pending issues arising from the consideration of the IAC Review of the IPCC processes and procedures, namely those on governance and management, and on the communications strategy. dates: to be determined location: Croatia contact: IPCC Secretariat phone: +41-22-730-8208 fax: +41-22-730-8025 email: IPCC-Sec@wmo.int www: http://www.ipcc.ch/

Exterminate a species or two, save the planet (RT)

Published: 26 January, 2011, 14:43

Edited: 15 April, 2011, 05:18

Biologists have suggested a mathematical model that they hope will predict which species need to be eliminated from an unstable ecosystem, and in which order, to help it recover.

The counterintuitive idea of killing living things for the sake of biodiversity conservation comes from the complex web of connections in ecosystems. Eliminate a predator, and its prey thrives and depletes whatever it has for its own food. Such “cascading” impacts along “food webs” can be unpredictable and sometimes catastrophic.

Sagar Sahasrabudhe and Adilson Motter of Northwestern University in the US have shown that in some food web models, the timely removal or suppression of one or several species can do quite the opposite and mitigate the damage caused by a local extinction. The work is described in a paper in Nature magazine.

The trick is not an easy one, since the timing of removal is just as important as the choice of targeted species. A real-life example Sahasrabudhe and Motter use is that of island foxes on the Channel Islands off the coast of California. When feral pigs were introduced into the ecosystem, they attracted golden eagles, which preyed on the foxes as well. Simply reversing the situation by removing the pigs would have made the birds switch solely to foxes, eventually driving them extinct. Instead, conservation activists captured and relocated the eagles before eradicating the pigs, saving the fox population.
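To make the order-dependence concrete, here is a deliberately crude toy simulation in Python. Everything in it, including the species names, the shared-predator dynamics and all the parameters, is invented for illustration; it is a minimal sketch of the general idea, not the actual model of Sahasrabudhe and Motter.

```python
# Hypothetical toy model of why removal ORDER matters in a shared-predator
# food web. The species names echo the Channel Islands story above, but the
# dynamics and every parameter are invented for illustration; this is a
# sketch, not the Sahasrabudhe-Motter model.

def simulate(eagle_removal_step, pig_removal_step, steps=100):
    """Return the final fox population under a given intervention schedule."""
    foxes, pigs, eagles = 100.0, 500.0, 20.0
    for t in range(steps):
        if t == eagle_removal_step:
            eagles = 0.0  # eagles captured and relocated
        if t == pig_removal_step:
            pigs = 0.0    # feral pigs eradicated
        prey = foxes + pigs
        fox_share = foxes / prey if prey > 0 else 0.0
        # Eagles concentrate their hunting on whatever prey remains, so
        # eradicating the pigs first shifts all predation onto the foxes.
        growth = 0.03 * foxes * (1 - foxes / 1000.0)  # logistic growth
        losses = 0.004 * eagles * fox_share * foxes   # predation pressure
        foxes = max(0.0, foxes + growth - losses)
    return foxes

# Eradicate pigs first (step 10), relocate eagles later (step 60):
print("pigs first :", round(simulate(eagle_removal_step=60, pig_removal_step=10)))
# Relocate eagles first (step 10), eradicate pigs later (step 60):
print("eagles first:", round(simulate(eagle_removal_step=10, pig_removal_step=60)))
```

Run as written, the “pigs first” schedule leaves the fox population collapsed, while the “eagles first” schedule lets it recover, mirroring the qualitative lesson of the Channel Islands case.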

Of course, conservation scientists are not going to start making decisions based on the models straight away. Real ecosystems are not limited to predator-prey relationships, and factors such as parasitism, pollination and nutrient dynamics have to be taken into account as well. On the other hand, only some eight years ago ecosystems were thought to be too complex to be modeled at all, Martinez notes, and this work gives more confidence that such modeling will have practical uses in the near future.

Arjun Appadurai: A Nation of Business Junkies (Anthropology News)

Guest Columnist
Arjun Appadurai

By Anthropology News on November 3, 2011

I first came to this country in 1967. I have been either a crypto-anthropologist or a professional anthropologist for most of that time. Still, because I came here with an interest in India and took the path of least resistance in choosing to maintain India as my principal ethnographic referent, I have always been reluctant to offer opinions about life in these United States. I have begun to do so recently, but mainly in occasional blog posts, Twitter posts and the like. Now seems a good time to ponder whether I have anything to offer to public debate about the media in this country. Since I have been teaching for a few years in a distinguished department of media studies, I feel emboldened to offer my thoughts in this new AN Forum.

My examination of changes in the media over the last few decades is not based on a scientific study. I read the New York Times every day, the Wall Street Journal occasionally, and I subscribe to The Atlantic, Harper’s, The New York Review of Books, the Economist, and a variety of academic journals in anthropology and area studies. I get a smattering of other useful media pieces from friends on Facebook and other social media sites. I also use the Internet to keep up with as much as I can from the press in and about India. At various times in the past, I have subscribed to The Nation, Money Magazine, Foreign Policy, the Times Literary supplement and a few other periodicals.

I have long been interested in how culture and economy interact. Today, I want to make an observation about the single biggest change I have seen over my four decades in the United States, which is a growing and now hegemonic domination of the news, and of a great deal of opinion, both in print and on television, by business news. Business news was a specialized affair in the late 1960s, confined to a few magazines such as Money and Fortune, and to newspaper and TV reporters (not channels). Now it is hard to find anything but business as the topic of news in all media. Consider television: if you spend even three hours surfing between CNN and BBC on any given day (surfing for news about Libya or about soccer, for example), you will find yourself regularly assaulted by business news, not just from London, New York and Washington, but from Singapore, Hong Kong, Mumbai and many other places. Look at the serious talk shows and chances are that you will find a talking CEO, describing what’s good about his company, what’s bad about the government and how to read his company’s stock prices. Channels like MSNBC are a form of endless, mind-numbing Jerry Lewis telethon about the economy, with more than a hint of the desperation of the Depression-era movie “They Shoot Horses, Don’t They?”, as they bid the viewer to make insane bets and to mourn the fallen heroes of failed companies and fired CEOs.

Turn to the newspapers and things get worse. Any reader of the New York Times will find it hard to get away from the business machine. Start with the lead section, and stories about Obama’s economic plans, mad Republican proposals about taxes, the euro crisis and the latest bank scandal will assault you. Some relief is provided by more corporate news: the exit of Steve Jobs, the op-ed piece about the responsibilities of the super-rich by Warren Buffett, Donald Trump advertising his new line of housewares to go along with his ugly homes and buildings. Turn to the sports section: it is littered with talk of franchises, salaries, trades, owner antics, stadium projects and more. I need hardly say anything about the “Business” section itself, which has now become virtually redundant. And if you are still thirsty for more business news, check out the “Home”, “Lifestyle” and “Real Estate” sections for news on houses you can’t afford and mortgage-financing gimmicks you have never heard of. Some measure of relief is to be found in the occasional “Science Times” and in the NYT Book Review, which do have some pieces that are not primarily about profit, corporate politics or the recession.

The New York Times is not to blame for this. It is the newspaper of “record”, which means that it reflects broader trends and cannot be blamed for complying with them. Go through the magazines when you take a flight to Detroit or Mumbai and there is again a feast of news geared to the “business traveler”. This is when I catch up on how to negotiate the best deal, why this is the time to buy gold, and what software and hardware to use when I make my next presentation to General Electric. These examples could be multiplied in any number of bookstores, newspaper kiosks, airport lounges, park benches and dentists’ offices.

What does all this reflect? Well, we were always told that the business of America is business. But now we are gradually moving into a society in which the business of American life is also business. Who are we now? We have become (in our fantasies) entrepreneurs, start-up heroes, small investors, consumers, home-owners, day-traders, and a gallery of supporting business types, and no longer fathers, mothers, friends or neighbors. Our very citizenship is now defined by business, whether we are winners or losers. Everyone is an expert on pensions, stocks, retirement packages, vacation deals, credit-card scams and more. Meanwhile, as Paul Krugman argued in a brilliant recent speech to some of his fellow economists, the discipline of economics, especially macroeconomics, has lost all its capacity to analyze, define or repair the huge mess we are in.

The gradual transformation of the imagined reader or viewer into a business junkie is a relatively new disease of advanced capitalism in the United States. The avalanche of business knowledge and information dropping on the American middle classes ought to have helped us predict, or avoid, the recent economic meltdown, built on crazy credit devices, vulgar scams and lousy regulation. Instead it has made us business junkies, ready to be led like sheep to our own slaughter by Wall Street, the big banks and corrupt politicians. The growing hegemony of business news and knowledge in the popular media over the last few decades has produced a collective silence of the lambs. It is time for a bleat or two.

Dr. Arjun Appadurai is a prominent contemporary socio-cultural anthropologist, having formerly served as Provost and Senior Vice President for Academic Affairs at The New School in New York City. He has held professorial chairs and visiting appointments at some of the top institutions in the United States and Europe. In addition, he has served on several scholarly and advisory bodies in the United States, Latin America, Europe and India. Dr. Appadurai is a prolific writer, having authored numerous books and scholarly articles, and the nature and significance of his contributions throughout his academic career have earned him a reputation as a leading figure in his field. He is the author of The Future as a Cultural Fact: Essays on the Global Condition (Verso: forthcoming 2012).

Ken Routon is the contributing editor of Media Notes. He is a visiting professor of cultural anthropology at the University of New Orleans and the author of Hidden Powers of the State in the Cuban Imagination (University Press of Florida, 2010).

Castles in the Desert: Satellites Reveal Lost Cities of Libya (Science Daily)

ScienceDaily (Nov. 7, 2011) — Satellite imagery has uncovered new evidence of a lost civilisation of the Sahara in Libya’s south-western desert wastes that will help rewrite the history of the country. The fall of Gaddafi has opened the way for archaeologists to explore the country’s pre-Islamic heritage, so long ignored under his regime.

Satellite image of area of desert with archaeological interpretation of features: fortifications are outlined in black, areas of dwellings are in red and oasis gardens are in green. (Credit: Copyright 2011 Google, image copyright 2011 DigitalGlobe)

Using satellites and air photographs to identify the remains in one of the most inhospitable parts of the desert, a British team has discovered more than 100 fortified farms and villages with castle-like structures and several towns, most dating to between AD 1 and 500.

These “lost cities” were built by a little-known ancient civilisation called the Garamantes, whose lifestyle and culture was far more advanced and historically significant than the ancient sources suggested.

The team from the University of Leicester has identified the mud brick remains of the castle-like complexes, with walls still standing up to four metres high, along with traces of dwellings, cairn cemeteries, associated field systems, wells and sophisticated irrigation systems. Follow-up ground survey earlier this year confirmed the pre-Islamic date and remarkable preservation.

“It is like someone coming to England and suddenly discovering all the medieval castles. These settlements had been unremarked and unrecorded under the Gaddafi regime,” says the project leader David Mattingly FBA, Professor of Roman Archaeology at the University of Leicester.

“Satellite imagery has given us the ability to cover a large region. The evidence suggests that the climate has not changed over the years and we can see that this inhospitable landscape with zero rainfall was once very densely built up and cultivated. These are quite exceptional ancient landscapes, both in terms of the range of features and the quality of preservation,” says Dr Martin Sterry, also of the University of Leicester, who has been responsible for much of the image analysis and site interpretation.

The findings challenge a view dating back to Roman accounts that the Garamantes consisted of barbaric nomads and troublemakers on the edge of the Roman Empire.

“In fact, they were highly civilised, living in large-scale fortified settlements, predominantly as oasis farmers. It was an organised state with towns and villages, a written language and state of the art technologies. The Garamantes were pioneers in establishing oases and opening up Trans-Saharan trade,” Professor Mattingly said.

The professor and his team were forced to evacuate Libya in February when the anti-Gaddafi revolt started, but hope to be able to return to the field as soon as security is fully restored. The Libyan antiquities department, badly under-resourced under Gaddafi, is closely involved in the project. Funding for the research has come from the European Research Council who awarded Professor Mattingly an ERC Advanced Grant of nearly 2.5m euros, the Leverhulme Trust, the Society for Libyan Studies and the GeoEye Foundation.

“It is a new start for Libya’s antiquities service and a chance for the Libyan people to engage with their own long-suppressed history,” says Professor Mattingly.

“These represent the first towns in Libya that weren’t the colonial imposition of Mediterranean people such as the Greeks and Romans. The Garamantes should be central to what Libyan school children learn about their history and heritage.”

Challenges of the “data tsunami” (FAPESP)

Launched by the Microsoft Research-FAPESP Institute for IT Research, the book O Quarto Paradigma (The Fourth Paradigm) discusses the challenges of eScience, a new field dedicated to handling the immense volume of information that characterizes today’s science

07/11/2011

By Fábio de Castro

Agência FAPESP – If a few years ago the lack of data limited the advance of science, today the problem has been inverted. The development of new data-capture technologies, in the most varied fields and at the most varied scales, has generated such an immense volume of information that the excess has become a bottleneck for scientific progress.

In this context, computer scientists have been joining forces with specialists from different fields to develop new concepts and theories capable of handling the flood of data in contemporary science. The result is called eScience.

That is the subject of the book O Quarto Paradigma – Descobertas científicas na era da eScience (The Fourth Paradigm: Data-Intensive Scientific Discovery), released on November 3 by the Microsoft Research-FAPESP Institute for IT Research.

Edited by Tony Hey, Stewart Tansley and Kristin Tolle, all of Microsoft Research, the book was launched at FAPESP headquarters, at an event attended by the Foundation’s scientific director, Carlos Henrique de Brito Cruz.

At the launch, Roberto Marcondes Cesar Jr., of the Institute of Mathematics and Statistics (IME) of the University of São Paulo (USP), gave the talk “eScience in Brazil”. “The Fourth Paradigm: data-intensive computing advancing scientific discovery” was the topic of the talk by Daniel Fay, director of Earth, Energy and Environment at Microsoft Research.

Brito Cruz highlighted FAPESP’s interest in stimulating the development of eScience in Brazil. “FAPESP is closely connected to this idea, because many of our projects and programs have this need for greater capacity to manage large data sets. Our great challenge lies in the science behind this capacity to handle large volumes of data,” he said.

Initiatives such as the FAPESP Research Program on Global Climate Change (PFPMCG), BIOTA-FAPESP and the FAPESP Bioenergy Research Program (BIOEN) are examples of programs with a great need to integrate and process immense volumes of data.

“We know that science advances when new instruments become available. Yet scientists do not usually perceive the computer as a major new instrument that revolutionizes science. FAPESP is interested in actions that make the scientific community aware of the great challenges in eScience,” said Brito Cruz.

The book is a collection of 26 technical essays divided into four sections: “Earth and environment”, “Health and wellbeing”, “Scientific infrastructure” and “Scholarly communication”.

“The book is about the emergence of a new paradigm for scientific discovery. For thousands of years, the prevailing paradigm was experimental science, founded on the description of natural phenomena. A few hundred years ago came the paradigm of theoretical science, symbolized by Newton’s laws. A few decades ago computational science emerged, simulating complex phenomena. Now we arrive at the fourth paradigm, which is that of data-driven science,” said Fay.

With the advent of the new paradigm, he said, the very nature of scientific discovery has changed completely. Complex models have come into play, spanning broad spatial and temporal scales, which demand ever more multidisciplinary interaction.

“The data, in incredible quantities, come from different sources; they too require a multidisciplinary approach and, often, real-time processing. Scientific communities are also more distributed. All of this has transformed the way discoveries are made,” said Fay.

Ecology, one of the fields most affected by large volumes of data, is an example of how the advance of science will increasingly depend on collaboration between academic researchers and computing specialists.

“We live in a storm of remote sensing, cheap ground sensors and data access on the internet. But extracting the variables that science requires from this mass of heterogeneous data remains a problem. It takes specialized knowledge of algorithms, file formats and data cleaning, for example, which is not always accessible to people in ecology,” he explained.

The same happens in fields such as medicine and biology, which benefit from new technologies for recording brain activity or sequencing DNA, or in astronomy and physics, as modern telescopes capture terabytes of information daily and the Large Hadron Collider (LHC) generates petabytes of data each year.

Virtual Institute

According to Cesar Jr., the eScience community in Brazil is growing. The country has 2,167 undergraduate programs in information systems or computer science and engineering. In 2009, 45,000 students graduated in these fields, and graduate education, between 2007 and 2009, comprised 32 programs, 1,000 advisors, 2,705 master’s students and 410 doctoral students.

“Science has moved from the paradigm of data acquisition to that of data analysis. We have different technologies producing terabytes in many fields of knowledge and, today, we can say that these fields focus on analyzing a deluge of data,” said Cesar Jr., a member of FAPESP’s Computer Science and Engineering Area Coordination.

In 2006, the Brazilian Computer Society (SBC) organized a meeting to identify the key problems and main challenges for the field. This led to several proposals for the National Council for Scientific and Technological Development (CNPq) to create a program specifically aimed at this type of problem.

“In 2009, we held a series of workshops at FAPESP, bringing together scientists from fields such as agriculture, climate change, medicine, transcriptomics, games, e-government and social networks to discuss this question. The initiative resulted in excellent collaborations between groups of scientists with similar problems and gave rise to several initiatives,” said Cesar Jr.

The calls for proposals issued by the Microsoft Research-FAPESP Institute for IT Research, he said, have been an important part of the set of initiatives to promote eScience, as has the organization of the São Paulo School of Advanced Science in Computational Image Processing and Visualization. In addition, FAPESP has supported several research projects related to the theme.

“The eScience community in São Paulo has been working with professionals from many fields and publishing in the journals of several of them. This is an indication of the quality the community has acquired to face the great challenge of the coming years,” said Cesar Jr., who wrote the preface to the Brazilian edition of the book.

  • O Quarto Paradigma
    Editors: Tony Hey, Stewart Tansley and Kristin Tolle
    Published: 2011
    Price: R$ 60
    Pages: 263
    More information: www.ofitexto.com.br

Scientists Find Evidence of Ancient Megadrought in Southwestern U.S. (Science Daily)

ScienceDaily (Nov. 6, 2011) — A new study at the University of Arizona’s Laboratory of Tree-Ring Research has revealed a previously unknown multi-decade drought period in the second century A.D. The findings give evidence that extended periods of aridity have occurred at intervals throughout our past.

A cross section of wood shows the annual growth rings trees add with each growing season. Dark bands of latewood form the boundary between each ring and the next. Counting backwards from the bark reveals a tree’s age. (Credit: Photo by Daniel Griffin/Laboratory of Tree-Ring Research)

Almost 900 years ago, in the mid-12th century, the southwestern U.S. was in the middle of a multi-decade megadrought. It was the most recent extended period of severe drought known for this region. But it was not the first.

The second century A.D. saw an extended dry period of more than 100 years characterized by a multi-decade drought lasting nearly 50 years, says a new study from scientists at the University of Arizona.

UA geoscientists Cody Routson, Connie Woodhouse and Jonathan Overpeck conducted a study of the southern San Juan Mountains in south-central Colorado. The region serves as a primary drainage site for the Rio Grande and San Juan rivers.

“These mountains are very important for both the San Juan River and the Rio Grande River,” said Routson, a doctoral candidate in the environmental studies laboratory of the UA’s department of geosciences and the primary author of the study, which is forthcoming in Geophysical Research Letters.

The San Juan River is a tributary for the Colorado River, meaning any climate changes that affect the San Juan drainage also likely would affect the Colorado River and its watershed. Said Routson: “We wanted to develop as long a record as possible for that region.”

Dendrochronology is the science of using the annual growth rings of trees to reconstruct past climate. Because trees normally add a clearly defined growth ring around their trunk each year, counting the rings backwards from the bark allows scientists to determine not only the age of a tree, but also which years were good for growth and which were more difficult.

“If it’s a wet year, they grow a wide ring, and if it’s a dry year, they grow a narrow ring,” said Routson. “If you average that pattern across trees in a region you can develop a chronology that shows what years were drier or wetter for that particular region.”
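
As a minimal sketch of that averaging step, assuming invented ring widths and using simple mean-standardization in place of the spline detrending dendrochronologists actually apply to remove age-related growth trends:

    import numpy as np

    ring_widths = {                      # tree id -> ring width (mm) per year
        "tree_a": [1.2, 0.4, 0.9, 1.1],
        "tree_b": [0.8, 0.3, 0.7, 0.9],
    }

    def site_chronology(series):
        """Standardize each tree by its own mean width, then average across
        trees, so a single fast-growing tree cannot dominate the signal."""
        indices = [np.asarray(w) / np.mean(w) for w in series.values()]
        return np.mean(indices, axis=0)  # >1: wetter years, <1: drier years

    print(site_chronology(ring_widths))  # year 2 stands out as a dry year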

Darker wood, referred to as latewood because it develops in the latter part of the year at the end of the growing season, forms a usually distinct boundary between one ring and the next. The latewood is darker because growth at the end of the growing season has slowed and the cells are more compact.

To develop their chronology, the researchers looked for indications of climate in the past in the growth rings of the oldest trees in the southern San Juan region. “We drove around and looked for old trees,” said Routson.

Few things on Earth are older than a bristlecone pine: among the oldest and longest-living species on the planet, these pines are normally found clinging to the bare rocky landscapes of alpine or near-alpine mountain slopes. The trees, the oldest of which are more than 4,000 years old, are capable of withstanding extreme drought conditions.

“We did a lot of hiking and found a couple of sites of bristlecone pines, and one in particular that we honed in on,” said Routson.

To sample the trees without damaging them, the dendrochronologists used a tool like a metal screw that bores a tiny hole in the trunk of the tree and allows them to extract a sample, called a core. “We take a piece of wood about the size and shape of a pencil from the tree,” explained Routson.

“We also sampled dead wood that was lying about the land. We took our samples back to the lab where we used a visual, graphic technique to match where the annual growth patterns of the living trees overlap with the patterns in the dead wood. Once we have the pattern matched we measure the rings and average these values to generate a site chronology.”

“In our chronology for the south San Juan mountains we created a record that extends back 2,200 years,” said Routson. “It was pretty profound that we were able to get back that far.”

The chronology extends many years earlier than the medieval period, during which two major drought events in that region already were known from previous chronologies.

“The medieval period extends roughly from 800 to 1300 A.D.,” said Routson. “During that period there was a lot of evidence from previous studies for increased aridity, in particular two major droughts: one in the middle of the 12th century, and one at the end of the 13th century.”

“Very few records are long enough to assess the global conditions associated with these two periods of Southwestern aridity,” said Routson. “And the available records have uncertainties.”

But the chronology from the San Juan bristlecone pines showed something completely new:

“There was another period of increased aridity even earlier,” said Routson. “This new record shows that in addition to known droughts from the medieval period, there is also evidence for an earlier megadrought during the second century A.D.”

“What we can see from our record is that it was a period of basically 50 consecutive years of below-average growth,” said Routson. “And that’s within a much broader period that extends from around 124 A.D. to 210 A.D. — about a 100-year-long period of dry conditions.”

“We’re showing that there are multiple extreme drought events that happened during our past in this region,” said Routson. “These megadroughts lasted for decades, which is much longer than our current drought. And the climatic events behind these previous dry periods are really similar to what we’re experiencing today.”

The prolonged drought in the 12th century and the newly discovered event in the second century A.D. may both have been influenced by warmer-than-average Northern Hemisphere temperatures, Routson said: “The limited records indicate there may have been similar La Niña-like background conditions in the tropical Pacific Ocean, which are known to influence modern drought, during the two periods.”

Although natural climate variation has led to extended dry periods in the southwestern U.S. in the past, there is reason to believe that human-driven climate change will increase the frequency of extreme droughts in the future, said Routson. In other words, we should expect similar multi-decade droughts in a future predicted to be even warmer than the past.

Routson’s research is funded by fellowships from the National Science Foundation, the Science Foundation Arizona and the Climate Assessment of the Southwest. His advisors, Woodhouse of the School of Geography and Development and Overpeck of the department of geosciences and co-director of the UA’s Institute of the Environment, are co-authors of the study.

The Human Cause of Climate Change: Where Does the Burden of Proof Lie? (Science Daily)

ScienceDaily (Nov. 3, 2011) — The debate may largely be drawn along political lines, but the human role in climate change remains one of the most controversial questions in 21st century science. Writing in WIREs Climate Change Dr Kevin Trenberth, from the National Center for Atmospheric Research, argues that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role.

Polar bear on melting ice. Experts argue that the evidence for anthropogenic climate change is now so clear that the burden of proof should lie with research which seeks to disprove the human role. (Credit: iStockphoto/Kristian Septimius Krogh)

In response to Trenberth’s argument, a second review, by Dr Judith Curry, focuses on the concept of a ‘null hypothesis’, the default position taken when research is carried out. Currently the null hypothesis for climate change attribution research is that humans have no influence.

“Humans are changing our climate. There is no doubt whatsoever,” said Trenberth. “Questions remain as to the extent of our collective contribution, but it is clear that the effects are not small and have emerged from the noise of natural variability. So why does the science community continue to do attribution studies and assume that humans have no influence as a null hypothesis?”

To show precedent for his position Trenberth cites the 2007 report by the Intergovernmental Panel on Climate Change which states that global warming is “unequivocal,” and is “very likely” due to human activities.

Trenberth also examined climate attribution studies that assume there is no human component, suggesting that this assumption biases results toward finding no human influence, producing misleading statements about the causes of climate change that can grossly underestimate the role of humans in climate events.

“Scientists must challenge misconceptions in the difference between weather and climate while attribution studies must include a human component,” concluded Trenberth. “The question should no longer be is there a human component, but what is it?”
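
To see concretely what is at stake in the choice of null hypothesis, consider a toy significance test. All numbers below are invented; real attribution studies compare observed trends against large ensembles of climate-model runs, not a single Gaussian spread.

    import math

    # Invented numbers, for illustration only; not from any attribution study.
    obs_trend  = 0.17   # observed warming trend, deg C per decade
    natural_sd = 0.06   # spread of trends in unforced control simulations

    def p_value(trend, null_mean, sd=natural_sd):
        """One-sided p-value: the chance of a trend at least this large
        arising if the null hypothesis (mean null_mean) were true."""
        z = (trend - null_mean) / sd
        return 0.5 * math.erfc(z / math.sqrt(2))

    # Conventional null: humans have no influence. Here it is easily rejected.
    print(p_value(obs_trend, 0.0))    # ~0.002
    # Reversed null: human-driven warming of 0.15 C per decade. Not rejected,
    # so the burden of proof shifts to those disputing the human role.
    print(p_value(obs_trend, 0.15))   # ~0.37

Which hypothesis serves as the null decides which claim must clear the significance bar; that allocation of the burden of proof is precisely what Trenberth proposes to reverse.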

In a second paper Dr Judith Curry, from the Georgia Institute of Technology, questions this position, but argues that the discussion on the null hypothesis serves to highlight fuzziness surrounding the many hypotheses related to dangerous climate change.

“Regarding attribution studies, rather than trying to reject either hypothesis regardless of which is the null, there should be a debate over the significance of anthropogenic warming relative to forced and unforced natural climate variability,” said Curry.

Curry also suggested that the desire to reverse the null hypothesis may have the goal of seeking to marginalise the climate sceptic movement, a vocal group who have challenged the scientific orthodoxy on climate change.

“The proponents of reversing the null hypothesis should be careful of what they wish for,” concluded Curry. “One consequence may be that the scientific focus, and therefore funding, would also reverse to attempting to disprove dangerous anthropogenic climate change, which has been a position of many sceptics.”

“I doubt Trenberth’s suggestion will find much support in the scientific community,” said Professor Myles Allen from Oxford University, “but Curry’s counter proposal to abandon hypothesis tests is worse. We still have plenty of interesting hypotheses to test: did human influence on climate increase the risk of this event at all? Did it increase it by more than a factor of two?”

Minister attends inauguration of Ceará’s meteorological radar (Ascom of the government of Ceará)

JC e-mail 4378, November 4, 2011.

Aloizio Mercadante and the governor of Ceará, Cid Gomes, inaugurated the S-Band Meteorological Radar in Quixeramobim (CE). The equipment will help in forecasting droughts and floods.

Forecasts of droughts and floods, climate change and all meteorological events will now be reported by the Ceará Foundation for Meteorology and Water Resources (Funceme) with greater precision, since the agency now has a new piece of equipment for capturing this information.

The new S-Band Meteorological Radar was inaugurated on Thursday (3) by governor Cid Gomes and the minister of Science and Technology, Aloizio Mercadante. Located on the Morro de Santa Maria, in Quixeramobim, in the Sertão Central region, the equipment will operate as part of the Ceará Radar Network (RCR), integrated with the X-Band Doppler Radar installed in Fortaleza. “It looks like a simple piece of equipment, but behind it lies unimaginable usefulness. Technology can be an ally in improving the population’s quality of life, which is our commitment,” Cid Gomes noted during the inauguration.

As the governor explained, the new equipment can report very specific weather conditions, for example “that in the municipality of Nova Olinda, in Cariri, five millimeters of rain fell.” “When one piece of information like this is combined with others, it will help diagnose, for example, a period of drought or of floods. We are a state with almost 300,000 small farmers, and they need concrete information to look after their harvests. For that, the radar will be very useful,” Cid Gomes said.

The S-Band radar can estimate precipitation within a radius of 200 kilometers. It can also monitor weather systems operating within a range of up to 400 kilometers. Thanks to its capacity and location, it will be possible to obtain information not only about Ceará but also about several northeastern states. “This is an instrument for agricultural planning that will also benefit many states of the Northeast, such as Paraíba, Pernambuco, Piauí and Rio Grande do Norte,” said Aloizio Mercadante. The minister also stressed its importance in preventing natural disasters, such as long periods of drought or rainfall well above average. “We need to understand why these events happen and prepare for the effects that climate change can cause,” Mercadante explained.

The installation of the S-Band Meteorological Radar required an investment of R$ 14 million, of which R$ 10 million came from the federal government, through the Ministry of Science, Technology and Innovation (MCTI), and R$ 4 million from the state government of Ceará. Of the total, R$ 12 million went toward purchasing the equipment and the remainder (R$ 2 million) toward improving access to the site (road building) and the power supply.

As the state secretary of Science and Technology, René Barreira, recalled, the radar’s installation originated in a budget amendment by Ciro Gomes when he was a federal congressman, which, together with the support of former president Lula, made the project possible. “With this important piece of equipment we will have agricultural zoning and a more effective, technical monitoring of high-risk events,” the secretary said.

Mathematically Detecting Stock Market Bubbles Before They Burst (Science Daily)

ScienceDaily (Oct. 31, 2011) — From the dotcom bust in the late nineties to the housing crash in the run-up to the 2008 crisis, financial bubbles have been a topic of major concern. Identifying bubbles is important in order to prevent collapses that can severely impact nations and economies.

A paper published this month in the SIAM Journal on Financial Mathematics addresses just this issue. Opening fittingly with a quote from New York Federal Reserve President William Dudley emphasizing the importance of developing tools to identify and address bubbles in real time, authors Robert Jarrow, Younes Kchia, and Philip Protter propose a mathematical model to detect financial bubbles.

A financial bubble occurs when prices for assets, such as stocks, rise far above their actual value. Such an economic cycle is usually characterized by rapid expansion followed by a contraction, or sharp decline in prices.

“It has been hard not to notice that financial bubbles play an important role in our economy, and speculation as to whether a given risky asset is undergoing bubble pricing has approached the level of an armchair sport. But bubbles can have real and often negative consequences,” explains Protter, who has spent many years studying and analyzing financial markets.

“The ability to tell when an asset is or is not in a bubble could have important ramifications in the regulation of the capital reserves of banks as well as for individual investors and retirement funds holding assets for the long term. For banks, if their capital reserve holdings include large investments with unrealistic values due to bubbles, a shock to the bank could occur when the bubbles burst, potentially causing a run on the bank, as infamously happened with Lehman Brothers, and is currently happening with Dexia, a major European bank,” he goes on to explain, citing the significance of such inflated prices.

Using sophisticated mathematical methods, Protter and his co-authors answer the question of whether the price increase of a particular asset represents a bubble in real time. “[In this paper] we show that by using tick data and some statistical techniques, one is able to tell with a large degree of certainty, whether or not a given financial asset (or group of assets) is undergoing bubble pricing,” says Protter.

This question is answered by estimating an asset’s price volatility, which is stochastic, or randomly determined. The authors define an asset’s price process in terms of a standard stochastic differential equation driven by Brownian motion. Brownian motion, based on the natural process involving the erratic, random movement of small particles suspended in gas or liquid, has been widely used in mathematical finance; it is used specifically to model processes whose future changes are independent of their past changes.

The key characteristic in determining a bubble is the volatility of an asset’s price, which, in the case of a bubble, is very high. The authors estimate the volatility by applying state-of-the-art estimators to real-time tick price data for a given stock. They then obtain the best possible extension of this estimated volatility to large values of the price, where tick data is unavailable, using a technique called Reproducing Kernel Hilbert Spaces (RKHS), a widely used method in statistical learning.

“First, one uses tick price data to estimate the volatility of the asset in question for various levels of the asset’s price,” Protter explains. “Then, a special technique (RKHS with an optimization addition) is employed to extrapolate this estimated volatility function to large values for the asset’s price, where this information is not (and cannot be) available from tick data. Using this extrapolation, one can check the rate of increase of the volatility function as the asset price gets arbitrarily large. Whether or not there is a bubble depends on how fast this increase occurs (its asymptotic rate of increase).”

If it does not increase fast enough, there is no bubble within the model’s framework.
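
A rough sketch of the pipeline, with two simplifications flagged loudly: volatility is estimated here by crude price-bin averaging rather than the nonparametric estimators the authors apply, and the extrapolation is a power-law fit rather than their RKHS method. The final test encodes the strict-local-martingale criterion underlying the model, as I read it: for a price process dS = σ(S)dW, a bubble corresponds to σ(x) growing faster than linearly, so that ∫ x/σ(x)² dx converges.

    import numpy as np

    def estimate_sigma(prices, dt, n_bins=20):
        """Estimate sigma(x) from tick data, assuming dS = sigma(S) dW, so
        that E[(dS)^2 | S = x] is approximately sigma(x)^2 * dt."""
        prices = np.asarray(prices)
        ds, levels = np.diff(prices), prices[:-1]
        edges = np.linspace(levels.min(), levels.max(), n_bins + 1)
        which = np.digitize(levels, edges) - 1
        pts = [(levels[which == b].mean(),
                np.sqrt(np.mean(ds[which == b] ** 2) / dt))
               for b in range(n_bins) if np.sum(which == b) > 10]
        x, sigma = np.array(pts).T
        return x, sigma

    def flags_bubble(x, sigma):
        """Fit sigma(x) ~ c * x^alpha; alpha > 1 means sigma grows faster
        than linearly, the integral of x / sigma(x)^2 converges, the price
        is a strict local martingale, and the model flags a bubble."""
        alpha, _ = np.polyfit(np.log(x), np.log(sigma), 1)
        return alpha > 1.0, alpha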

The authors test their methodology by applying the model to several stocks from the dot-com bubble of the nineties. Their predictions prove fairly successful, with higher accuracy in cases where market volatility can be modeled more effectively. This helps establish the strengths and weaknesses of the method.

The authors have also used the model to test more recent price increases to detect bubbles. “We have found, for example, that the IPO [initial public offering] of LinkedIn underwent bubble pricing at its debut, and that the recent rise in gold prices was not a bubble, according to our models,” Protter says.

It is encouraging to see that mathematical analysis can play a role in the diagnosis and detection of bubbles, which have significantly impacted economic upheavals in the past few decades.

Robert Jarrow is a professor at the Johnson Graduate School of Management at Cornell University in Ithaca, NY, and managing director of the Kamakura Corporation. Younes Kchia is a graduate student at Ecole Polytechnique in Paris, and Philip Protter is a professor in the Statistics Department at Columbia University in New York.

Professor Protter’s work was supported in part by NSF grant DMS-0906995.

Doctors Can Learn Empathy Through a Computer-Based Tutorial (Science Daily)

ScienceDaily (Oct. 31, 2011) — Cancer doctors want to offer a sympathetic ear, but sometimes miss the cues from patients. To help physicians better address their patients’ fears and worries, a Duke University researcher has developed a new interactive training tool.

The computer tutorial includes feedback on the doctors’ own audio recorded visits with patients, and provides an alternative to more expensive courses.

In a study appearing Nov. 1, 2011, in the Annals of Internal Medicine, the research team found that the course resulted in more empathic responses from oncologists, and patients reported greater trust in their doctors — a key component of care that enhances quality of life.

“Earlier studies have shown that oncologists respond to patient distress with empathy only about a quarter of the time,” said James A. Tulsky, MD, director of the Duke Center for Palliative Care and lead author of the study.

“Often, when patients bring up their worries, doctors change the subject or focus on the medical treatment, rather than the emotional concern. Unfortunately, this behavior sends the message, ‘This is not what we’re here to talk about.'”

Tulsky said cancer doctors have many reasons for avoiding emotionally fraught conversations. Some worry that the exchanges will cause rather than ease stress, or that they don’t have time to address non-medical concerns.

Neither is true, Tulsky said, noting his research shows that asking the right questions during patient visits can actually save time and enhance patient satisfaction.

“Oncologists are among the most devoted physicians — passionately committed to their patients. Unfortunately, their patients don’t always know this unless the doctors articulate their empathy explicitly,” Tulsky said. “It’s a skill set. It’s not that the doctors are uncaring, it’s just that communication needs to be taught and learned.”

The current gold standard for teaching empathy skills is a multiday course that involves short lectures and role-playing with actors hired to simulate clinical situations. Such courses are time-consuming and expensive, costing upwards of $3,000 per physician.

Tulsky’s team at Duke developed a computer program that models what happens in these courses. The doctors receive feedback on pre-recorded encounters, and are able to complete the intervention in their offices or homes in a little more than an hour, at a cost of about $100.

To test its effectiveness, Tulsky and colleagues enrolled 48 doctors at Duke, the Veterans Affairs Medical Center in Durham, NC, and the University of Pittsburgh Medical Center. The research team audio-recorded four to eight visits between the doctors and their patients with advanced cancer.

All the doctors then attended an hour-long lecture on communication skills. Half were randomly assigned to receive a CD-ROM tutorial, the other half received no other intervention.

The CD taught the doctors basic communication skills, including how to recognize and respond to opportunities in conversations when patients share a negative emotion, and how to share information about prognosis. Doctors also heard examples from their own clinic encounters, with feedback on how they could improve. They were asked to commit to making changes in their practice and then reminded of these prior to their next clinic visits.

Afterward, all the doctors were again recorded during patient visits, and the encounters were assessed by both patients and trained listeners who evaluated the conversations for how well the doctors responded to empathic statements.

Oncologists who had not taken the CD course made no improvement in the way they responded to patients when confronted with concerns or fears. Doctors in the trained group, however, responded empathically twice as often as those who received no training. In addition, they were better at eliciting patient concerns, using tactics to promote conversations rather than shut them down.

“Patient trust in physicians increased significantly,” Tulsky said, adding that patients report feeling better when they believe their doctors are on their side. “This is exciting, because it’s an easy, relatively inexpensive way to train physicians to respond to patients’ most basic needs.”

Although the CD course is not yet widely available, efforts are underway to develop it for broader distribution.

In addition to Tulsky, study authors include: Robert M. Arnold; Stewart C. Alexander; Maren K. Olsen; Amy S. Jeffreys; Keri L. Rodriguez; Celette Sugg Skinner; David Farrell; Amy P. Abernethy; and Kathryn I. Pollak.

Funding for the study came from the National Cancer Institute. Study authors reported no conflicts.

Putting the Body Back Into the Mind of Schizophrenia (Science Daily)

ScienceDaily (Oct. 31, 2011) — A study using a procedure called the rubber hand illusion has found striking new evidence that people experiencing schizophrenia have a weakened sense of body ownership and has produced the first case of a spontaneous, out-of-body experience in the laboratory.

These findings suggest that movement therapy, which trains people to be focused and centered on their own bodies, including some forms of yoga and dance, could be helpful for many of the 2.2 million people in the United States who suffer from this mental disorder.

The study, which appears in the Oct. 31 issue of the scientific journal Public Library of Science One, measured the strength of body ownership of 24 schizophrenia patients and 21 matched control subjects by testing their susceptibility to the “rubber hand illusion” or RHI. This tactile illusion, which was discovered in 1998, is induced by simultaneously stroking a visible rubber hand and the subject’s hidden hand.

“After a while, patients with schizophrenia begin to ‘feel’ the rubber hand and disown their own hand. They also experience their real hand as closer to the rubber hand,” said Sohee Park, the Gertrude Conaway Vanderbilt Chair of Psychology and Psychiatry, who conducted the study with doctoral candidate Katharine Thakkar and research analysts Heathman Nichols and Lindsey McIntosh.

“Healthy people get this illusion too, but weakly,” Park said. “Some don’t get it at all, and there is a wide range of individual differences in how people experience this illusion that is related to a personality trait called schizotypy, associated with psychosis-proneness.”

Body ownership is one of two aspects of a person’s sense of self awareness. (The other aspect is self-agency, the sense that a person is initiating his or her own actions.) According to the researchers, the finding that schizophrenia patients are more susceptible to the rubber hand illusion suggests that they have a more flexible body representation and weakened sense of self compared to healthy people.

“What’s so interesting about Professor Park’s study is that they have found that the sense of bodily ownership does not diminish among patients with schizophrenia, but it can be extended to other objects more easily,” observed David Gray, Mellon assistant professor of philosophy at Vanderbilt, who is an expert on the philosophy of the mind. He did not participate in the study but is familiar with it. “Much of the literature concerning agency and ownership in schizophrenia focuses on the sense of lost agency over one’s own movements: But, in these cases, the sense of ownership is neither diminished nor extended.”

Before they began the procedure, the researchers gave participants a questionnaire to rate their degree of schizotypy: the extent to which they experience perceptual effects related to the illusion. The researchers found that the individuals who rated higher on the scale were more susceptible to the illusion.

The researchers gauged the relative strength of the RHI by asking participants to estimate the position of the index finger of their hidden hand on rulers placed on top of the box that conceals it before and after stimulation. The stronger the effect, the more the subjects’ estimate of the position of their hidden hand shifted in the direction of the rubber hand. Even the estimates of those who did not experience the effect subjectively shifted slightly.
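
As a worked example of that measure, with invented numbers: suppose a subject reports the felt position of the hidden hand three times before and three times after stroking, with each report expressed as a signed error in centimetres toward the rubber hand.

    # Invented position judgments (cm, signed toward the rubber hand).
    pre  = [0.5, -0.2, 0.1]    # before stroking
    post = [2.1,  1.4, 1.8]    # after stroking

    # Proprioceptive drift: mean shift of the felt hand position; a larger
    # positive drift indicates a stronger rubber hand illusion.
    drift = sum(b - a for a, b in zip(pre, post)) / len(pre)
    print(f"drift = {drift:.2f} cm")   # 1.63 cm toward the rubber hand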

The rubber hand illusion also has a physiological signature. Scientists don’t know why, but the temperature of the hidden hand drops by a few tenths of a degree when a person experiences the illusion. “It’s almost as if the hand is disowned and rejected, no longer part of the self,” Park commented.

The researchers were surprised when one of the patients undergoing the procedure experienced a full out-of-body experience. He reported that he was floating above his own body for about 15 minutes. According to Park, it is extremely rare to observe spontaneous out-of-body experiences in the laboratory. When they invited the patient back for a second session, he once again had an out-of-body experience during the rubber hand procedure, proving that the experience is repeatable.

“Anomalous experiences of the self were considered to be core features of schizophrenia decades ago but in recent years much of the emphasis has been on cognitive functions such as working memory,” said Park.

According to the psychologist, out-of-body experiences and body ownership are associated with a particular area in the brain called the temporoparietal junction. Lesions in this area and stimulation by strong magnetic fields can elicit out-of-body experiences. The new study suggests that disorders in this part of the brain may also contribute to the symptoms of schizophrenia.

The relationship between schizophrenia and body ownership may help explain the results of a German study published in 2008, which found that a 12-week exercise program reduced the symptoms and improved the behavior of a small group of patients with chronic schizophrenia, compared with a control group that did not exercise. The study also found that the exercise slightly increased the size of the patients’ hippocampus; a smaller-than-normal hippocampus is a well-established feature of schizophrenia.

“Exercise is inexpensive and obviously has a broad range of beneficial effects, so if it can also reduce the severity of schizophrenia, it is all to the good,” said Park. These findings suggest that focused physical exercise which involves precise body control, such as yoga and dancing, could be a beneficial form of treatment for this disorder.

The study was partly funded by a grant from the National Institutes of Health and the Gertrude Conaway Vanderbilt Endowed Chair.

The future of science lies in collaboration (Valor Econômico)

JC e-mail 4376, November 1, 2011.

Text by Michael Nielsen published in The Wall Street Journal and reprinted by Valor Econômico.

In January 2009, a University of Cambridge mathematician named Tim Gowers decided to use his blog to carry out an unusual social experiment. He chose a difficult mathematical problem and tried to solve it in the open, using the blog to post his ideas and his progress. He invited everyone to contribute ideas, in the hope that many minds together would be more powerful than one. He called the experiment the Polymath Project.

Fifteen minutes after Gowers opened his blog for discussion, a Hungarian-Canadian mathematician posted a comment. Fifteen minutes later, an American high-school mathematics teacher joined the conversation. Three minutes after that, the mathematician Terence Tao, of the University of California, Los Angeles, commented as well. The discussion caught fire, and in just six weeks the problem was solved.

Although other challenges arose and the network’s collaborators did not always find every solution, they managed to create a new approach to solving problems. Their work is one example of the experiments in collaborative science now being carried out, in studies of everything from galaxies to dinosaurs.

These projects use the internet as a cognitive tool to amplify collective intelligence. Such tools are a means of connecting the right people to the right problems at the right time, activating knowledge that would otherwise remain latent.

Networked collaboration has the potential to accelerate extraordinarily the pace of discovery in science as a whole. We are likely to see a more fundamental change in scientific research over the next few decades than occurred over the past three centuries.

But there are big obstacles to reaching that goal. Although it might seem natural for scientists to adopt these new tools of discovery, they have in fact shown a surprising reluctance. Initiatives like the Polymath Project remain the exception, not the rule.

Consider the simple idea of sharing scientific data online. The best example of this is the Human Genome Project, whose data can be downloaded by anyone. When you read in the news that a certain gene has been associated with some disease, it is practically certain that the discovery was made possible by the project’s open-data policy.

Despite the enormous value of openly releasing data, most laboratories make no systematic effort to share their information with other scientists. As one biologist told me, he had been “sitting on the genome” of an entire new species for more than a year. An entire species! Imagine the crucial discoveries other scientists might have made if that genome had been uploaded to an open database.

Why don’t scientists like to share?

If you are a scientist seeking a job or research funding, the biggest factor determining your success will be the number of scientific publications you have produced. If your record is brilliant, you will do well. If it is not, you will have problems. So you devote your working days to producing papers for academic journals.

Even if you personally believe it would be far better for science as a whole if you organized and shared your data on the internet, doing so takes time away from the “real” work of writing papers. Sharing data is not something your colleagues will give you credit for, except in a few fields.

There are other areas in which scientists still lag in the use of online tools. One example is the wikis created by brave pioneers on subjects such as quantum computing, string theory and genetics (a wiki allows the collaborative sharing and editing of a set of interlinked pages; Wikipedia is the best known of them).

Specialized wikis can serve as up-to-date reference works on a field’s latest research, like textbooks that evolve at high speed. They can include descriptions of important unsolved scientific problems, and they can serve as tools for finding solutions.

But most of these wikis have failed. They face the same problem as data sharing: even scientists who believe in the value of collaboration know that writing a single mediocre paper will do far more for their careers. The incentives are completely wrong.

For networked science to reach its potential, scientists must embrace and reward the open sharing of all scientific knowledge, not just what is published in traditional academic journals. Networked science must be open.

Michael Nielsen is one of the pioneers of quantum computing and the author of “Reinventing Discovery: The New Era of Networked Science”, from which this text was adapted.

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT


As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30C, the humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women patiently sit, each accompanied by an unwieldy-looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising: they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo, barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matt Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.

The broad alliance to stem birth rates was beginning to dissolve, and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of a mother and baby, captioned: “Practicing birth control is beneficial for the protection of the health of mother and child”. China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue ‘Don’t worry, everything’s fine now that we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service


Vital Details of Global Warming Are Eluding Forecasters (Science)

Science 14 October 2011:
Vol. 334 no. 6053 pp. 173-174
DOI: 10.1126/science.334.6053.173

PREDICTING CLIMATE CHANGE

Richard A. Kerr

Decision-makers need to know how to prepare for inevitable climate change, but climate researchers are still struggling to sharpen their fuzzy picture of what the future holds.

Seattle Public Utilities officials had a question for meteorologist Clifford Mass. They were planning to install a quarter-billion dollars’ worth of storm-drain pipes that would serve the city for up to 75 years. “Their question was, what diameter should the pipe be? How will the intensity of extreme precipitation change?” Mass says. If global warming means that the past century’s rain records are no guide to how heavy future rains will be, he was asked, what could climate modeling say about adapting to future climate change? “I told them I couldn’t give them an answer,” says the University of Washington (UW), Seattle, researcher.
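The stakes of that question are easy to see with back-of-the-envelope hydraulics the article leaves implicit. Under the standard Manning formula, a full-flowing circular pipe carries flow roughly in proportion to the 8/3 power of its diameter, so a given increase in design rainfall intensity implies a diameter increase of roughly that intensity factor to the 3/8 power. A minimal Python sketch, with invented intensity figures:

# Back-of-the-envelope sketch (not from the article): Manning's equation
# gives a full circular pipe a capacity Q proportional to D**(8/3), so a
# change in design rainfall intensity maps to a diameter change of
# (1 + increase)**(3/8). The intensity increases below are invented.
for intensity_increase in (0.10, 0.25, 0.50):
    diameter_factor = (1 + intensity_increase) ** (3 / 8)
    print(f"{intensity_increase:.0%} heavier design storm -> "
          f"{diameter_factor - 1:.1%} larger pipe diameter")

Even a 50% jump in design intensity asks for only about a 16% wider pipe, but with a 75-year service life and a quarter-billion dollars at stake, the utility still needed a number the models could not supply.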

Climate researchers are quite comfortable with their projections for the world under a strengthening greenhouse, at least on the broadest scales. Relying heavily on climate modeling, they find that on average the globe will continue warming, more at high northern latitudes than elsewhere. Precipitation will tend to increase at high latitudes and decrease at low latitudes.

But ask researchers what’s in store for the Seattle area, the Pacific Northwest, or even the western half of the United States, and they’ll often demur. As Mass notes, “there’s tremendous uncertainty here,” and he’s not just talking about the Pacific Northwest. Switching from global models to models focusing on a single region creates a more detailed forecast, but it also “piles uncertainty on top of uncertainty,” says meteorologist David Battisti of UW Seattle.

First of all, there are the uncertainties inherent in the regional model itself. Then there are the global model’s uncertainties at the regional scale, which it feeds into the regional model. As the saying goes, if the global model gives you garbage, regional modeling will only give you more detailed garbage. And still more uncertainties are created as data are transferred from the global to the regional model.
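A rough way to see why added detail cannot rescue the forecast: if the three error sources are treated as independent, their variances add, so the combined uncertainty can never drop below the global model's own. A toy sketch, with invented error magnitudes:

# Illustrative only: independent error sources combine in quadrature, so
# regional detail cannot cancel the global model's error. Numbers invented.
global_err, regional_err, transfer_err = 1.0, 0.6, 0.3   # std devs, in C
combined = (global_err**2 + regional_err**2 + transfer_err**2) ** 0.5
print(f"combined uncertainty: {combined:.2f} C; global alone: {global_err:.2f} C")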

Although uncertainties abound, “uncertainty tends to be downplayed in a lot of [regional] modeling for adaptation,” says global modeler Christopher Bretherton of UW Seattle. But help is on the way. Regional modelers are well into their first extensive comparison of global-regional model combinations to sort out the uncertainties, although that won’t help Seattle’s storm-drain builders.

Most humble origins

Policymakers have long asked for regional forecasts to help them adapt to climate change, some of which is now unavoidable. Even immediate, rather drastic action to curb emissions of greenhouse gases would probably not limit global warming to 2°C, generally considered the threshold above which “dangerous” effects set in. And nothing at all can be done to head off the warming expected over the next several decades; that much climate change is already locked in.

Sharp but true? Feeding a global climate model’s prediction for midcentury (top) into a regional model gives more details (bottom), but modelers aren’t sure how accurate the details are. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

So scientists have been doing what they can for decision-makers. Early on, it wasn’t much. A U.S. government assessment released in 2000, Climate Change Impacts on the United States, relied on the most rudimentary regional forecasting technique (Science, 23 June 2000, p. 2113). Expert committee members divided the country into eight regions and then considered what two of their best global climate models had to say about each region over the next century. The two models were somewhat consistent in the far southwest, where the report’s authors found it was likely that warmer and drier conditions would eliminate alpine ecosystems and shorten the ski season.

But elsewhere, there was far less consistency. Over the eastern two-thirds of the contiguous 48 states, for example, the two models couldn’t agree on how much moisture soils would hold in the summer. Kansas corn would either suffer severe droughts more frequently, as one model had it, or enjoy even more moisture than it currently does, as the other indicated. But at least the uncertainties were plain for all to see.

The uncertainties of regional projections nearly faded from view in the next U.S. effort, Global Climate Change Impacts in the United States. The 2009 study drew on not two but 15 global models melded into single projections. In a technique called statistical downscaling, its authors assumed that local changes would be proportional to changes on the larger scales. And they adjusted regional projections of future climate according to how well model simulations of past climate matched actual climate.
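That description amounts to two operations: scale the local change with the large-scale change, and correct for the model's historical bias against observations. A minimal Python sketch of the logic, using a simple mean-based correction and invented numbers (the assessment's actual procedure was more elaborate):

import numpy as np

def statistical_downscale(gcm_hist, gcm_future, obs_hist):
    """Toy delta-method downscaling for a single location.

    Assumes the local future change tracks the large-scale change, and
    anchors the projection to the observed local baseline, which removes
    the model's mean bias over the historical period.
    """
    delta = gcm_future.mean() - gcm_hist.mean()   # large-scale change signal
    bias = gcm_hist.mean() - obs_hist.mean()      # model error over the past
    return obs_hist.mean() + delta, bias

# Invented temperatures (C), for illustration only
gcm_hist = np.array([14.1, 14.3, 14.0, 14.5])
gcm_future = np.array([16.2, 16.0, 16.4, 16.1])
obs_hist = np.array([13.2, 13.5, 13.1, 13.4])

projection, bias = statistical_downscale(gcm_hist, gcm_future, obs_hist)
print(f"local projection: {projection:.1f} C (model bias was {bias:+.1f} C)")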

Statistical downscaling yielded a broad warming across the lower 48 states with less warming across the southeast and up the West Coast. Precipitation was mostly down, especially in the southwest. But discussion of uncertainties in the modeling fell largely to a footnote (number 110), in which the authors cite a half-dozen papers to support their assertion that statistical downscaling techniques are “well-documented” and thoroughly corroborated.

The other sort of downscaling, known as dynamical downscaling or regional modeling, has yet to be fully incorporated into a U.S. national assessment. But an example of state-of-the-art regional modeling appeared 30 June in Environmental Research Letters. To investigate what will happen in the U.S. wine industry, regional modeler Noah Diffenbaugh of Purdue University in West Lafayette, Indiana, and his colleagues embedded a detailed model that spanned the lower 48 states in a climate model that spanned the globe. The global model’s relatively fuzzy simulation of evolving climate from 1950 to 2039—calculated at points about 150 kilometers apart—then fed into the embedded regional model, which calculated a sharper picture of climate change at points only 25 kilometers apart.
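The nesting step itself can be pictured simply: the global model's coarse output is interpolated onto the finer regional grid, where it drives the embedded model as a boundary condition. A one-dimensional toy version in Python (real nesting passes full three-dimensional fields that evolve in time):

import numpy as np

# Toy grid refinement: global-model values ~150 km apart are interpolated
# to a regional grid with 25 km spacing. The temperatures are invented.
coarse = np.array([12.0, 13.5, 15.0, 14.0])      # global-model values (C)
x_coarse = np.arange(len(coarse)) * 150.0        # positions, in km
x_fine = np.arange(0.0, x_coarse[-1] + 1, 25.0)  # regional grid, 25 km

fine_boundary = np.interp(x_fine, x_coarse, coarse)
print(f"{len(coarse)} coarse points -> {len(x_fine)} regional boundary points")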

Closely analyzing the regional model’s temperature projections on the West Coast, the group found that the projected warming would decrease the area suitable for production of premium wine grapes by 30% to 50% in parts of central and northern California. The loss in Washington state’s Columbia Valley would be more than 30%. But adaptation to the warming, such as the introduction of heat-tolerant varieties of grapes, could sharply reduce the losses in California and turn the Washington loss into a 150% gain.

Not so fast

A rapidly growing community of regional modelers is turning out increasingly detailed projections of future climate, but many researchers, mostly outside the downscaling community, have serious reservations. “Many regional modelers don’t do an adequate job of quantifying issues of uncertainty,” says Bretherton, who is chairing a National Academy of Sciences study committee on a national strategy for advancing climate modeling. “We’re not confident predicting the very things people are most interested in being predicted,” such as changes in precipitation.

Regional models produce strikingly detailed maps of changed climate, but they might be far off base. “The problem is that precision is often mistaken for accuracy,” Bretherton says. Battisti just doesn’t see the point of downscaling. “I would never use one of these products,” he says.

The problems start with the global models, as critics see it. Regional models must fill in the detail in the fuzzy picture of climate provided by global models, notes atmospheric scientist Edward Sarachik, professor emeritus at UW Seattle. But if the fuzzy picture of the region is wrong, the details will be wrong as well. And global models aren’t very good at painting regional pictures, he says. A glaring example, according to Sarachik, is the way global models place the cooler waters of the tropical Pacific farther west than they are in reality. Such ocean temperature differences drive weather and climate shifts in specific regions halfway around the world, but with the cold water in the wrong place, the global models drive climate change in the wrong regions.

Gregory Tripoli’s complaint about the global models is that they can’t create the medium-size weather systems that they should be sending into any embedded regional model. Tripoli, a meteorologist and modeler at the University of Wisconsin, Madison, cites the case of summertime weather disturbances that churn down off the Rocky Mountains and account for 80% of the Midwest’s summer rainfall. If a regional model forecasting for Wisconsin doesn’t extend to the Rockies, Wisconsin won’t get the major weather events that add up to be climate. And some atmospheric disturbances travel from as far away as Thailand to wreak havoc in the Midwest, he says, so they could never be included in the regional model.

A tougher nut. Predicting the details of precipitation using a regional model (bottom) fed by a global model (top) is even more uncertain than projecting regional temperature change. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

Even the things the global models get right have a hard time getting into regional models, critics say. “There are a lot of problems matching regional and global models,” Tripoli says. In one problem area, global and regional models usually have different ways of accounting for atmospheric processes such as individual cloud development that neither model can simulate directly, creating further clashes. Even the different philosophies involved in building global models and regional models can lead to mismatches that create phantom atmospheric circulations, Tripoli says. “It’s not straightforward you’re going to get anything realistic,” he says.

Redeeming regional modeling

“You could say all the global and regional models are wrong; some people do say that,” notes regional modeler Filippo Giorgi of the Abdus Salam International Centre for Theoretical Physics in Trieste, Italy. “My personal opinion is we do know something now. A few reports ago, it was really very, very difficult to say anything about regional climate change.”

But Giorgi says that in recent years he has been seeing increasingly consistent regional projections coming from combinations of many different models and from successive generations of models. “This means the projections are more and more reliable,” he says. “I would be confident saying the Mediterranean area will see a general decrease in precipitation in the next decades. I’ve seen this in several generations of models, and we understand the processes underlying this phenomenon. This is fairly reliable information, qualitatively. Saying whether the decrease will be 10% or 50% is a different issue.”
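Giorgi's distinction, confident about the sign but not the size, can be made concrete: in an ensemble, every model may agree that precipitation falls even while the projected magnitudes spread widely. A sketch with invented projections:

import numpy as np

# Invented ensemble of projected precipitation changes (%) for one region
changes = np.array([-12.0, -35.0, -20.0, -48.0, -15.0, -28.0, -40.0])

agree = np.mean(np.sign(changes) == np.sign(np.median(changes)))
print(f"models agreeing on the sign of change: {agree:.0%}")
print(f"magnitude spread: {changes.min():.0f}% to {changes.max():.0f}%")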

The skill of regional climate forecasting also varies from region to region and with what is being forecast. “Temperature is much, much easier” than precipitation, Giorgi notes. Precipitation depends on processes like atmospheric convection that operate on scales too small for any model to render in detail. Trouble simulating convection also means that higher-latitude climate is easier to project than that of the tropics, where convection dominates.

Regional modeling does have a clear advantage in areas with complex terrain such as mountainous regions, notes UW’s Mass, who does regional forecasting of both weather and climate. In the Pacific Northwest, the mountains running parallel to the coast direct onshore winds upward, predictably wringing rain and snow from the air without much difficult-to-simulate convection.

The downscaling of climate projections should be getting a boost as the Coordinated Regional Climate Downscaling Experiment (CORDEX) gets up to speed. Begun in 2009, CORDEX “is really the first time we’ll get a handle on all these uncertainties,” Giorgi says. Various groups will take on each of the world’s continent-size regions. Multiple global models will be matched with multiple regional models and run multiple times to tease out the uncertainties in each. “It’s a landmark for the regional climate modeling community,” Giorgi says.
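The point of the matrix design, every global model paired with every regional model and run repeatedly, is that the resulting spread can then be apportioned between the two layers. A crude sketch of that bookkeeping, with an invented 3-by-2 matrix (real analyses use proper analysis-of-variance methods):

import numpy as np

# Rows: 3 global models; columns: 2 regional models. Projected regional
# warming (C); all numbers invented for illustration.
proj = np.array([[2.1, 2.3],
                 [2.8, 3.0],
                 [1.9, 2.4]])

gcm_spread = proj.mean(axis=1).var()  # variance across global models
rcm_spread = proj.mean(axis=0).var()  # variance across regional models
print(f"GCM-driven variance: {gcm_spread:.3f}; RCM-driven: {rcm_spread:.3f}")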

 

Science 23 June 2000:
Vol. 288 no. 5474 p. 2113
DOI: 10.1126/science.288.5474.2113

GREENHOUSE WARMING

Dueling Models: Future U.S. Climate Uncertain

Richard A. Kerr

When Congress started funding a global climate change research program in 1990, it wanted to know what all this talk about greenhouse warming would mean for United States voters. Ten years later, a U.S. national assessment, drawing on the best available climate model predictions, concludes that the United States will indeed warm, affecting everything from the western snowpacks that supply California with water to New England’s fall foliage. But on a more detailed level, the assessment often draws a blank. Whether the cornfields of Kansas will be gripped by frequent, severe droughts, as one climate model has it, or blessed with more moisture than they now enjoy, as another predicts, the report can’t say. As much as policy-makers would like to know exactly what’s in store for Americans, the rudimentary state of regional climate science will not soon allow it, and the results of this 3-year effort brought the point home.

“This is the first time we’ve tried to take the physical [climate] system and see what effect it might have on ecosystems and socioeconomic systems,” says Thomas Karl, director of the National Oceanic and Atmospheric Administration’s (NOAA’s) National Climatic Data Center in Asheville, North Carolina, and a co-chair of the committee of experts that pulled together the assessment report “Climate Change Impacts on the United States” (available at http://www.nacc.usgcrp.gov/). “We don’t say we know there’s going to be catastrophic drought in Kansas,” he says. “What we do say is, ‘Here’s the range of our uncertainties.’ This document should get people to think.” If anything is certain, Karl says, it’s that “the past isn’t going to be a very good guide to future climate.”

By chance, the assessment had a handy way to convey the range of uncertainty that regional modeling serves up. The report, which divides the country into eight regions, is based on a pair of state-of-the-art climate models—one from the Canadian Climate Center and one from the U.K. Hadley Center for Climate Research and Prediction—that couple a simulated atmosphere and ocean. The two models solved the problems of simplifying a complex world in different ways, leading to very different predicted U.S. climates. “In terms of temperature, the Canadian model is at the upper end of the warming by 2100” predicted by a range of models, says modeler Eric Barron of Pennsylvania State University, University Park, and a member of the assessment team. “The Hadley model is toward the lower end. The Canadian model is on the dry side, and the Hadley model is on the wet side. We’re capturing a substantial portion of the range of simulations. We tried hard to convey that uncertainty.”

On a broad scale, the report can conclude: “Overall productivity of American agriculture will likely remain high, and is projected to increase throughout the 21st century,” although there will be winners and losers from place to place, and adapting agricultural practice to climate change will be key. Where the models are somewhat consistent, as in the far southwest, the report ventures what could be construed as predictions: “It is likely that some ecosystems, such as alpine ecosystems, will disappear entirely from the region,” or “Higher temperatures are likely to mean … a shorter season for winter activities, such as skiing.” Where the models clash, as on summer soil moisture over the eastern two-thirds of the lower 48 states, it explains the alternatives and suggests ways to adapt, such as switching crops.

The range of possible climate impacts laid out by the models “fairly reflects where we are in the science,” says Karl. But he notes that the effort did lack one important input: Congress mandated the assessment without funding it. “You get what you pay for,” says climatologist Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado. “A lot of it was done hastily.” Karl concedes that everyone involved would have liked to have had more funding delivered more reliably.

Even given more time and money, however, the assessment may not have come up with much better small-scale predictions, given the inherent limitations of the science. Even the best models today can say little that’s reliable about climate change at the regional level, never mind at the scale of a congressional district. Their picture of future climate is fuzzy—they might lump together San Francisco and Los Angeles because the models have such coarse geographic resolution—and the realism of such meteorological phenomena as clouds and precipitation is compromised by the inevitable simplifications of simulating the world in a computer.

“For the most part, these sorts of models give a warming,” says modeler Filippo Giorgi, “but they tend to give very different predictions, especially at the regional level, and there’s no way to say one should be believed over another.” Giorgi and his colleague Raquel Francisco of the Abdus Salam International Center for Theoretical Physics in Trieste, Italy, recently evaluated the uncertainties in five coupled climate models—including the two used in the national assessment—within 23 regions, the continental United States comprising roughly three regions. Giorgi concludes that as the scale of prediction shrinks, reliability drops until for small regions “the model data are not believable at all.”

Add in uncertainties external to the models, such as population and economic growth rates, says modeler Jerry D. Mahlman, director of NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, and the details of future climate recede toward unintelligibility. Some people in Congress and the policy community had “almost silly expectations there would be enormously useful, small-scale specifics, if you just got the right model. But the right model doesn’t exist,” says Mahlman.

Still, even though the national assessment does not offer the list of region-by-region impacts that Congress might have hoped for, it does show “where we are adaptable and where we are vulnerable,” says global change researcher Stephen Schneider of Stanford University. In 10 years, modelers say, they’ll do better.