Tag archive: science communication

Data fog: Why some countries’ coronavirus numbers do not add up (Al Jazeera)

Reported numbers of confirmed cases have become fodder for the political gristmill. Here is what non-politicians think.

By Laura Winter – 17 Jun 2020

Students at a university in Germany evaluate data from COVID-19 patients [Reuters]

Have you heard the axiom “In war, truth is the first casualty”?

As healthcare providers around the world wage war against the COVID-19 pandemic, national governments have taken to brawling with researchers, the media and each other over the veracity of the data used to monitor and track the disease’s march across the globe.

Allegations of deliberate data tampering carry profound public health implications. If a country knowingly misleads the World Health Organization (WHO) about the emergence of an epidemic or conceals the severity of an outbreak within its borders, precious time is lost. Time that could be spent mobilising resources around the globe to contain the spread of the disease. Time to prepare health systems for a coming tsunami of infections. Time to save more lives.

No country has claimed that its science or data is perfect: French and US authorities confirmed they had their first coronavirus cases weeks earlier than previously thought.

Still, coronavirus – and the data used to benchmark it – has become grist for the political mill. But if we tune out the voices of politicians and pundits, and listen to those of good governance experts, data scientists and epidemiological specialists, what does the most basic but consequential data – the number of confirmed cases per country – tell us about how various governments around the globe are crunching coronavirus numbers and spinning corona-narratives?

What the good governance advocates say

Similar to how meteorologists track storms, data scientists use models to express how epidemics progress, and to predict where the next hurricane of new infections will batter health systems.

This data is fed by researchers into computer modelling programmes that national authorities and the WHO use to advise countries and aid organisations on where to send medical professionals and equipment, and when to take actions such as issuing lockdown orders.

The WHO also harnesses this data to produce a daily report that news organisations use to provide context around policy decisions related to the pandemic. But, unlike a hurricane, which cannot be hidden, epidemic data can be fudged and manipulated.

“The WHO infection numbers are based on reporting from its member states. The WHO cannot verify these numbers,” said Michael Meyer-Resende, Democracy Reporting International’s executive director.

To date, more than 8 million people have been diagnosed as confirmed cases of COVID-19. Of that number, more than 443,000 have died from the virus, according to Johns Hopkins University.

Those numbers are commonly quoted, but what is often not explained is that they both ultimately hinge on two factors: how many people are being tested, and the accuracy of the tests being administered. These numbers we “fetishise”, said Meyer-Resende, “depend on testing, on honesty of governments and on size of the population”.

“Many authoritarian governments are not transparent with their data generally, and one should not expect that they are transparent in this case,” he said. To test Meyer-Resende’s theory that less government transparency equals less transparent COVID-19 case data, Al Jazeera used Transparency International’s Corruption Perceptions Index and the Economist Intelligence Unit’s Democracy Index as lenses through which to view the number of reported cases of the coronavirus.

Transparency International’s Corruption Perceptions Index

The examination revealed striking differences between the numbers of confirmed COVID-19 cases reported by nations deemed transparent and democratic and those reported by nations perceived to be corrupt and authoritarian.

Denmark, with a population of roughly six million, is ranked in the top 10 of the most transparent and democratic countries. The country reported on May 1 that it had 9,158 confirmed cases of COVID-19, a ratio of 1,581 confirmed cases per million. That was more than triple the world average for that day – 412 cases per million people – according to available data.
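The per-million ratio quoted above is simple arithmetic on the article's own figures. A minimal sketch (the population value of 5.79 million is an assumption chosen to match the article's "roughly six million" and its stated ratio):

```python
# Confirmed cases per million inhabitants, using the May 1 figures
# quoted above. Population is approximate.
def cases_per_million(confirmed, population):
    return confirmed / population * 1e6

denmark = cases_per_million(9_158, 5.79e6)
world_avg = 412  # world average per million on May 1, per the article

print(round(denmark))              # ≈ 1,582 per million
print(round(denmark / world_avg, 1))  # more than triple the world average
```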


Meanwhile, Turkmenistan, a regular in the basement of governance and corruption indexes, maintains that not one of its roughly six million citizens has been infected with COVID-19, even though it borders and has extensive trade with Iran, a regional epicentre of the pandemic.

Also on May 1, Myanmar, with a population of more than 56 million, reported just 151 confirmed cases of infection, a rate of 2.8 infections per million. That is despite the fact that every day, roughly 10,000 workers cross the border into China, where the pandemic began.

On February 4, Myanmar suspended its air links with Chinese cities, including Wuhan, where COVID-19 is said to have originated last December (however, a recent study reported that the virus may have hit the city as early as August 2019).

“That just seems abnormal, out of the ordinary. Right?” said Roberto Kukutschka, Transparency International’s research coordinator, in reference to the numbers of reported cases.

“In these countries where you have high levels of corruption, there are high levels of discretion as well,” he told Al Jazeera. “It’s counter-intuitive that these countries are reporting so few cases, when all countries that are more open about these things are reporting way more. It’s very strange.”

While Myanmar has started taking steps to address the pandemic, critics say a month of preparation was lost to jingoistic denial. Ten days before the first two cases were confirmed, government spokesman Zaw Htay claimed the country was protected by its lifestyle and diet, and because cash is used instead of credit cards to make purchases.

Turkmenistan’s authorities have reportedly removed almost all mentions of the coronavirus from official publications, including a read-out of a March 27 phone call between Uzbek President Shavkat Mirziyoyev and Turkmen President Gurbanguly Berdimuhamedov.

It is unclear if Turkmenistan even has a testing regime.

Russia, on the other hand, touts the number of tests it claims to have performed, but not how many people have been tested – and that is a key distinction because the same person can be tested more than once. Transparency International places Russia in the bottom third of its corruption index.

On May 1, Russia, with a population just above 145 million, reported that it had confirmed 106,498 cases of COVID-19 after conducting an astounding 3.72 million “laboratory tests”. Just 2.9 percent of the tests produced a positive result.


Remember, Denmark’s population is six million, or half that of Moscow. Denmark had reportedly tested 206,576 people by May 1 and had 9,158 confirmed coronavirus cases, a rate of 4.4 percent. Finland, another democracy at the top of the transparency index, has a population of 5.5 million and a positive test result rate of 4.7 percent.
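The positivity rates being compared here follow directly from the figures quoted above. A minimal sketch (note the article's caveat that Russia reports tests performed, not people tested, so treating the two as equivalent is itself an assumption):

```python
# Positive-test rate implied by confirmed cases over tests,
# using the May 1 figures quoted in the article.
def positivity(confirmed, tests):
    return 100 * confirmed / tests

print(f"Russia:  {positivity(106_498, 3_720_000):.1f}%")  # 2.9%
print(f"Denmark: {positivity(9_158, 206_576):.1f}%")      # 4.4%
```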

This discrepancy spurred the editors of PCR News, a Moscow-based Russian-language molecular diagnostics journal, to take a closer look at the Russian test. They reported that in order to achieve a positive COVID-19 result, the sample tested must contain a much higher volume of the virus, or viral load, as compared to the amount required for a positive influenza test result.

In terms of sensitivity or ability to detect COVID-19, the authors wrote: “Is it high or low? By modern standards – low.”

They later added, “The test will not reveal the onset of the disease, or it will be decided too early that the recovering patient no longer releases viruses and cannot infect anyone. And he walks along the street, and he is contagious.”

Presumably, if that person then dies, COVID-19 will not be certified as the cause of death.

Good governance experts see a dynamic at play.


“In many of these countries, the legitimacy of the state depends on not going into crisis,” said Kukutschka, adding that he counts countries with world-class health systems among them.

“Countries who test less will be shown as less of a problem. Countries that test badly will seem as if they don’t have a problem,” said Meyer-Resende. “Numbers are very powerful. They seem objective.”

Meyer-Resende highlighted the case of China. “The Chinese government said for a while that it had zero new cases. That’s a very powerful statement. It says it all with a single digit: ‘We have solved the problem’. Except, it hadn’t. It had changed the way of counting cases.”

China – where the pandemic originated – recently escaped a joint US-Australian-led effort at the World Health Assembly to investigate whether Beijing had for weeks concealed a deadly epidemic from the WHO.

China alerted the WHO about the epidemic on December 31, 2019. Researchers at the University of Hong Kong estimated that the actual number of COVID-19 cases in China, where the coronavirus first appeared, could have been four times greater in the beginning of this year than what Chinese authorities had been reporting to the WHO.

“We estimated that by Feb 20, 2020, there would have been 232,000 confirmed cases in China as opposed to the 55,508 confirmed cases reported,” said the researchers’ report, published in The Lancet.

The University of Hong Kong researchers attribute the discrepancy to ever-changing case definitions, the official guidance that tells doctors which symptoms – and therefore patients – can be diagnosed and recorded as COVID-19. China’s National Health Commission issued no fewer than seven versions of these guidelines between January 15 and March 3.

All of which adds to the confusion.

“Essentially, we are moving in a thick fog, and the numbers we have are no more than a small flashlight,” said Meyer-Resende.

What the epidemiological expert thinks

Dr Ghassan Aziz monitors epidemics in the Middle East. He is the Health Surveillance Program manager at the Doctors Without Borders (MSF) Middle East Unit. He spoke to Al Jazeera in his own capacity and not on behalf of the NGO.

“I think Iran, they’re not reporting everything,” he told Al Jazeera. “It’s fair to assume that [some countries] are underreporting because they are under-diagnosing. They report what they detect.”

He later added that US sanctions against Iran, which human rights groups say have drastically constrained Tehran’s ability to finance imports of medicines and medical equipment, could also be a factor.

“Maybe [it’s] on purpose, and maybe because of the sanctions and the lack of testing capacities,” said Aziz.

Once China shared the novel coronavirus genome on January 24, many governments began in earnest to test their populations. Others have placed limits on who can be tested.

In Brazil, due to a sustained lack of available tests, patients using the public health network in April were tested only if they were hospitalised with severe symptoms. On April 1, Brazil reported that 201 people had died from the virus. That number was challenged by doctors and relatives of the dead. A month later, after one minister of health was fired and another resigned after a week on the job, the testing protocols had not changed.

On May 1, Brazil reported that COVID-19 was the cause of death for 5,901 people. On June 5, Brazil’s health ministry took down the website that reported cumulative coronavirus numbers – only to be ordered by the country’s Supreme Court to reinstate the information.

Right-wing President Jair Bolsonaro has repeatedly played down the severity of the coronavirus pandemic, calling it “a little flu”. Brazilian Supreme Court Justice Gilmar Mendes accused the government of attempting to manipulate statistics, calling it “a manoeuvre of totalitarian regimes”.

Brazil currently has the dubious distinction of having the second-highest number of COVID-19 deaths in the world, behind the US. By June 15, the COVID-19 death toll in the country had surpassed 43,300 people.

Dr Aziz contends that even with testing, many countries customarily employ a “denial policy”. He said in his native country, Iraq, health authorities routinely obfuscate health emergencies by changing the names of outbreaks such as cholera to “endemic diarrhoea”, or Crimean-Congo hemorrhagic fever to “epidemic fever”.

“In Iraq, they give this idea to the people that ‘We did our best. We controlled it,'” Dr Aziz said. “When someone dies, ‘Oh. It’s not COVID-19. He was sick. He was old. This is God’s will. It was Allah.’ This is what I find so annoying.”

What the data scientist says

Sarah Callaghan, a data scientist and the editor-in-chief of Patterns, a data-science journal, told Al Jazeera the numbers of confirmed cases countries report reflect “the unique testing and environmental challenges that each country is facing”.

But, she cautioned: “Some countries have the resources and infrastructure to carry out widespread testing, others simply don’t. Some countries might have the money and the ability to test, but other local issues come into play, like politics.”

According to Callaghan, even in the best of times and under the best of circumstances, collecting data on an infectious disease is both difficult and expensive. But despite the problems with some countries’ data, she remains confident that the data and modelling that are available will contribute much to understanding how COVID-19 spreads, how the virus reacts to different environmental conditions, and what questions still need answers.

Her advice is: “When looking at the numbers, think about them. Ask yourself if you trust the source. Ask yourself if the source is trying to push a political or economic agenda.”

“There’s a lot about this situation that we don’t know, and a lot more misinformation that’s being spread, accidentally or deliberately.”

Opinion | Forty Years Later, Lessons for the Pandemic From Mount St. Helens (New York Times)

By Lawrence Roberts – May 17, 2020

The tensions we now face between science, politics and economics also arose before the country’s most destructive volcanic eruption.

Mr. Roberts is a former editor at ProPublica and The Washington Post.

Mount St. Helens erupted on May 18, 1980.
United Press International

When I met David A. Johnston, it was on a spring evening, about a month before he would be erased from existence by a gigantic cloud of volcanic ash boiling over him at 300 miles per hour. He was coming through the door of a makeshift command center in Vancouver, Wash., the closest city to the graceful snow-capped dome of Mount St. Helens, a volcano that had been dormant for 123 years. This was April 1980, and Mr. Johnston, a 30-year-old geologist, was one of the first scientists summoned to monitor new warning signs from the mountain — shallow earthquakes and periodic bursts of ash and steam.

As a young reporter I had talked my way into the command center. At first Mr. Johnston was wary; he wasn’t supposed to meet the press anymore. His supervisors had played down the chance that the smoking mountain was about to explode, and they had already reprimanded him for suggesting otherwise. But on this night he’d just been setting measuring equipment deep in the surrounding forest, and his runner-thin frame vibrated with excitement, his face flushed under his blond beard, and Mr. Johnston couldn’t help riffing on the likelihood of a cataclysmic event.

“My feeling is when it goes, it’s going to go just like that,” he told me, snapping his fingers. “Bang!” At best, he said, we’d have a couple of hours of warning.

Mr. Johnston was mostly right. Early on a Sunday morning several weeks later, the mountain did blow, in the most destructive eruption in U.S. history. But there was no warning. At his instrument outpost, on a ridge more than five miles from the summit, Mr. Johnston had only seconds to radio in a last message: “Vancouver! Vancouver! This is it!”

A photograph of David Johnston, who was killed when Mount St. Helens erupted.
Chris Sweda/Daily Southtown, via Associated Press

Monday, May 18, marks the 40th anniversary of the 1980 Mount St. Helens eruption, and as we now face our own struggle to gauge the uncertain risks presented by nature, to predict how bad things will get and how much and how long to protect ourselves, it may be useful to revisit the tension back then between science, politics and economics.

The drama played out on a much smaller stage — one region of one state, instead of the whole planet — but many of the same elements were present: Scientists provided a range of educated guesses, and public officials split on how to respond. Business owners and residents chafed at the restrictions put in place, many flouted them, and a few even threatened armed rebellion. In the end, the government mostly accepted the analyses of Mr. Johnston and his fellow geologists. As a result, while the eruption killed 57 people and flattened hundreds of square miles of dense Pacific Northwest forestland, the lives of hundreds, perhaps thousands, were spared.

At the first warning signs, state and federal officials moved to distance people from the mountain. They sought to block nonessential visitors from nearby Spirit Lake, ringed with scout camps and tourist lodges. Other than loggers, few people hung around the peak year-round, but the population surged in late spring and summer, when thousands hiked, camped and moved into vacation homes. Many regulars dismissed the risk. Slipping past roadblocks became a popular activity. Locals sold maps to sightseers and amateur photographers that showed how to take old logging roads up the mountain. The owner of a nearby general store shared a common opinion of the threat: “It’s just plain bull. I lived here 26 years, and nothing like this happened before.”

As with a pandemic, though, the risk was well established: one of the dozen or so volcanoes in the 800-mile Cascade Range might soon turn active. Averaging two eruptions a century, they were overdue. A 1978 report by the U.S. Geological Survey, where Mr. Johnston worked, identified Mount St. Helens as most likely to blow next. Yet forecasting how big the event could be was a matter of art as well as science. Geologists could model only previous explosions and list the possible outcomes. (“That position was difficult for many to accept, because they believed we could and should make predictions,” a U.S.G.S. report said later.)

Some scientists suggested a much larger evacuation, but uncertainty, a hallmark of their discipline, can be difficult for those making real-time public policy. The guidelines from federal and state representatives camped out in Vancouver, and from Washington’s governor, Dixy Lee Ray, often seemed in conflict. Moreover, the Weyerhaeuser Company, which owned tens of thousands of acres of timber, opposed logging restrictions, even as some crews got nervous about working near the rumbling dome.

By mid-April, a bulge grew on the north flank, a clue that highly pressurized magma was trapped and expanding. If it burst, a landslide might bury Spirit Lake. The governor, a conservative Democrat who was a biologist by training, finally agreed to stronger measures. She ordered an inner “red zone” where only scientists and law enforcement personnel could enter, and a “blue zone” open to loggers and property owners with day passes. If the zones didn’t extend as far as many geologists hoped, they were certainly an improvement.

Then the mountain got deceptively quiet. The curve of seismic activity flattened and turned downward. Many grew complacent, and restless. On Saturday, May 17, people with property inside the red zone massed in cars and pickup trucks at the roadblock on State Highway 504. Hearing rumors that some carried rifles, the governor relented, allowing them through, with a police escort, to check on their homes and leave again. The state patrol chief, Robert Landon, told them, “We hope the good Lord will keep that mountain from giving us any trouble.” The property owners vowed to return the next day.

The next day was Sunday. At 8:32 a.m., a powerful quake shook loose the snow-covered north face of Mount St. Helens, releasing the superheated magma, which roared out of the mountain in a lateral blast faster than a bullet train, over the spot where Mr. Johnston stood, mowing down 230 square miles of trees, hurling trunks into the air like twigs. It rained down a suffocating storm of thick gray ash, “a burning sky-river wind of searing lava droplet hail,” as the poet Gary Snyder described it. Mudflows clogged the river valleys, setting off deadly floods. A column of ash soared 15 miles high and bloomed into a mushroom cloud 35 miles wide. Over two weeks, ash would circle the globe. Among the 57 dead were three aspiring geologists besides Mr. Johnston, as well as loggers, sightseers and photographers.

About a week later, the Forest Service took reporters up in a helicopter. I had seen the mountain from the air before the eruption. Now the sprawling green wilderness that appeared endless and permanent had disappeared in a blink. We flew for an hour over nothing but moonscape. The scientists had done their best, but nature flexed a power far more deadly than even they had imagined.

Lawrence Roberts, a former editor at ProPublica and The Washington Post, is the author of the forthcoming “Mayday 1971: A White House at War, a Revolt in the Streets, and the Untold History of America’s Biggest Mass Arrest.”

‘There is no absolute truth’: an infectious disease expert on Covid-19, misinformation and ‘bullshit’ (The Guardian)

Carl Bergstrom’s two disparate areas of expertise merged as reports of a mysterious respiratory illness emerged in January

‘Just because the trend that you see is consistent with a story that someone’s selling, inferring causality is dangerous.’ Photograph: Matthew Horwood/Alamy Stock Photo

Julia Carrie Wong, Tue 28 Apr 2020 11.00 BST

Carl Bergstrom is uniquely suited to understanding the current moment. A professor of biology at the University of Washington, he has spent his career studying two seemingly disparate topics: emerging infectious diseases and networked misinformation. They merged into one the moment reports of a mysterious respiratory illness emerged from China in January.

The coronavirus touched off both a pandemic and an “infodemic” of hoaxes, conspiracy theories, honest misunderstandings and politicized scientific debates. Bergstrom has jumped into the fray, helping the public and the press navigate the world of epidemiological models, statistical uncertainty and the topic of his forthcoming book: bullshit.

The following interview has been edited for length and clarity.

You’ve been teaching a course and have co-written a book about the concept of bullshit. Explain what you mean by bullshit?

The formal definition that we use is “language, statistical figures, data, graphics and other forms of presentation that are intended to persuade by impressing and overwhelming a reader or listener with a blatant disregard for truth or logical coherence”.

The idea with bullshit is that it’s trying to appear authoritative and definitive in a way that’s not about communicating accurately and informing a reader, but rather by overwhelming them, persuading them, impressing them. If that’s done without any allegiance to truth, or accuracy, that becomes bullshit.

We’re all used to verbal bullshit. We’re all used to campaign promises and weasel words, and we’re pretty good at seeing through that because we’ve had a lot of practice. But as the world has become increasingly quantified and the currency of arguments has become statistics, facts and figures and models and such, we’re increasingly confronted, even in the popular press, with numerical and statistical arguments. And this area’s really ripe for bullshit, because people don’t feel qualified to question information that’s given to them in quantitative form.

Are there bullshit narratives about the coronavirus that you are concerned about right now?

What’s happened with this pandemic that we’re not accustomed to in the epidemiology community is that it’s been really heavily politicized. Even when scientists are very well-intentioned and not trying to support any side of the narrative, when they do work and release a paper it gets picked up by actors with political agendas.

Whether it’s talking about seroprevalence or estimating the chance that this is even going to come to the United States at all, each study gets picked up and placed into this little political box and sort of used as a cudgel to beat the other side with.

So even when the material isn’t being produced as bullshit, it’s being picked up and used in the service of that by overstating its claims, by cherry-picking the information that’s out there and so on. And I think that’s kind of the biggest problem that we’re facing.

One example [of intentional bullshit] might be this insistence for a while on graphing the number of cases on a per-capita basis, so that people could say the US response is so much better than the rest of the world because we have a slower rate of growth per capita. That was basically graphical malfeasance or bullshit. When a wildfire starts spreading, you’re interested in how it’s spreading now, not whether it’s spreading in a 100-acre wood or millions of square miles of national forest.

Is there one big lesson that you think that the media should keep in mind as we communicate science to the public? What mistakes are we making?

I think the media has been adjusting really fast and doing really well. When I’m talking about how to avoid misinformation around this I’m constantly telling people to trust the professional fact-based media. Rather than looking for the latest rumor that’s spreading across Facebook or Twitter so that you can have information up to the hour, recognize that it’s much better to have solidly sourced, well-vetted information from yesterday.

Hyper-partisan media are making a huge mess of this, but that’s on purpose. They’ve got a reason to promote hydroxychloroquine or whatever it is and just run with that. They’re not even trying to be responsible.

But one of the biggest things that people [in the media] could do to improve would be to recognize that scientific studies, especially in a fast-moving situation like this, are provisional. That’s the nature of science. Anything can be corrected. There’s no absolute truth there. Each model, each finding is just adding to a weight of evidence in one direction or another.

A lot of the reporting is focusing on models, and most of us probably don’t have any basic training in how to read them or what kind of credence to put in them. What should we know?

The key thing, and this goes for scientists as well as non-scientists, is that people are not doing a very good job thinking about what the purpose of different models are, how the purposes of different models vary, and then what the scope of their value is. When these models get treated as if they’re oracles, then people both over-rely on them and treat them too seriously – and then turn around and slam them too hard for not being perfect at everything.

Are there mistakes that are made by people in the scientific community when it comes to communicating with the public?

We’re trying to communicate as a scientific community in a new way, where people are posting their data in real time. But we weren’t ready for the degree to which that stuff would be picked up and assigned meaning in this highly politically polarized environment. Work that might be fairly easy for researchers to contextualize in the field can be portrayed as something very, very different in the popular press.

The first Imperial College model in March was predicting 1.1 million to 2.2 million American deaths if the pandemic were not controlled. That’s a really scary, dramatic story, and I still think that it’s not unrealistic. That got promoted by one side of the partisan divide. Then Imperial came back and modeled a completely different scenario, where the disease was actually brought under control and suppressed in the US, and they released a subsequent model that said, ‘If we do this, something like 50,000 deaths will occur.’ That was picked up by the other side and used to try to discredit the Imperial College team entirely by saying, ‘A couple of weeks ago they said a million; now they’re saying 50,000. They can’t get anything right.’ And the answer, of course, is that they were modeling two different scenarios.

We’re also not doing enough of deliberately stressing the possible weaknesses of our interpretations. That varies enormously from researcher to researcher and team to team.

It requires a lot of discipline to argue really hard for something but also be scrupulously open about all of the weaknesses in your own argument.

But it’s more important than ever, right? A really good paper will lay out all the most persuasive evidence it can and then in the conclusion section or the discussion section say, ‘OK, here are all the reasons that this could be wrong and here are the weaknesses.’

When you have something that’s so directly policy relevant, and there’s a lot of lives at stake, we’re learning how to find the right balance.

It is a bit of a nightmare to put out data that is truthful, but also be aware that there are bad faith actors at the moment who might pounce on it and use it in a way you didn’t intend.

There’s a spectrum. You have outright bad faith actors – Russian propaganda picking up on things and bots spreading misinformation – and then you have someone like Georgia Governor Brian Kemp, who I wouldn’t call a bad faith actor. He’s a misinformed actor.

There’s so much that goes unsaid in science in terms of context and what findings mean that we don’t usually write in papers. If someone does a mathematical forecasting model, you’re usually not going to have a half-page discussion on the limitations of forecasting. We’re used to writing for an audience of 50 people in the world, if we’re lucky, who have backgrounds that are very similar to our own and have a huge set of shared assumptions and shared knowledge. And it works really well when you’re writing on something that only 50 people in the world care about and all of them have comparable training, but it is a real mess when it becomes pressing, and I don’t think any of us have figured out exactly what to do about that because we’re also trying to work quickly and it’s important to get this information out.

One area that has already become contentious and in some ways politicized is the serology surveys, which are supposed to show what percentage of the population has antibodies to the virus. What are some of the big picture contextual caveats and limitations that we should keep in mind as these surveys come out?

The seroprevalence in the US is a political issue, and so the first thing is to recognize that when anyone is reporting on that stuff, there’s a political context to it. It may even be that some of the research is being done in an implicitly political context, depending on who the funders are or what the orientations and biases of some of the researchers are.

On the scientific side, I think there’s really two things to think about. The first one is the issue of selection bias. You’re trying to draw a conclusion about one population by sampling from a subset of that population and you want to know how close to random your subset is with respect to the thing you’re trying to measure. The Santa Clara study recruited volunteers off of Facebook. The obvious source of sampling bias there is that people desperately want to get tested. The people that want it are, of course, people that think they’ve had it.

The other big piece is understanding the notion of positive predictive value and the way false positive and false negative error rates influence the estimate. And that depends on the incidence of infection in the population.

If you have a test that has a 3% error rate, and the incidence in the population is below 3%, then most of the positives that you get are going to be false positives. And so you’re not going to get a very tight estimate of how many people have it. This has been a real problem with the Santa Clara study. From my read of the paper, their data is actually consistent with nobody being infected. A New York City study, on the other hand, showed 21% seropositive, so even if the test has a 3% error rate, the majority of those positives have to be true positives.
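The arithmetic behind this point is Bayes’ rule. A minimal sketch, assuming 97% sensitivity and specificity (i.e., a 3% error rate) and illustrative prevalence values:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """Fraction of positive test results that are true positives (Bayes' rule)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# At 1% true prevalence, a test with a 3% error rate yields mostly
# false positives; at 21% (the New York figure), mostly true positives.
low = positive_predictive_value(0.01, 0.97, 0.97)   # ~0.25
high = positive_predictive_value(0.21, 0.97, 0.97)  # ~0.90
```

The same test can therefore be nearly useless in a low-prevalence population and quite informative in a high-prevalence one.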

Now that we’ve all had a crash course in models and serosurveys, what are the other areas of science where it makes sense for the public to start getting educated on the terms of the debate?

One that I think will come along sooner or later is interpreting studies of treatments. We’ve dealt with that a little bit with the hydroxychloroquine business but not in any serious way because the hydroxychloroquine work has been pretty weak and the results have not been so positive.

But there are ongoing tests of a large range of existing drugs. And these studies are actually pretty hard to do. There’s a lot of subtle technical issues: what are you doing for controls? Is there a control arm at all? If not, how do you interpret the data? If there is a control arm, how is it structured? How do you control for the characteristics of the population on whom you’re using the drug or their selection biases in terms of who’s getting the drug?

Unfortunately, given what we’ve already seen with hydroxychloroquine, it’s fairly likely that this will be politicized as well. There’ll be a parallel set of issues that are going to come around with vaccination, but that’s more like a year off.

If you had the ability to arm every person with one tool – a statistical tool or scientific concept – to help them understand and contextualize scientific information as we look to the future of this pandemic, what would it be?

I would like people to understand that there are interactions between the models we make, the science we do and the way that we behave. The models that we make influence the decisions that we take individually and as a society, which then feed back into the models and the models often don’t treat that part explicitly.

Once you put a model out there, it creates changes in behavior that pull you out of the domain the model was trying to describe in the first place. We have to be very attuned to that as we try to use models to guide policy.
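The feedback loop described above can be sketched with a toy SIR-style model (purely illustrative; the parameter values and the form of the behavioral response are assumptions, not anything stated in the interview):

```python
def peak_infections(beta0, gamma, days, feedback):
    """Discrete-time SIR model where the contact rate falls as current
    infections rise -- a crude stand-in for behavior change."""
    s, i, r = 0.99, 0.01, 0.0
    peak = i
    for _ in range(days):
        beta = beta0 / (1 + feedback * i)  # more infection -> more caution
        new_inf = beta * s * i
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        peak = max(peak, i)
    return peak

# The same epidemic with and without the behavioral feedback:
peak_naive = peak_infections(0.4, 0.1, 365, feedback=0)
peak_wary = peak_infections(0.4, 0.1, 365, feedback=50)
# The feedback flattens the peak, so a forecast that ignored it would
# overshoot -- precisely because people reacted to the projections.
```

A real forecasting model would need the behavioral response estimated from data; the point here is only that the response changes the very quantity being forecast.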

That’s very interesting, and not what I expected you to say.

What did you expect?

That correlation does not imply causation.

That’s another very good one. Seasonality is a great example there. We’re trying a whole bunch of things at the same time. We’re throwing all kinds of possible solutions at this and lots of things are changing. It’s remarkable to me actually, that so many US states are seeing the epidemic curve decrease. And so there’s a bunch of possibilities there. It could be because people’s behavior is changing. There could be some seasonality there. And there are other possible explanations as well.

But what is really important is that just because the trend that you see is consistent with a story that someone’s selling, there may be many other stories that are also consistent, so inferring causality is dangerous.

Pandemic: scientists gain unprecedented media exposure (Faperj)

Paul Jürgens – Published: 09/04/2020 | Updated: 10/04/2020

Luiz Davidovich: the president of the Brazilian Academy of Sciences
hopes that the positive media exposure science is enjoying
at this moment will not cease once the epidemic is overcome

The arrival of the coronavirus in Brazil had an unprecedented impact on the routines of Brazilian institutions and companies, and on the daily life of the population. The work of the press was no different. Within days, journalists were overhauling their contact lists in search of sources in the scientific community, trying to understand what was at stake with the arrival of Covid-19 and to offer reliable information to their readers. One of the country’s largest-circulation print newspapers announced a few days ago that it was inviting five scientists to take turns writing a daily column in its pages, titled “A Hora da Ciência” (“The Hour of Science”). The FAPERJ Bulletin asked researchers and administrators working in Science, Technology and Innovation what they make of this sudden interest of all media outlets in research in the country, and what legacy it may leave for the scientific community’s relations with journalists once the health crisis is over.

For the president of the Brazilian Academy of Sciences, physicist Luiz Davidovich, the current crisis involves the entire planet, hitting rich and poor alike, who now have the opportunity to follow the most recent advances of science, which through increasingly sophisticated techniques is revealing how the virus acts and is driving teams around the world to find treatments and a vaccine. “The scientific community is having the opportunity to deliver its message, daily, in a clear and objective way, without political partisanship. The first person to announce humanity’s victory over this invisible and insidious enemy will not be a politician. The news will come, first-hand, in a statement written in technical terms, from the research group that discovers the vaccine,” says Davidovich, a professor at the Physics Institute of the Federal University of Rio de Janeiro (UFRJ). He hopes that the positive media exposure science is enjoying at this moment will not cease once the epidemic is brought under control. “We have many threats on the horizon, for example new viruses, which appear frequently, and the issue of climate change. And certainly many discoveries that will change our daily lives, to the benefit of quality of life, are still to come,” he added.

Wilson Savino, coordinator of regional and national integration strategies at the Oswaldo Cruz Foundation (Fiocruz), notes that a significant share of the planet’s population already had reasons to believe in science. He does not think, however, that this necessarily translates into awareness in terms of attitudes and actions. “Only when life is in danger (and in the case of the Covid-19 pandemic this fear has a planetary dimension) does the perception that science can provide answers (re)emerge,” he says. “The media act no differently. Not only do media professionals feel the same, seeking the best possible information from scientists and scientific institutions, but they also know that their readers and listeners are eager for reliable information about their own fates,” he says. Deputy general coordinator of the Arbovirus Research networks, which receive support from FAPERJ, Savino, also a member of the Brazilian Academy of Sciences, hopes that the hunger for scientific answers to questions that matter in society’s life will not disappear once the pandemic is under control. “May science find a new light in the hearts and minds of our Brazil.”

Eliete Bouskela: for the physician and researcher,
bringing society closer to scientists
can yield enormous benefits

The first woman to hold the post of Scientific Director of FAPERJ, physician and researcher Eliete Bouskela says that scientists tend to approach problems more rationally, and that this too contributes to the media’s growing interest in science, especially at a moment like this, in a pandemic. “We researchers treat questions more rationally and transparently, and, when necessary, we do not hesitate to declare that we do not have an answer, that we are searching for solutions,” she says. A Full Professor at the Rio de Janeiro State University (Uerj) and an associate member of the French Academy of Medicine, she believes the press’s current scrutiny of scientists’ work should help bring the scientific community closer to the rest of society. “As we leave the ivory tower and build a channel of communication with the population, this will certainly result in greater public interest in scientific knowledge and in the career of professor and researcher. Bringing society closer to scientists, who are themselves part of society, can yield enormous benefits,” she asserts.

“Stronger and more mature.” That is how physician Antonio Egidio Nardi, Full Professor of Psychiatry at the UFRJ School of Medicine, believes we will emerge from the current health crisis. “Lives will be lost, and that is deeply regrettable. But society will also gain from this Covid-19 crisis, for example through a greater appreciation of education and of investment in science and health.” According to the researcher, both informal websites and the quality media are already discussing, with some scientific grounding, the origin of the pandemic, how it spreads, how to avoid rapid contagion, and possible treatments. “Scientific articles, researchers’ commentaries and editorials from credible journals are circulating on social media to a surprising degree. Science is alive and being widely valued. Scientific knowledge is reaching a broad public. This is the primary goal of science and of scientific societies: to help society live better,” he notes. “Post-pandemic society will be better and will know how to recognize the value of research, of health professionals and of quality education,” predicts the physician, a member of the National Academy of Medicine who receives FAPERJ support for his research through the Cientista do Nosso Estado program.

Engineer Mauricio Guedes, creator and former director of the UFRJ Technology Park, who has held the post of director of Technology at FAPERJ since July 2018, believes humanity is getting a rare opportunity to rethink its model of society. “This massive media exposure of science-related activities, with hours and hours of live broadcasts on television and over the Internet, and in reports that now fill almost all the available space in newspapers and magazines, will certainly make a decisive contribution to the population and the media recognizing the value of research and the central role of scientists and technologists in our future,” he observes. “I see here a new chance to understand, as quickly as possible, that universities and companies need to join forces to advance knowledge while creating large-scale solutions to confront this planetary crisis,” he says. “The world will not be as it was before.”

For physician and immunologist Cláudio Tadeu Daniel-Ribeiro, coordinator of the Center for Malaria Research, Diagnosis and Training at the Oswaldo Cruz Institute (IOC/Fiocruz), the media’s role has been exemplary, checking claims against one another and trying to clear up the public’s doubts. “In this dramatic and frightening context, we are fortunate to see a press that seeks the facts where knowledge is produced, in science, in order to enlighten a society that is uninformed, partly because people do not know how and where to find trustworthy data, and partly because laypeople, acting in the name of views as uninformed as they are detached from the reality of the facts, insist on spreading incorrect news and opinions that confuse the population,” he says.

Antonio Carlos Campos de Carvalho, Full Professor of Physiology and Biophysics at UFRJ’s Carlos Chagas Filho Biophysics Institute, warns that only science can offer solutions to minimize the damage this crisis will do to the world. “In situations of global crisis like the current one, society and governments always turn to science, seeking to project scenarios and the best responses to the problem. The media have realized that without science we are left to the guesswork of people unqualified to handle the crisis,” he says. “If our leaders understand that science can bring rational solutions to our problems, we will see massive support ahead for universities and research institutes through funding agencies such as FAPERJ. Only science and technology generate innovation and social and economic progress. What sustains our economy today is agribusiness, strongly shaped by precisely the scientific and technological advances promoted, in the past, by several national research institutions. With advances in genome-editing techniques, several countries will see significant productivity gains, and I fear for what may happen to the Brazilian economy if we lose our leading position in global agribusiness,” says the adviser for Health to FAPERJ’s Scientific Directorate.

Heading FAPERJ’s International Relations Office, researcher Vânia Paschoalin believes that, faced with a grave threat to human health, in which a re-emerging virus is causing death and suffering, humanity seems to have understood the importance of science for saving lives, reducing human suffering and providing well-being and health. “Scientists have always been available to explain, with knowledge and depth, whatever they are asked. They have thus come to play, at this moment, a very important role of clarification and guidance, owing to the credibility society has always granted them,” she says. For the deputy director of Graduate Studies at UFRJ’s Chemistry Institute, humanity is going through many changes at this moment, and journalists’ interest in hearing from scientists reflects that. “I hope that from now on we will hold science and the tenacious work of scientists in even greater esteem, and that this will translate into regular funding for research, so that scientists can generate and share knowledge for the good of humanity,” she concludes.

Coronavirus pandemic increases trust in science, survey indicates (O Globo)

Original article

Carol Knoploch, April 9, 2020

In ten countries, 85% of respondents said they need to hear more from scientists and less from politicians; in Brazil, the figure was 89%

09/04/2020 – 12:31 / Updated 09/04/2020 – 13:35

A scientist examines, under a microscope, a reusable face mask with a silver-ion coating, in Kaliningrad, Russia. Photo: VITALY NEVAR / REUTERS

RIO – The coronavirus pandemic, which has already killed about 80,000 people and sickened about 1.3 million (official World Health Organization figures as of the 8th), has increased trust in science around the world.


According to the Edelman Trust Barometer survey on “Trust and the Coronavirus”, 85% of respondents said they need to hear more from scientists and less from politicians. In Brazil, the figure was 89%.

As for trusted spokespeople, scientists were the most cited overall (83%), followed by one’s personal doctor (82%), and likewise in Brazil (91% and 86% respectively). Government officials received 48% (overall) and 53% (Brazil) of mentions; respondents could choose more than one answer.

“Perhaps the news we are most waiting for these days is the discovery of a vaccine against the coronavirus. And it will be announced by a scientist,” declared physicist Luiz Davidovich, president of the Brazilian Academy of Sciences. “Science is very present at this moment, all over the world. Here in Brazil, in the media and in the words of our Health minister. (Luiz Henrique) Mandetta constantly emphasizes the role of science in fighting the coronavirus. Scientists across the world are communicating, exchanging information, in this race against time. I don’t know what will happen after this pandemic, but governments and people in general should maintain their support for and trust in scientists.”



The survey was conducted between March 6 and 10, 2020, via online polling in 10 countries: South Africa, Germany, Brazil, Canada, South Korea, the United States, France, Italy, Japan and the United Kingdom. There were 10,000 respondents (1,000 per country), and all data are nationally representative in terms of age, region and gender. The margin of error is plus or minus three percentage points.
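The stated three-point margin of error follows from the standard 95% confidence interval for a sample proportion with 1,000 respondents; a quick check:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a sample proportion;
    p = 0.5 gives the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# 1,000 respondents per country -> about +/- 3.1 percentage points,
# consistent with the survey's stated margin of error.
moe = margin_of_error(1000)
```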

It also showed that a majority said they were worried about the politicization of the crisis: the figure was highest in South Korea (67%), followed by South Africa and the United States (62%), France and Germany (61%), and Brazil at 58%, the same percentage as the overall total.

Davidovich says that before this pandemic an “anti-science attitude” was present in several countries around the world, including Brazil. He cited the lack of investment and support in the area, and also the anti-vaccination movements and the “exotic” flat-Earth movement, which gained strength in the United States from 2014 onwards.

“When the president of a country as powerful as the US speaks out against the scientific evidence on climate change, for example, he affects the whole world. Will that be corrected after this epidemic, in which scientists remain the most trusted source?”

A scientist shows a tube containing a solution with Covid-19 antibodies, which she is using in the search for a drug, at Tsinghua University in Beijing, China. Photo: Thomas Peter / REUTERS


Anthropologist Ruben George Oliven, a professor in the graduate program in Social Anthropology at the Federal University of Rio Grande do Sul, believes the survey shows how highly scientists and health professionals are valued at present, even though it was conducted in very different countries. He observed that in Brazil the conflicting messages from the presidency and the Health Ministry call government authorities into question.

“Even in a country like Brazil, where religiosity matters and religious leaders are not cited in the survey, people trust the scientist. Unlike the politician, who needs to be on good terms with everyone to be re-elected and has different messages for different groups, the scientist is seen as someone devoted to discovering the truth. He stands on a kind of altar, alongside health professionals,” comments Oliven, who also singles out the personal doctor. “My doctor is the person who treats me, in whom I place my trust, and what he says carries a very high degree of truth. That is what characterizes a good doctor-patient relationship.”

Ana Julião, general manager of Edelman, the global communications agency responsible for the survey, says the firm has been conducting trust surveys around the world for 20 years and has observed a polarization between information and opinion:

“This crisis generates a natural fear in people and makes scientists the most trusted voices. At a moment like this, we see how much more important information is than opinion.”

Fake news

On where people seek information, the survey showed that Italy favored government sources (63%). In South Africa (72%) and Brazil (64%), social media were cited as the main source of information. But the majority, seven countries, turn first to news organizations, whose overall share (across all respondents) is 64%. In Brazil, the press (59%) comes second, followed by government sources (40%).


Overall, after the press come: national government sources (40%), social media (38%), global health organizations such as the WHO (34%), national health authorities (29%), friends and family (27%) and local government sources (26%).

According to the survey, 85% of Brazilians say they worry about fake news concerning the pandemic. In addition, 52% admit to having trouble finding reliable, credible information about the coronavirus and its effects, and 89% say they need to hear more from scientists and less from politicians.

Across the ten countries surveyed, 74% say they are worried about fake news, 45% have trouble finding reliable data, and 85% trust science more than politicians.

Philosopher Carla Rodrigues, a professor at the Federal University of Rio de Janeiro, notes that the survey was conducted in early March and that Brazil has seen an explosion of fake news in recent days. As a result, she says, people are likely finding it even harder to locate reliable data. She also highlighted the survey’s finding that government authorities are not among the most effective spokespeople.

“That 52% figure would without doubt be much higher now, mainly because of the politicization built up around the coronavirus. For about two weeks the amount of fake news has been enormous, and confusion has been created around the subject,” says Carla, who adds that in recent years the use of fake news as an instrument of mobilization against a range of institutions has intensified. “Including science, which has been greatly weakened. In that context, it is much harder to get the institutions responsible for fighting the pandemic respected. In other words, one more obstacle to face.”

Scientists’ “search for truth”, according to Carla, is constant and ever-changing, and caution is needed here: most discoveries will eventually be superseded, and that fact cannot be used to discredit the profession.

“The coronavirus is a new problem. And science will keep researching and investigating. The answer will always be updated and open to revision. This is often used to discredit science. But good science is not absolute; it has no final truth. Thankfully.”

Here’s Why Coronavirus And Climate Change Are Different Sorts Of Policy Problems (Forbes)

Editors’ Pick | Mar 15, 2020, 07:05pm EST

Nives Dolsak and Aseem Prakash

Contributor Green Tech

Passengers wearing protective face masks stand at Schiphol Airport in Amsterdam, on March 13, 2020, amid an outbreak of COVID-19, the new coronavirus. Photo by OLAF KRAAK/ANP/AFP via Getty Images.

Climate protection and public health have striking similarities. The benefits of both can be enjoyed by everyone, even by individuals who do not contribute to the collective efforts to address these problems. If climate change slows down, drivers of gas-guzzlers and of electric cars will benefit alike, although the former did not help the climate effort. Similarly, if the spread of Coronavirus is halted (the so-called flattening of the curve), individuals who refused to wash their hands, as well as those who washed them assiduously, will enjoy the restored normal life.

Most countries have gotten their acts together, although belatedly, on Coronavirus. Citizens also seem to be following the advice of public health officials. Could then the Coronavirus policy model be applied to climate change? We urge caution because these crises are different, which means that policies that worked well for Coronavirus might not be effective for climate change.

Different Penalties for Policy and Behavioral Procrastination

Climate change is the defining crisis of our times. Floods, hurricanes, forest fires, and extreme weather events have become more frequent and severe over the years. Although climate change generates passionate discussions in big cities and university campuses, there is inadequate public clamor for immediate action. Some types of decarbonization policies are certainly in place. However, carbon-intensive lifestyles continue (with “flying shame” in Scandinavia being an exception).

This policy lethargy and behavioral inertia are due to many reasons, including concerted opposition by the fossil fuel industry to deep decarbonization. But there are other reasons as well. Climate change is cumulative and does not have a quick onset. Its effects are not always immediate and visible. Many individuals probably do not see a clear link  between their actions and the eventual outcome. This reduces the willingness to alter lifestyles and tolerate personal sacrifices for the collective good.

In contrast, Coronavirus is forcing an immediate policy response and behavioral changes. Its causality is clear and its onset quick. Lives are at stake, especially in western countries. The stock markets are tanking, and the economy is heading towards a recession. Politicians recognize that waffling can lead to massive consequences, even in the short-term. Corona-skeptic President Trump has reversed course and declared a national emergency.


In the US, there is federal inaction on climate change. But Coronavirus seems different. 2020 is a Presidential election year, and perhaps this motivates the federal government to (finally) act decisively so that Coronavirus does not become Hurricane Katrina type of political liability.

Spatial Optimism

Climate policies are hobbled by “spatial optimism,” whereby individuals believe that their risk of getting affected by climate change is less than for others. This reduces the willingness to tolerate personal sacrifices for deep decarbonization.

The Coronavirus episode began with some level of spatial optimism in the Western world. After all, it was happening in China. But this confidence quickly disappeared. Globalization means a great deal of international travel and trade. China is the main global supplier of many products. Prominent companies such as Apple (AAPL) and Tesla (TSLA) depend on China for the manufacturing and sale of their products. Spatial optimism has been overwhelmed by international travel as well as globalized supply chains and financial markets.

Belief in the Efficacy of Adaptation

Some might believe that climate change can be “managed.” Innovators will probably develop commercial-scale negative carbon technologies and societies will adapt to sea-level rise by building seawalls, or maybe relocating some communities to safer areas.

Coronavirus offers no such comfort. Unlike the seasonal flu, there is no vaccine (yet). It is difficult to adapt to the Coronavirus threat when you don’t know what to touch, where to go, and if your family members and neighbors are infected. Not to mention, how many rolls of tissue paper you need to stock before the supplies run out at the local grocery store.

Different Incentives to Attack Scientific Knowledge

On Coronavirus, citizens seem to be willing to follow the advice of public health professionals (at least when it comes to social distancing as reflected in empty roads and shopping centers). Every word of Dr. Anthony Fauci counts.

Why has this advice not drawn scorn from politicians who are suspicious of the “deep state”? After all, the same politicians attack scientific consensus on climate change.

Climate skeptics probably see substantial political and economic payoffs in delaying climate action. Stock markets have not penalized climate skepticism in the US: markets hit record highs in the first three years of the Trump presidency. And climate opposition is not leading to electoral losses; if anything, the climate agendas in liberal states such as Oregon and Washington have stalled.

Nobody seems to gain by attacking the scientific consensus to delay policy action on Coronavirus. The airline, hospitality, and tourism industries, which have taken a direct hit from social-distancing policies, probably want the problem addressed quickly so that people can get back to their “normal” lives.

US politicians who talk about the “deep state” may want the Coronavirus issue resolved before the November 2020 election. Attacking science does not further their political objectives. After all, the looming recession and the stock market decline could influence the election outcome.

Depth, Scale, and Duration of Changes

Climate policy will cause economic and social dislocation. Decarbonization means that some industries will shut down. Jobs will be lost, and communities will suffer unless “just transition” policies are in place.

Coronavirus policies will probably not cause long-term structural changes in the economy. People will resume flying, tourists will flock to Venice, Rome, and Paris, and the basketball arenas will again overflow with spectators.

However, some short-term measures could lead to long-term changes. For example, individuals may realize that telecommuting is easy and efficient. As a result, they may permanently reduce their work-related travel. Coronavirus may provide the sort of a “nudge” that shifts long-term behavioral preferences.

In sum, the contrast between the rapid response to Coronavirus and policy waffling on climate change reveals how citizens think of risk and how this shapes their willingness to incur costs for the collective good. Further, it suggests that politicians respect science when its recommendations serve their political ends.

Nives Dolsak is Stan and Alta Barer Professor in Sustainability Science and Director of the School of Marine & Environmental Affairs. Aseem Prakash is the Walker Family Professor and the Director of the Center for Environmental Politics. Both are at the University of Washington, Seattle.  

Conspiracy theories: how belief is rooted in evolution – not ignorance (The Conversation)

December 13, 2019 9.33am EST – original article

Mikael Klintman PhD, Professor, Lund University

Despite creative efforts to tackle it, belief in conspiracy theories, alternative facts and fake news shows no sign of abating. This is clearly a huge problem, as seen when it comes to climate change, vaccines and expertise in general – with anti-scientific attitudes increasingly influencing politics.

So why can’t we stop such views from spreading? My opinion is that we have failed to understand their root causes, often assuming it is down to ignorance. But new research, published in my book, Knowledge Resistance: How We Avoid Insight from Others, shows that the capacity to ignore valid facts has most likely had adaptive value throughout human evolution. Therefore, this capacity is in our genes today. Ultimately, realising this is our best bet to tackle the problem.

So far, public intellectuals have roughly made two core arguments about our post-truth world. The physician Hans Rosling and the psychologist Steven Pinker argue it has come about due to deficits in facts and reasoned thinking – and can therefore be sufficiently tackled with education.

Meanwhile, Nobel Prize winner Richard Thaler and other behavioural economists have shown how the mere provision of more and better facts often leads already polarised groups to become even more polarised in their beliefs.


The conclusion of Thaler is that humans are deeply irrational, operating with harmful biases. The best way to tackle it is therefore nudging – tricking our irrational brains – for instance by changing measles vaccination from an opt-in to a less burdensome opt-out choice.

Such arguments have often resonated well with frustrated climate scientists, public health experts and agri-scientists (complaining about GMO-opposers). Still, their solutions clearly remain insufficient for dealing with a fact-resisting, polarised society.

Evolutionary pressures

In my comprehensive study, I interviewed numerous eminent academics at the University of Oxford, the London School of Economics and King’s College London about their views. They were experts in the social, economic and evolutionary sciences. I analysed their comments in the context of the latest findings on topics ranging from the origin of humanity, climate change and vaccination to religion and gender differences.

It became evident that much of knowledge resistance is better understood as a manifestation of social rationality. Essentially, humans are social animals; fitting into a group is what’s most important to us. Often, objective knowledge-seeking can help strengthen group bonding – such as when you prepare a well-researched action plan for your colleagues at work.

But when knowledge and group bonding don’t converge, we often prioritise fitting in over pursuing the most valid knowledge. In one large experiment, it turned out that both liberals and conservatives actively avoided having conversations with people of the other side on issues of drug policy, death penalty and gun ownership. This was the case even when they were offered a chance of winning money if they discussed with the other group. Avoiding the insights from opposing groups helped people dodge having to criticise the view of their own community.

Similarly, if your community strongly opposes what an overwhelming part of science concludes about vaccination or climate change, you often unconsciously prioritise avoiding getting into conflicts about it.

This is further backed up by research showing that the climate deniers who score the highest on scientific literacy tests are more confident than the average denier that climate change isn’t happening – despite the scientific evidence to the contrary. And those among the climate concerned who score highest on the same tests are more confident than the average in that group that climate change is happening.

This logic of prioritising the means that get us accepted and secure in a group we respect runs deep. Those among the earliest humans who weren’t prepared to share the beliefs of their community ran the risk of being distrusted and even excluded.

And social exclusion posed an enormous threat to survival – making the excluded vulnerable to being killed by other groups or animals, or to having no one to cooperate with. These early humans therefore had much lower chances of reproducing. It seems fair to conclude, then, that being prepared to resist knowledge and facts is an evolutionary, genetic adaptation of humans to the socially challenging life of hunter-gatherer societies.

Today, we are part of many groups and internet networks, to be sure, and can in some sense “shop around” for new alliances if our old groups don’t like us. Still, humanity today shares the same binary mindset and strong drive to avoid being socially excluded as our ancestors who only knew about a few groups. The groups we are part of also help shape our identity, which can make it hard to change groups. Individuals who change groups and opinions constantly may also be less trusted, even among their new peers.

In my research, I show how this matters when it comes to dealing with fact resistance. Ultimately, we need to take social aspects into account when communicating facts and arguments with various groups. This could be through using role models, new ways of framing problems, new rules and routines in our organisations and new types of scientific narratives that resonate with the intuitions and interests of more groups than our own.

There are no quick fixes, of course. But if climate change were reframed from the liberal/leftist moral perspective of the need for global fairness to conservative perspectives of respect for the authority of the father land, the sacredness of God’s creation and the individual’s right not to have their life project jeopardised by climate change, this might resonate better with conservatives.

If we take social factors into account, this would help us create new and more powerful ways to fight belief in conspiracy theories and fake news. I hope my approach will stimulate joint efforts of moving beyond disputes disguised as controversies over facts and into conversations about what often matters more deeply to us as social beings.

The secret of scientists who impact policy (Science Daily)

For influence, engaging stakeholders is key, study shows

February 21, 2017
University of Vermont
Researchers analyzed 15 policy decisions worldwide, with outcomes ranging from new coastal preservation laws to improved species protections, to produce the first quantitative analysis of how environmental knowledge impacts the attitudes and decisions of conservation policymakers.

Environmental scholars have greater policy influence when they engage directly with stakeholders, a UVM-led study says. Credit: Natural Capital Project

Why does some research lead to changes in public policy, while other studies of equal quality do not?

That crucial question — how science impacts policy — is central to the research of University of Vermont (UVM) Prof. Taylor Ricketts and recent alum Stephen Posner.

According to their findings, the most effective way environmental scholars can boost their policy influence — from protecting wildlife to curbing pollution — is to consult widely with stakeholders during the research process.

In a talk at the American Association for the Advancement of Science (AAAS) annual meeting on February 18, titled “The Effectiveness of Ecosystem Services Science in Decision-Making,” the team briefed scientists and policy experts on their 2016 study in Proceedings of the National Academy of Sciences (PNAS).

Outreach trumps findings

Surprisingly, the study finds that stakeholder engagement is a better predictor of future policy impacts than perceived scientific credibility, says Ricketts, Director of UVM’s Gund Institute and Gund Professor in the Rubenstein School of Environment and Natural Resources.

The study is the first quantitative analysis of how environmental knowledge impacts the attitudes and decisions of conservation policymakers. Researchers from UVM, World Wildlife Fund and the Natural Capital Project analyzed 15 policy decisions worldwide, with outcomes ranging from new coastal preservation laws to improved species protections.

One hand clapping, academic style

Stephen Posner, a Gund researcher and COMPASS policy engagement associate, characterizes policy-related research without outreach as the academic equivalent of “the sound of one hand clapping.”

“Scholars may have the best policy intentions and important research, but our results suggest that decision-makers are unlikely to listen without meaningful engagement of them and various stakeholders,” he says.

When scholars meet with constituent groups — for example, individual landowners, conservation organizations, or private businesses — it improves policymakers’ perception of scientific knowledge as unbiased and representative of multiple perspectives, the study finds.

“For decision-makers, that made research more legitimate and worthy of policy consideration,” Ricketts adds.

Ways to improve consultation

The research team suggests research institutions offer scholars more time and incentives to improve engagement. They also encourage researchers to seek greater understanding of policy decision-making in their fields, and include stakeholder outreach plans in research projects.

“For those working on policy-related questions, we hope these findings offer a reminder of the value of engaging directly with policy makers and stakeholders,” Posner says. “This will be crucial as we enter the new political reality of the Trump administration.”

Previous research on science-policy decision-making used qualitative approaches, or focused on a small number of case studies.


The study is called “Policy impacts of ecosystem services knowledge” by Stephen Posner, Emily McKenzie, and Taylor H. Ricketts.

Co-author Emily McKenzie hails from WWF and the Natural Capital Project.

The study used a global sample of regional case studies from the Natural Capital Project, in which researchers used the standardized scientific tool InVest to explore environmental planning and policy outcomes.

Data included surveys of decision-makers and expert review of 15 cases with different levels of policy impact. The forms of engagement studied included emails, phone conversations, individual and group meetings, as well as decision-maker perceptions of the scientific knowledge.

Journal Reference:

  1. Stephen M. Posner, Emily McKenzie, Taylor H. Ricketts. Policy impacts of ecosystem services knowledge. Proceedings of the National Academy of Sciences, 2016; 113 (7): 1760. DOI: 10.1073/pnas.1502452113

Researchers say they’ve figured out what makes people reject science, and it’s not ignorance (Science Alert)

Why some people believe Earth is flat.


23 JAN 2017

A lot happened in 2016, but one of the biggest cultural shifts was the rise of fake news – where claims with no evidence behind them (e.g. the world is flat) get shared as fact alongside evidence-based, peer-reviewed findings (e.g. climate change is happening).

Researchers have coined this trend the ‘anti-enlightenment movement‘, and there’s been a lot of frustration and finger-pointing over who or what’s to blame. But a team of psychologists has identified some of the key factors that can cause people to reject science – and it has nothing to do with how educated or intelligent they are.

In fact, the researchers found that people who reject scientific consensus on topics such as climate change, vaccine safety, and evolution are generally just as interested in science and as well-educated as the rest of us.

The issue is that when it comes to facts, people think more like lawyers than scientists, which means they ‘cherry pick’ the facts and studies that back up what they already believe to be true.

So if someone doesn’t think humans are causing climate change, they will ignore the hundreds of studies that support that conclusion, but latch onto the one study they can find that casts doubt on this view. This is a form of cognitive bias known as confirmation bias.

“We find that people will take a flight from facts to protect all kinds of belief including their religious belief, their political beliefs, and even simple personal beliefs such as whether they are good at choosing a web browser,” said one of the researchers, Troy Campbell from the University of Oregon.

“People treat facts as relevant more when the facts tend to support their opinions. When the facts are against their opinions, they don’t necessarily deny the facts, but they say the facts are less relevant.”

This conclusion was based on a series of new interviews, as well as a meta-analysis of the research that’s been published on the topic, and was presented in a symposium held over the weekend as part of the Society for Personality and Social Psychology annual convention in San Antonio.

The goal was to figure out what’s going wrong with science communication in 2017, and what we can do to fix it.

The research has yet to be published, so isn’t conclusive, but the results suggest that simply focussing on the evidence and data isn’t enough to change someone’s mind about a particular topic, seeing as they’ll most likely have their own ‘facts’ to fire back at you.

“Where there is conflict over societal risks – from climate change to nuclear-power safety to impacts of gun control laws – both sides invoke the mantle of science,” said one of the team, Dan Kahan from Yale University.

Instead, the researchers recommend looking into the ‘roots’ of people’s unwillingness to accept scientific consensus, and try to find common ground to introduce new ideas.

So where is this denial of science coming from? A big part of the problem, the researchers found, is that people associate scientific conclusions with political or social affiliations.

New research conducted by Kahan showed that people have actually always cherry picked facts when it comes to science – that’s nothing new. But it hasn’t been such a big problem in the past, because scientific conclusions were usually agreed on by political and cultural leaders, and promoted as being in the public’s best interests.

Now, scientific facts are being wielded like weapons in a struggle for cultural supremacy, Kahan told Melissa Healy over at the LA Times, and the result is a “polluted science communication environment”.

So how can we do better?

“Rather than taking on people’s surface attitudes directly, tailor the message so that it aligns with their motivation,” said Matthew Hornsey, a psychologist at the University of Queensland. “So with climate skeptics, for example, you find out what they can agree on and then frame climate messages to align with these.”

The researchers are still gathering data for a peer-reviewed publication on their findings, but they presented their work to the scientific community for further dissemination and discussion in the meantime.

Hornsey told the LA Times that the stakes are too high to continue to ignore the ‘anti-enlightenment movement’.

“Anti-vaccination movements cost lives,” said Hornsey. “Climate change skepticism slows the global response to the greatest social, economic and ecological threat of our time.”

“We grew up in an era when it was just presumed that reason and evidence were the ways to understand important issues; not fear, vested interests, tradition or faith,” he added.

“But the rise of climate skepticism and the anti-vaccination movement made us realise that these enlightenment values are under attack.”

Why scientists are losing the fight to communicate science to the public (The Guardian)

Richard P Grant

Scientists and science communicators are engaged in a constant battle with ignorance. But that’s an approach doomed to failure

Be quiet. It’s good for you. Photograph: Gareth Fuller/PA

A video did the rounds a couple of years ago, of some self-styled “skeptic” disagreeing – robustly, shall we say – with an anti-vaxxer. The speaker was roundly cheered by everyone sharing the video – he sure put that idiot in their place!

Scientists love to argue. Cutting through bullshit and getting to the truth of the matter is pretty much the job description. So it’s not really surprising scientists and science supporters frequently take on those who dabble in homeopathy, or deny anthropogenic climate change, or who oppose vaccinations or genetically modified food.

It makes sense. You’ve got a population that is – on the whole – not scientifically literate, and you want to persuade them that they should be doing a and b (but not c) so that they/you/their children can have a better life.

Brian Cox was at it last week, performing a “smackdown” on a climate change denier on the ABC’s Q&A discussion program. He brought graphs! Knockout blow.

And yet … it leaves me cold. Is this really what science communication is about? Is this informing, changing minds, winning people over to a better, brighter future?

I doubt it somehow.

There are a couple of things here. And I don’t think it’s as simple as people rejecting science.

First, people don’t like being told what to do. This is part of what Michael Gove was driving at when he said people had had enough of experts. We rely on doctors and nurses to make us better, and on financial planners to help us invest. We expect scientists to research new cures for disease, or simply to find out how things work. We expect the government to try to do the best for most of the people most of the time, and weather forecasters to at least tell us what today was like even if they struggle with tomorrow.

But when these experts tell us how to live our lives – or even worse, what to think – something rebels. Especially when there is even the merest whiff of controversy or uncertainty. Back in your box, we say, and stick to what you’re good at.

We saw it in the recent referendum, we saw it when Dame Sally Davies said wine makes her think of breast cancer, and we saw it back in the late 1990s when the government of the time told people – who honestly, really wanted to do the best for their children – to shut up, stop asking questions and take the damn triple vaccine.

Which brings us to the second thing.

On the whole, I don’t think people who object to vaccines or GMOs are at heart anti-science. Some are, for sure, and these are the dangerous ones. But most people simply want to know that someone is listening, that someone is taking their worries seriously; that someone cares for them.

It’s more about who we are and our relationships than about what is right or true.

This is why, when you bring data to a TV show, you run the risk of appearing supercilious and judgemental. Even – especially – if you’re actually right.

People want to feel wanted and loved. That there is someone who will listen to them. To feel part of a family.

The physicist Sabine Hossenfelder gets this. Between contracts one time, she set up a “talk to a physicist” service. Fifty dollars gets you 20 minutes with a quantum physicist … who will listen to whatever crazy idea you have, and help you understand a little more about the world.

How many science communicators do you know who will take the time to listen to their audience? Who are willing to step outside their cosy little bubble and make an effort to reach people where they are, where they are confused and hurting; where they need help?

Atul Gawande says scientists should assert “the true facts of good science” and expose the “bad science tactics that are being used to mislead people”. But that’s only part of the story, and is closing the barn door too late.

Because the charlatans have already recognised the need, and have built the communities that people crave. Tellingly, Gawande refers to the ‘scientific community’; and he’s absolutely right, there. Most science communication isn’t about persuading people; it’s self-affirmation for those already on the inside. Look at us, it says, aren’t we clever? We are exclusive, we are a gang, we are family.

That’s not communication. It’s not changing minds and it’s certainly not winning hearts and minds.

It’s tribalism.

Leading Climate Scientists: ‘We Have A Global Emergency,’ Must Slash CO2 ASAP (Think Progress)

 MAR 22, 2016 2:38 PM


James Hansen and 18 leading climate experts have published a peer-reviewed version of their 2015 discussion paper on the dangers posed by unrestricted carbon pollution. The study adds to the growing body of evidence that the current global target or defense line embraced by the world — 2°C (3.6°F) total global warming — “could be dangerous” to humanity.

That 2°C warming should be avoided at all costs is not news to people who pay attention to climate science, though it may be news to people who only follow the popular media. The warning is, after all, very similar to the one found in an embarrassingly underreported report last year from 70 leading climate experts, who had been asked by the world’s leading nations to review the adequacy of the 2°C target.

Specifically, the new Hansen et al study — titled “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2 C global warming could be dangerous” — warns that even stabilizing at 2°C warming might well lead to devastating glacial melt, multimeter sea level rise and other related catastrophic impacts. The study is significant not just because it is peer-reviewed, but because the collective knowledge about climate science in general and glaciology in particular among the co-authors is quite impressive.

Besides sea level rise, rapid glacial ice melt has many potentially disastrous consequences, including a slowdown and eventual shutdown of the key North Atlantic Ocean circulation and, relatedly, an increase in super-extreme weather. Indeed, that slowdown appears to have begun, and, equally worrisome, it appears to be supercharging both precipitation, storm surge, and superstorms along the U.S. East Coast (like Sandy and Jonas), as explained here.

It must be noted, however, that the title of the peer-reviewed paper is decidedly weaker than the discussion paper’s “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2°C global warming is highly dangerous.” The switch to “could be dangerous” is reminiscent of the switch (in the opposite direction) from the inaugural 1965 warning required for cigarette packages, “Caution: Cigarette Smoking May Be Hazardous to Your Health” to the 1969 required label “Warning: The Surgeon General Has Determined that Cigarette Smoking Is Dangerous to Your Health.”

And yes, I’m using the analogy to suggest readers should not be sanguine about the risks we face at 2°C warming. Based on both observations and analysis, the science is clearly moving in the direction that 2°C warming is not “safe” for humanity. But as Hansen himself acknowledged Monday on the press call, the record we now have of accelerating ice loss in both Greenland and West Antarctica is “too short to infer accurately” whether the current exponential trend will continue through the rest of the century.

Hansen himself explains the paper’s key conclusions and the science underlying them in a new video:


The fact that 2°C total warming is extremely likely to lock us in to sea level rise of 10 feet or more has been obvious for a while now. The National Science Foundation (NSF) itself issued a news release back in 2012 with the large-type headline, “Global Sea Level Likely to Rise as Much as 70 Feet in Future Generations.” The lead author explained, “The natural state of the Earth with present carbon dioxide levels is one with sea levels about 70 feet higher than now.” Heck, a 2009 paper in Science found the same thing.

What has changed is our understanding of just how fast sea levels could rise. In 2014 and 2015, a number of major studies revealed that large parts of the Antarctic and Greenland ice sheets are unstable and headed toward irreversible collapse — and some parts may have already passed the point of no return. Another 2015 study found that global sea level rise since 1990 has been speeding up even faster than we knew.

The key question is how fast sea levels can rise this century and beyond. In my piece last year on Hansen’s discussion draft, I examined the reasons the Intergovernmental Panel on Climate Change (IPCC) and scientific community have historically low-balled the plausible worst-case for possible sea level rise by 2100. I won’t repeat that all here.

The crux of the Hansen et al. forecast can be found in this chart on ice loss from the world’s biggest ice sheet:


Antarctic ice mass change from GRACE satellite data (red) and the surface mass balance method (MBM, blue). Via Hansen et al.

Hansen et al. ask the question: if the ice loss continues growing exponentially, how much ice loss (and hence how much sea level rise) will there be by century’s end? If, for instance, the ice-loss rate doubles every 10 years for the rest of the century (light green), then we would see multi-meter sea level rise before 2100. On the other hand, it is clear just from looking at the chart that there isn’t enough data to make a certain projection for the next eight decades.
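As a back-of-the-envelope check on that scenario, the cumulative rise from an exponentially doubling loss rate can be integrated directly. The starting rate and doubling time below are illustrative assumptions chosen to mirror the “doubling every 10 years” scenario, not figures taken from Hansen et al.:

```python
import math

def cumulative_sea_level_rise_mm(initial_rate_mm_per_yr, doubling_time_yr, years):
    """Integrate an exponentially growing ice-loss rate.

    rate(t) = r0 * 2**(t / Td); integrating from 0 to T gives
    r0 * Td / ln(2) * (2**(T / Td) - 1).
    """
    return (initial_rate_mm_per_yr * doubling_time_yr / math.log(2)
            * (2 ** (years / doubling_time_yr) - 1))

# Assumption: ice-sheet melt currently adds ~1 mm/yr to sea level, and the
# rate doubles every 10 years for the rest of the century (~84 years).
rise_mm = cumulative_sea_level_rise_mm(1.0, 10, 84)
print(f"{rise_mm / 1000:.1f} m")  # roughly 5 m – multi-meter rise before 2100
```

Even with a modest starting rate, roughly eight doublings compound into several metres of rise – which is why the debate centres on whether the exponential trend can physically continue, not on the arithmetic.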

The authors write, “our conclusions suggest that a target of limiting global warming to 2°C … does not provide safety.” On the one hand, they note, “we cannot be certain that multi-meter sea level rise will occur if we allow global warming of 2 C.” But, on the other hand, they point out:

There is a possibility, a real danger, that we will hand young people and future generations a climate system that is practically out of their control.
We conclude that the message our climate science delivers to society, policymakers, and the public alike is this: we have a global emergency. Fossil fuel CO2 emissions should be reduced as rapidly as practical.

I have talked to many climate scientists who quibble with specific elements of this paper, in particular whether the kind of continued acceleration of ice sheet loss is physically plausible. But I don’t find any who disagree with the bold-faced conclusions.

Since there are a growing number of experts who consider that 10 feet of sea level rise this century is a possibility, it would be unwise to ignore the warning. That said, on our current emissions path we already appear to be headed toward the ballpark of four to six feet of sea level rise in 2100 — with seas rising up to one foot per decade after that. That should be more than enough of a “beyond adaptation” catastrophe to warrant strong action ASAP.

The world needs to understand the plausible worst-case scenario for climate change by 2100 and beyond — something that the media and the IPCC have failed to deliver. And the world needs to understand the “business as usual” set of multiple catastrophic dangers of 4°C if we don’t reverse course now. And the world needs to understand the dangers of even 2°C warming.

So kudos to all of these scientists for ringing the alarm bell: James Hansen, Makiko Sato, Paul Hearty, Reto Ruedy, Maxwell Kelley, Valerie Masson-Delmotte, Gary Russell, George Tselioudis, Junji Cao, Eric Rignot, Isabella Velicogna, Blair Tormey, Bailey Donovan, Evgeniya Kandiano, Karina von Schuckmann, Pushker Kharecha, Allegra N. Legrande, Michael Bauer, and Kwok-Wai Lo.

The climate panel wants to talk to you (Observatório do Clima)


The Australian John Cook, editor of the website Skeptical Science, speaks during an IPCC meeting in Oslo. Photo: Claudio Angelo/OC

The IPCC holds the first meeting in its history devoted to communication, willing to change its culture of secrecy and the arcane language of its reports – but runs up against a conservative governance structure.

By Claudio Angelo, from OC, in Oslo –

Luís Bernardo Valença, the protagonist of the novel Equador by the Portuguese writer Miguel Sousa Tavares, is given a virtually impossible mission by the king of Portugal: take over the government of São Tomé and Príncipe in order to convince the English cocoa buyers that there is no slave labour on the islands – and, at the same time, ensure that the system of slave labour does not change, so as not to harm the local economy.

The story offers an analogy for the moment the IPCC, the UN’s climate panel, is going through. Last week in Oslo, Norway, it held the first meeting in its history devoted to communication. The international committee of scientists, awarded the Nobel Peace Prize in 2007, recognises that the way it communicates with its various audiences needs to change: the summaries of its assessment reports are indecipherable to lay readers and even to the policymakers they are supposedly written for; decisions are taken in closed meetings, which feeds rumours that the panel is either a conspiracy of environmentalists to distort science or a victim of governments acting to water down hard-hitting conclusions about the severity of climate change; and the way the panel expresses uncertainty and risk is byzantine.

The desire to open up to the public, however, runs up against the conservatism of the panel itself, which preserves a mode of operation dating from the 1990s, when it released its first assessment report. The IPCC’s methods, rules and rituals must remain the same – and its leaders seem unwilling to let go of that. At the same time, they themselves are calling for more transparency and accessibility. What are the chances of that working out?

The Oslo meeting itself may serve as a thermometer. About 50 communication specialists from around the world were invited, along with some two dozen officials from the panel itself. It was the first meeting in the IPCC’s entire history to be streamed live on the internet – but only after pressure from prominent figures in the field, such as the American journalist Andrew Revkin. It was opened, also online, by the panel’s chair, the South Korean Hoesung Lee. The co-chairs of the three working groups that assess the three major aspects of climate change (the physical science basis; impacts and vulnerabilities; and mitigation) were present throughout, as were two of the three vice-chairs, the American Ko Barrett and the Malian Youba Sokona. Scientists who coordinated the production of AR5, the IPCC’s fifth assessment report, also attended both days of the meeting.

An important consensus formed in Oslo was that communication needs to be part of the report-production process from the start. The IPCC’s current model is to prepare the reports first and then disseminate them to its various audiences – decision-makers, the press and the general public. That is what Paul Lussier, a media specialist at Yale University, called “putting lipstick on a pig” during his presentation.

Dressing up the pig has so far been the recipe for the panel’s communication fiasco. This was more or less mathematically demonstrated by the Portuguese environmental scientist Suraje Dessai, a professor at the University of Leeds, in the United Kingdom, and a co-author of AR5 (the IPCC’s Fifth Assessment Report, published in 2013 and 2014). An analysis of IPCC summaries conducted by Dessai and colleagues, using software that measures simplicity and readability, was published last year in the journal Nature Climate Change. It showed not only that the IPCC is less readable than other scientific publications, but also that the comprehensibility of its summaries has plummeted since 1990.
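The kind of readability scoring such software performs can be sketched with the classic Flesch Reading Ease formula (one standard metric of this type; the naive vowel-group syllable counter below is a rough approximation, for illustration only):

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text; dense scientific prose often scores below 30.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * len(words) / len(sentences)
            - 84.6 * syllables / len(words))

simple = "The cat sat on the mat. It was warm."
dense = ("Anthropogenic greenhouse gas emissions have increased since the "
         "pre-industrial era, driven largely by economic and population growth.")
print(flesch_reading_ease(simple) > flesch_reading_ease(dense))  # True
```

Long sentences and polysyllabic jargon both drag the score down, which is exactly what such analyses found in the IPCC’s summaries for policymakers.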

One of the recommendations made at the end of the meeting – to be forwarded to the IPCC plenary in April – is that professional communicators, science journalists, psychologists and anthropologists be brought in from the so-called “scoping” phase of the reports onwards. That would begin with AR6, the IPCC’s Sixth Assessment Report, due to be published sometime between 2020 and 2022. The scoping itself, currently done by the panel’s officials and by governments, could come to be carried out through a kind of public consultation – in which different actors, from civil society to business leaders and even children, say what they want the panel to assess about climate change. Such openness would be a revolution for the IPCC, breaking with the professorial logic that currently governs how the questions the reports try to answer are defined.

Another suggestion, put forward by a group that discussed relations between the IPCC and the media, was that the drafts of the executive summaries be opened to the public before final approval by governments. Each summary goes through a series of drafts before reaching the final review version, which is sent to governments for comment. The summaries are approved by governments and scientists at the IPCC plenary, where they receive final changes. As a rule, governments modify the text heavily, but – and this is an important “but”, because it is what defines the IPCC’s credibility – the final word always belongs to the scientists.

The drafts are not public today, but anyone can apply to the IPCC to join the committee of reviewers – and thereby gain access to the documents. In 2013, a climate denier leaked a version of AR5 on his blog, claiming that the panel was hiding evidence that global warming was caused by cosmic rays (it wasn’t). The proposal presented in Oslo was that the review drafts be made public, so as to minimise the impact of leaks and to contain misinformation in the press.

Other recommendations made in Oslo range from giving the IPCC website a new public interface to producing animated infographics of the science assessed by the reports.

In practice, however, things are different: one of the IPCC's dogmas is that it cannot produce policy prescriptions. That is, it must limit itself to telling countries what happens to the world under each emissions scenario and what must be done to reach emission levels x, y or z in the future. Strictly speaking, the climate panel cannot urge people to fight climate change – that would be an activist stance. Worse, among the more than 150 governments that make up the IPCC and in fact run it (hence the acronym standing for Intergovernmental Panel on Climate Change) are some that do not want to solve the problem, because they live off its cause – fossil fuels. These are significant constraints on communication.

Another problem is that the IPCC still lives in the 20th century, in a very real sense. While communication today is digital, the climate panel has decided, by consensus, that its reports are approved line by line by governments – which means pen and paper. There is not even a procedure for submitting an animated infographic to the plenary, should anyone think such a resource worth using in AR6. Suggestions of having a video team follow the "making of" the reports were rejected in the past, because some people on the panel did not want anyone "spying" on their work. And so on.

The IPCC was created in 1988, but only gained a communication strategy in 2012. It has a long learning curve ahead and needs to start somewhere. People I spoke with in Oslo said they doubted most of the recommendations would be adopted. But it is auspicious, at a moment when the world prepares to implement the Paris Agreement, that the temple of climate knowledge is willing to take on the task of communication. It is more necessary now than ever. (Observatório do Clima/ #Envolverde)

* The journalist travelled at the IPCC's invitation.

** Originally published on the Observatório do Clima website.

Funceme meteorologist: "We are happy with these rains" (O Povo)


SUN 24/01/2016

According to Funceme meteorologist Raul Fritz, the cyclonic vortex, characteristic of the pre-rainy season, may bring intense rains in January, as happened in 2004

Luana Severo


According to Fritz, climate science has not yet reached a level of precision that allows a reliable forecast


"We do not want to be God, we just try to anticipate what may happen." Born in Natal, Rio Grande do Norte, Raul Fritz, 53, is supervisor of the Weather and Climate unit of the Ceará Foundation for Meteorology and Water Resources (Funceme). Fritz, who said he has no wish to take God's place in decisions about the climate, began working at Funceme in 1988, still as an intern, a few years after a drought that lasted five years in the state, from 1979 to 1983.

His years of practice and a specialisation in satellite meteorology give Fritz the credibility needed to estimate, by means of maps, numerical equations and the behaviour of nature, whether or not it will rain in Ceará's semi-arid region. He was thus part of the Foundation's team of meteorologists that, last Wednesday the 20th, forecast a 65% chance of below-average rainfall between February and April this year – a prognosis that, if borne out, will make this Ceará's fifth consecutive year of drought.

In an interview with O POVO, he explains the forecast, describes Ceará's climate system and comments on the fraught relationship between Funceme and the public, which makes a habit of distrusting all of the agency's forecasts – especially since, one day after the prognosis was released, the state was taken by surprise by an intense downpour.

O POVO – Even with the discouraging prognosis of a 65% chance of below-average rainfall between February and April, people in Ceará have renewed their faith in a "good rainy season" because of recent precipitation influenced by the Upper-Level Cyclonic Vortex (VCAN, in the Portuguese acronym). Could this phenomenon persist?

Raul Fritz – Yes. The system now at work is at its strongest in January. It may last until mid-February, mainly given the atmospheric conditions we are seeing at the moment.

OP – Why is the VCAN unrelated to the rainy season proper?

Raul – The rainy season is characterised by the action of a system that is very important for the North and Northeast of Brazil: the Intertropical Convergence Zone (ITCZ). It is the system that brings rain to the state in a more regular fashion. The vortex is very irregular. In some years it brings good rains; in others it brings practically none.

OP – Can you recall another period when the VCAN played an important role in the rains?

Raul – In 2004, there was a great deal of rain in January. We also had good rains in February, but above all in January – enough to fill the Castanhão reservoir, which had just been built. But the months that followed were not good rain months, so it is possible for us to have this current period of heavy rain followed by scarcer rains.

OP – What drives the drought conditions in Ceará?

Raul – Geographically, there are natural factors that produce a semi-arid climate. It is a region with very great irregularity in the distribution of rainfall, both across the territory and over time. Rains sometimes come well in one period of the year and poorly in the next, and they are concentrated in the first half of the year, mainly between February and May, which we call the rainy season. Then there is the pre-season, which in some years turns out well. That appears to be the case this year.

OP – Ceará's last prolonged drought, lasting five years, ran from 1979 to 1983. We are currently heading towards the same situation. What can break this cycle?

Raul – The cycle generally does not, or tends not to, exceed that length. Natural climate variability itself interrupts it. Few cases are that long; two to three years is more common. But sometimes they can stretch to those two examples of five consecutive years of below-average rainfall. We may also be seeing some influence from global warming, which possibly disturbs natural conditions. Phenomena such as intense El Niños contribute. When they arrive and settle over the Pacific Ocean, they tend to deepen a serious drought situation like the current one. The El Niño at work now is comparable to that of 1997 and 1998, which caused a major drought.

OP – Is this pattern of interspersed major droughts a trend?

Raul – Yes, and years with normal or below-average rainfall are more frequent than above-average years.

OP – Popular wisdom, in the voice of the "rain prophets", is betting on regular rainfall this year. At what point does it converge with scientific knowledge?

Raul – By reading nature, the rain prophet perceives that living things are reacting to weather conditions and, from that, draws up a long-range forecast, which is climatic. But that climate forecast may not correspond exactly to a continuation of the variation occurring at the moment he made his assessment. If it does, he thinks he got the climate forecast right. If not, he considers that he got it wrong. But it may happen that this short-term variation repeats and turns into a long-term one. That is the point where they converge. Funceme tries to anticipate what may happen over a longer horizon, three months ahead. It is a very difficult exercise.

OP – There is generally public disbelief in Funceme's forecasts. How can that be dispelled?

Raul – The forecast offers probabilities, and any of them can come to pass, but we indicate the most likely one. We issue three. It happens that the public cannot grasp this information, which follows the international standard for releasing forecasts. They take it as something deterministic: if Funceme forecast below-average rainfall in a given period as the most likely outcome, they think it has already forecast a drought. But that really is the most likely outcome – not least to alert people whose activities depend on the rains, and the government itself, to take precautions and prepare rather than merely react to a drought already under way.

OP – So Funceme is also taken by surprise by lower-probability events, such as the VCAN?

Raul – Yes, because these vortices are hard to predict. Science has not reached a level of precision high enough to produce a reliable forecast for this period (the pre-rainy season). Even so, we are required to give some idea of what may happen. It is a very big risk that Funceme takes, and we are criticised for it. For example, we released a forecast of below-average rainfall, and the next day a very intense rain arrives. People do not understand; they think the current rains will carry on for the rest of the season. Despite criticism from the public, which even goes so far as to call for the agency to be shut down, we are happy when these rains arrive.


"We released a forecast of below-average rainfall, and the next day a very intense rain arrives. People do not understand; they think the current rains will carry on for the rest of the season"

Raul Fritz, Funceme meteorologist
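The three-category probabilistic forecast Fritz describes can be sketched in a few lines of code. This is a minimal illustration, not Funceme's actual methodology; the 65% figure comes from the article, while the split of the remaining 35% between the other two categories is a hypothetical assumption.

```python
# Hypothetical three-category (tercile) seasonal rainfall forecast.
# Only the 65% "below average" probability is from the article;
# the 25/10 split of the remainder is an illustrative assumption.
forecast = {"below average": 0.65, "near average": 0.25, "above average": 0.10}

# The category Funceme announces is the most likely one...
most_likely = max(forecast, key=forecast.get)

# ...but the forecast is not deterministic: other outcomes stay possible.
chance_not_below = 1.0 - forecast["below average"]  # 0.35 here

# Probabilities over the three categories must sum to 1.
assert abs(sum(forecast.values()) - 1.0) < 1e-9

print(f"Most likely: {most_likely}; "
      f"chance the season is NOT below average: {chance_not_below:.0%}")
```

The point of the sketch is the one Fritz makes: announcing "below average" as the most likely category still leaves, in this example, a 35% chance of a normal or wet season, so a single intense rain event does not contradict the forecast.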


Raul Fritz, the rain scientist


Born in Natal, Rio Grande do Norte, Raul Fritz, 53, is supervisor of the Weather and Climate unit of the Ceará Foundation for Meteorology and Water Resources (Funceme). He began working at Funceme in 1988, still as an intern, a few years after a drought that lasted five years in the state, from 1979 to 1983.

With help from science (Revista Fapesp)

International experience inspires the São Paulo state government to create the post of chief scientist in state secretariats


An unprecedented measure in Brazil announced by the São Paulo state government aims to bring science and public administration closer together. By early 2016, each state secretariat is to have a chief scientist, whose main role will be to point to the best solutions, grounded in scientific knowledge, for the challenges facing the respective department. The announcement was made by Márcio França, vice-governor and state secretary for Economic Development, Science, Technology and Innovation, at the opening of the National Forum of State Research Support Foundations (Confap), held on 27 and 28 August in the state capital. The initiative is inspired by the model of scientific advice practised at different levels of government in countries such as the United States, the United Kingdom and Israel.

The initiative began to take shape at a meeting of FAPESP's Board of Trustees on 18 March, which the vice-governor attended as a guest. On that occasion, França mentioned the difficulty of identifying researchers with ideas to assist public administration. The suggestion to create the post of chief scientist came from Carlos Henrique de Brito Cruz, the Foundation's scientific director. "Professor Brito cited the experience of European countries, among them the United Kingdom, which created the post of chief scientist within their government structures to help ministers, prime ministers or presidents make decisions," says Fernando Costa, professor at the University of Campinas (Unicamp) and member of FAPESP's Board of Trustees, who was present at the meeting.

At the meeting, Brito Cruz explained to the vice-governor that about 55% of FAPESP's resources are invested in application-oriented research, and Eduardo Moacyr Krieger, the institution's vice-president, added that almost 30% of the Foundation's investments go to the health field and could directly benefit actions of the Secretariat of Health. "Other fields, such as agriculture, education and public safety, should also make more of researchers' contributions," says Krieger. Márcio França liked the suggestion. "I thought: why not improve the dialogue with the scientific community through a foundation like FAPESP?" recalls the vice-governor, who took the idea to governor Geraldo Alckmin and got the green light to implement it.

Robin Grimes, of the British government, visiting the University of Nottingham campus in Malaysia in 2013 (above, wearing a tie)

"This measure does not mean the São Paulo government has not been listening to the scientific community," observes Marilza Vieira Cunha Rudge, vice-rector of São Paulo State University (Unesp) and also a member of FAPESP's Board of Trustees. According to her, the aim is to ensure that the knowledge generated in the state's universities and research institutions is absorbed quickly by public administration. A draft decree is being written with the Foundation's advice. One of the goals is for the chief scientists to broaden the application of research results, including those from FAPESP-supported projects, by suggesting links with ongoing projects and proposing new ones.

The government is now working out the details of the initiative. The first step will be to select the chief scientists who will serve in the secretariats. According to França, the most likely route is to invite professors from the three São Paulo state universities – the University of São Paulo (USP), Unicamp and Unesp – who may or may not take leave from their posts. The appropriate length of their term is also under discussion. For França, one thing is certain: the chief scientists will have plenty of work. "Problems and challenges pile up in public administration. Every day, and in the most diverse areas," the vice-governor observes.

The compass pointing the way forward is that of international examples. In September 2014, US president Barack Obama offered a US$20 million prize to the research group that develops the best diagnostic test for quickly recognising infections caused by antibiotic-resistant bacteria. According to the Centers for Disease Control and Prevention (CDC), such infections kill 23,000 Americans each year. The move was prompted by an assessment the White House commissioned from the President's Council of Advisors on Science and Technology (PCAST), made up of some 20 specialists, including Nobel laureates and representatives of industry. The group is led by John Holdren, professor at Harvard University and science adviser to Obama.

John Holdren, chief scientist of the United States, who advises president Obama (below)

The United States has a tradition of scientific advice. In 1933, president Franklin Roosevelt created an advisory committee of scientists, engineers and health professionals to assist him. In 1957, the country became the first to appoint a chief scientist to work in the White House. Soon departments and agencies came to rely on expert consultants. In 1998, then secretary of state Madeleine Albright commissioned a report from the US National Academies of Sciences on the support science could provide in foreign policy matters. The recommendation was that she appoint a science adviser. "My task is to help the government harness the resources of science and technology to inform foreign policy," Vaughan Turekian, science adviser to John Kerry, the current secretary of state, told Pesquisa FAPESP. A former international director of the American Association for the Advancement of Science (AAAS), Turekian says his scientific credentials were put through a rigorous vetting process. "The adviser is appointed for a fixed term. That is intentional. It is worth remembering that the post is not a political appointment," he explains.

Another reference point is the United Kingdom, which created the post in 1964. The role of chief scientific adviser is held today by immunologist Mark Walport, former director of the Wellcome Trust, a foundation that funds biomedical research. Since 2013, Walport has advised prime minister David Cameron. One of the first issues Walport dealt with in government was animal experimentation. In 2014, after statistics showed that the number of animals used in preclinical testing in the United Kingdom had risen in recent years, the government announced measures to reduce or replace their use. Walport acted as a bridge between the government and the scientific community. He acknowledged the need for change, but stressed that doing away with animals in scientific studies is still unfeasible.

Walport also chairs the Council for Science and Technology (CST), linked to the UK Department for Business, Innovation and Skills. The body has a division of specialists that forms the Scientific Advisory Group for Emergencies (Sage). The team was called in in 2010, when ash from a volcano in Iceland affected UK airspace, and in 2011, after the Fukushima nuclear incident in Japan.

Mark Walport, the United Kingdom's chief scientific adviser, visiting a research centre in Kenya in July (first from the left in the photo alongside)

The United Kingdom has chief scientific advisers in departments and ministries. "There is a network of scientific advisers within government. It has brought the different ministries even closer together. Professor Walport holds a weekly meeting with the advisers, who discuss each area's priorities together," Robin Grimes, chief scientific adviser to the UK Foreign Office, told Pesquisa FAPESP. "I believe São Paulo will coordinate better with science by adopting this measure, as well as gaining access to renowned networks of researchers in Brazil and around the world," Grimes said.

For James Wilsdon, a science policy specialist at the University of Sussex, England, these examples have helped other countries create models of scientific advice adapted to their own realities. "There is a wide range of issues that demand a scientific perspective, such as climate change, pandemics, food security and poverty," Wilsdon explains in a report presented at the conference of the International Network for Government Science Advice (INGSA), held in August 2014 in Auckland, New Zealand. The organisation brings together decision-makers and researchers to share experiences and discuss the use of scientific information in government. The document assesses the advisory models adopted in 20 countries. Besides the classic examples, it presents cases of countries that created the post recently, such as New Zealand, whose first chief scientist, Peter Gluckman, was appointed in 2009.

The study shows that some countries have opted for forms of advice not tied to the figure of a chief scientist. In Japan, the Council for Science, Technology and Innovation (CSTI) is one of four councils that assist the prime minister's office. It comprises the prime minister, six ministers of state and representatives of the scientific community and industry. Countries such as China, Germany, the Netherlands and South Africa, meanwhile, draw on the expertise of bodies representing the scientific community. The German Research Foundation (DFG), a non-governmental research funding agency, is consulted by the government and helps shape public policy. "We make statements in Senate committees and interact directly with the government," says Dietrich Halm, the DFG's director for Latin America. According to Wilsdon, one advantage of this model is that researchers enjoy independence from the government.

Peter Gluckman, chief science adviser to the prime minister of New Zealand

In Latin America and the Caribbean, the science advice forum's report cites the examples of Cuba and El Salvador. In the Cuban model, a scientific advisory office is attached to the 31-member Council of State. Although Brazil has never had a chief scientist, public administration in the country has created mechanisms for engaging with researchers. "Informally, the federal government is advised by the scientific community on many issues," said Aldo Rebelo, then minister of Science, Technology and Innovation (MCTI). "In my case, I kept in contact with the Brazilian Academy of Sciences (ABC), with the Brazilian Society for the Advancement of Science (SBPC) and with scientific societies." According to FAPESP vice-president Eduardo Moacyr Krieger, a former ABC president, the chief scientist's work should complement that of the academies of science. "The recommendations the academies give governments are at the macro level. The chief scientist operates at the level of implementation, detailing what must be done in the day-to-day of public administration," he says.

In the state of São Paulo, scientific advice to the government was already practised in specific situations, even without chief scientists. One example is the dialogue between specialists linked to the Biota-FAPESP Program and the State Secretariat for the Environment. Since the program's launch in 1999, 23 state resolutions and decrees have cited Biota results as a reference for decision-making. There is a channel of dialogue with the managers of the conservation units where projects are carried out. "Researchers are often members of the advisory councils of state parks and other protected areas," observes Carlos Joly, professor at Unicamp and the program's coordinator. Biota specialists also work in partnership with institutions linked to the secretariat, such as the Institute of Botany, the Forestry Institute and the Forestry Foundation. And the office of the secretary for the Environment herself, Patricia Faga Iglecias Lemos, follows the program's scientific output.

Another experience is that of the State Council for Science, Technology and Innovation in Health, created in 2014 to advise the Secretariat of Health in formulating and conducting policy. The body comprises representatives of public universities based in São Paulo, institutes, research centres, hospitals and industry bodies. "The council is currently discussing a proposal to create a state policy for science, technology and innovation in health," explains Sergio Swain Muller, the council's president. "We have already held workshops, heard contributions from the universities and are preparing a document with diagnoses and actions to consolidate this plan." The council is also charged with helping define priorities for the next call of the Research Program for the Unified Health System (PPSUS), run by FAPESP in partnership with the Secretariat of Health, the Ministry of Health and CNPq. "One of the priorities is to support research on new mechanisms for public health management," says Muller. Within the State Secretariat of Agriculture and Food Supply, meanwhile, the São Paulo Agency for Agribusiness Technology (Apta) was created in 2002 to coordinate research of interest to the department. Its structure comprises the Agronomic Institute (IAC) and the institutes of Biology, Agricultural Economics, Fisheries, Food Technology and Animal Science, as well as 15 regional research hubs.

and Vaughan Turekian, direct adviser to John Kerry, US secretary of state

"We scout for studies capable of solving problems faced by farmers and forward them to the secretariat," says Orlando Melo de Castro, Apta's coordinator. One of the secretariat's challenges, whose solution is being debated among the institutes under the agency, is making sugarcane more drought-resistant. "The IAC was approached because it already works on this subject, including in partnership with mills in Goiás, where there is a prolonged dry period. The idea is to harness this research in the secretariat's programs," Castro explains.

For sociologist Simon Schwartzman, a scholar of the Brazilian scientific community and researcher at the Institute for Studies on Labor and Society, the country has no tradition of public officials drawing on science. "Of course, there are exceptions," he notes. "The Ministry of Health has its own research centre, the Oswaldo Cruz Institute, as does the Ministry of Agriculture, which has the help of Embrapa." Carlos Joly recalls that the scientific community itself used to put up barriers to sitting down with politicians. "I served as an environmental adviser in the drafting of the 1988 Federal Constitution. At the time, I was criticised by colleagues who thought scientists should not get involved in politics," he says. In 1995, Joly was invited by Fábio Feldmann, then São Paulo state secretary for the Environment, to work as his adviser. "By then it was no longer seen as unusual. Little by little, researchers realised the importance of working in collaboration with public officials," says Joly.

Climatologist Carlos Nobre, president of the Coordination for the Improvement of Higher Education Personnel (Capes), remembers episodes of tension between politicians and scientists. In 1998, Nobre and his team at the Center for Weather Forecasting and Climate Studies (CPTEC) sent the federal government and Congress an assessment predicting a severe drought in the Northeast in the following months, as a result of El Niño. "No one listened to us," Nobre recalls. "I think they did not believe, at the time, that it was possible to make good drought forecasts based on mathematical models."

Vice-governor Márcio França acknowledges that there are points of tension when politicians and scientists meet. "The issue is that the scientific consensus is not always financially and politically feasible at a given moment," he says. For Carlos Nobre, who has held science policy management posts at the MCTI and sits on the expert panel of the global science advice forum, the situation is nonetheless better today. "Both sides have realised that solving problems such as droughts and natural disasters depends on joint action," he says.

Author of the book The Fifth Branch: Science Advisers as Policymakers and of articles on the relationship between science, democracy and politics, the American scholar Sheila Jasanoff, of Harvard University, cautions that scientific advice to governments demands many judgment calls. "It requires decisions about, for example, whether it is better to take a risk or to take precautions. One must know how to weigh the various pieces of evidence," she explains. In her view, advice can indeed help public officials. "But scientific advisory bodies need to operate openly and transparently. That is required by law in the United States," she explains. In 2010, the British government released a document recommending that the levels of uncertainty surrounding scientific questions be explicitly identified in the opinions sent to public officials and communicated in plain, direct language.

Why Communicating About Climate Change Is so Difficult: It’s ‘The Elephant We’re All Inside of’ (Huffington Post)

Jim Pierobon

Posted: 02/05/2015 8:48 pm EST Updated: 02/05/2015 8:59 pm EST

How stakeholders communicate about climate change has long been framed as much by who is doing the framing as by the information being communicated, and often more so. So I am forever curious how various stakeholders – believers, skeptics and deniers alike – are talking about it and who, if anybody, is "moving the needle" in either direction.

One of the most salient and recent inputs to the climate communications conundrum is Don’t Even Think About It — Why Our Brains Are Wired To Ignore Climate Change, by George Marshall in Oxford, England.

Marshall’s work deserves to be spotlighted for how it illuminates why skeptics and deniers alike will not be moved to engage in thoughtful exchanges unless those communicating respect certain tenets of what academic and nonprofit research are finding.

Marshall draws on the efforts of the Climate Outreach and Information Network (COIN) he co-founded, along with research by two leading university-based centers: the Yale Project on Climate Change Communication at Yale University in New Haven, CT, and the Center for Climate Change Communication at George Mason University in Fairfax, VA.

George Marshall is the co-founder of the Climate Outreach and Information Network, a nonprofit organization that specializes in public communication around climate change.

Marshall also taps into the works of authorities who’ve written and/or spoken extensively about climate change, such as Harvard Professor of Psychology Daniel Gilbert, GOP pollster Frank Luntz, Princeton Psychology and Public Affairs Professor Daniel Kahneman, former South Carolina Congressman Bob Inglis, Associate Professor of Sociology at University of Oregon Kari Norgaard and ABC-TV network correspondent Bill Blakemore.

Perhaps it would behoove those preparing for the upcoming United Nations Climate Change Conference of the Parties, aka COP21, in Paris, November 30 – December 11, 2015, to heed what Marshall and other top-tier researchers are finding and sharing, if they are serious about forging a legally binding and universal agreement on climate.

Here is my synthesis of the most illuminating takeaways from Marshall's book. I offer it as a checklist with which to gauge climate communication efforts, regardless of which side of the issue – if any – you're on. Be sure to share your thoughts.

  • Perceptions are shaped by individual psychological coping mechanisms and the collective narratives that they shape with the people around them.
  • A compelling emotional story that speaks to peoples’ core values has more impact than rational scientific data such as hotter global temperatures and rising sea levels.
  • People’s social identity has an extraordinary hold over their behaviors and views.
  • Drawing too much attention to an undesirable norm (e.g. catastrophic weather) can seriously backfire.
  • In high-carbon societies, EVERYone has a strong reason to ignore the problem or to write their own alibi. What might work better are narratives based on cooperation, mutual interests and a common humanity.
  • The real story is about our fear, denial and struggle to accept our own responsibility. “Climate change isn’t the elephant in the room; it’s the elephant we’re all inside of,” said ABC’s Bill Blakemore.
  • Our brains are UNsuited to deal with climate change unless the threats are personal, abrupt, immoral and immediate. A distant, abstract and disputed threat does not have the necessary characteristics for seriously mobilizing public opinion.
  • Without a clear deadline for action, we create our own timeline. We do so in ways that remove the compulsion to act. We make it just current enough to accept that something needs to be done but put it just too far into the future to require immediate action.

What we would all benefit most from knowing: which models for communicating about climate change are working, and which ones are not?

  • The messenger is more important than the message. The messenger can be the most important – but also the weakest – link between scientific information and personal conviction. Building on that, Marshall asserts that breaking the partisan "deadlock" and public disinterest starts with educational efforts that create the means for new messengers to be heard.
  • There may be lessons learned from the campaign by oil giant BP in the early 2000s offering person-on-the-street testimonials about the need to deal with climate change. Full disclosure: While a Senior Vice President of Public Affairs with Ogilvy Public Relations Worldwide from 2001-2006, I helped develop and execute elements of BP’s “Beyond Petroleum” campaign.
  • Until the economy is back on a strong growth track, climate change advocates will struggle to earn attention in their home countries, where bread-and-butter ‘pocketbook’ issues remain more important to an overwhelming majority of citizens.

See George Marshall in action in this recent interview on TalkingStickTV via YouTube.

While we’re on the subject, I recommend the MacArthur Foundation’s excellent “Connecting on Climate” guide, completed in 2014. It includes 10 principles for effective climate change communication based on research from various social science fields.

What to Call a Doubter of Climate Change? (New York Times)

The words are hurled around like epithets.

People who reject the findings of climate science are dismissed as “deniers” and “disinformers.” Those who accept the science are attacked as “alarmists” or “warmistas.” The latter term, evoking the Sandinista revolutionaries of Nicaragua, is perhaps meant to suggest that the science is part of some socialist plot.

The political battles over climate change have long included a fight about what to call the various factions. Recently, though, the issue has taken a new turn, with a public appeal that has garnered 22,000 signatures and counting.

The petition asks the news media to abandon the most frequently used term for people who question climate science, “skeptic,” and call them “climate deniers” instead.

Climate scientists are among the most vocal critics of using the term “climate skeptic” to describe people who flatly reject their findings. They point out that skepticism is the very foundation of the scientific method. The modern consensus about the risks of climate change, they say, is based on evidence that has piled up over the course of decades and has been subjected to critical scrutiny every step of the way.

Drop into any climate science convention, in fact, and you will hear vigorous debate about the details of the latest studies. While they may disagree over the fine points, those same researchers are virtually unanimous in warning that society is running extraordinary risks by continuing to pump huge quantities of greenhouse gases into the atmosphere.

In other words, the climate scientists see themselves as the true skeptics, having arrived at a durable consensus about emissions simply because the evidence of risk has become overwhelming. And in this view, people who reject the evidence are phony skeptics, arguing their case by cherry-picking studies, manipulating data, and refusing to weigh the evidence as a whole.

The petition asking the media to drop the “climate skeptic” label began with Mark B. Boslough, a physicist in New Mexico who grew increasingly annoyed by the term over several years. The phrase is wrong, he said, because “these people do not embrace the scientific method.”

Dr. Boslough is active in a group called the Committee for Skeptical Inquiry, which has long battled pseudoscience in all its forms. Late last year, he wrote a public letter on the issue, and dozens of scientists and science advocates associated with the committee quickly signed it. They include Bill Nye, of “Science Guy” fame, and Lawrence M. Krauss, the physicist and best-selling author.

A climate advocacy organization, Forecast the Facts, picked up on the letter and turned it into a petition. Once the signatures reach 25,000, the group intends to present a formal request to major news organizations to alter their terminology.

All of which raises an obvious question: If not “skeptic,” what should the opponents of climate science be called?

As a first step, it helps to understand why they so vigorously denounce the science. The opposition is coming from a certain faction of the political right. Many of these conservatives understand that since greenhouse emissions are caused by virtually every economic activity of modern society, they are likely to be reduced only by extensive government intervention in the market.

So casting doubt on the science is a way to ward off such regulation. This movement is mainly rooted in ideology, but much of the money to disseminate its writings comes from companies that profit from fossil fuels.

Despite their shared goal of opposing regulation, however, these opponents of climate science are not all of one mind in other respects, and thus no single term really fits them all.

Some make scientifically ludicrous claims, such as denying that carbon dioxide is a greenhouse gas or rejecting the idea that humans are responsible for its increase in the atmosphere. Others deny that Earth is actually warming, despite overwhelming evidence that it is, including the rapid melting of billions of tons of land ice all over the planet.

Yet the critics of established climate science also include a handful of people with credentials in atmospheric physics, and track records of publishing in the field. They acknowledge the heat-trapping powers of greenhouse gases, and they distance themselves from people who deny such basic points.

“For God’s sake, I can’t be lumped in with that crowd,” said Patrick J. Michaels, a former University of Virginia scientist employed by the libertarian Cato Institute in Washington.

Contrarian scientists like Dr. Michaels tend to argue that the warming will be limited, or will occur so gradually that people will cope with it successfully, or that technology will come along to save the day – or all of the above.

The contrarian scientists like to present these upbeat scenarios as the only plausible outcomes from runaway emissions growth. Mainstream scientists see them as being the low end of a range of possible outcomes that includes an alarming high end, and they say the only way to reduce the risks is to reduce emissions.

The dissenting scientists have been called “lukewarmers” by some, for their view that Earth will warm only a little. That is a term Dr. Michaels embraces. “I think it’s wonderful!” he said. He is working on a book, “The Lukewarmers’ Manifesto.”

When they publish in scientific journals, presenting data and arguments to support their views, these contrarians are practicing science, and perhaps the “skeptic” label is applicable. But not all of them are eager to embrace it.

“As far as I can tell, skepticism involves doubts about a plausible proposition,” another of these scientists, Richard S. Lindzen, told an audience a few years ago. “I think current global warming alarm does not represent a plausible proposition.”

Papers by Dr. Lindzen and others disputing the risks of global warming have fared poorly in the scientific literature, with mainstream scientists pointing out what they see as fatal errors. Nonetheless, these contrarian scientists testify before Congress and make statements inconsistent with the vast bulk of the scientific evidence, claiming near certainty that society is not running any risk worth worrying about.

It is perhaps no surprise that many environmentalists have started to call them deniers.

The scientific dissenters object to that word, claiming it is a deliberate attempt to link them to Holocaust denial. Some academics sharply dispute having any such intention, but others have started using the slightly softer word “denialist” to make the same point without stirring complaints about evoking the Holocaust.

Scientific denialism has crept into other aspects of modern life, of course, manifesting itself as creationism, anti-vaccine ideology and the opposition to genetically modified crops, among other doctrines.

To groups holding such views, “evidence just doesn’t matter any more,” said Riley E. Dunlap, a sociologist at Oklahoma State University. “It becomes possible to create an alternate reality.”

But Dr. Dunlap pointed out that the stakes with most of these issues are not as high as with climate-change denial, for the simple reason that the fate of the planet may hang in the balance.

Anthropocene, Capitalocene, Cthulhucene: what characterizes a new epoch? (ClimaCom)


The proposal to formalize a new epoch of the Earth raises questions about utility, responsibility and alternative ways of narrating the history of the world we live in

By Daniela Klebis

The impacts of human actions on the planet over the last 200 years have been so profound that they may justify defining a new epoch for the Earth: the Anthropocene. This past October 17, the International Commission on Stratigraphy (ICS) met in Berlin to continue discussions on formalizing this new epoch, with a final decision to be voted on only in 2016. Bureaucratic processes aside, the term has already been informally adopted by philosophers, archaeologists, historians, environmentalists and climate scientists, and among them the debate continues, beyond the gathering of physical evidence, toward understanding its usefulness: are we ready to take on the epoch of humans?

The Earth’s history is divided into geological time scales, which are defined by the ICS, headquartered in Paris, France. These time scales begin with great spans of time called eons, which are divided into eras (such as the Mesozoic), then into periods (Jurassic, Neogene), epochs and, finally, ages. The first to signal the need to define a new epoch, based on the indelible impacts of human actions on the terrestrial landscape, was the atmospheric chemist Paul J. Crutzen, winner of the Nobel Prize in chemistry in 1995. Crutzen suggested the term Anthropocene during a meeting of the International Geosphere-Biosphere Programme (IGBP) in Mexico in 2000. The event’s aim was to discuss the problems of the Holocene, the epoch we have been in for about 11,700 years, since the end of the last ice age.

The hypothesis upheld by the new designation’s defenders rests on observations of human-driven changes to the environment since 1800, whose geological evidence will have a long-term impact on the Earth’s history. And what evidence might justify adopting the term Anthropocene? “What we humans have mostly done in these two centuries is create things that did not exist during the 4.5 billion years of the Earth’s history,” declared the geologist Jan Zalasiewicz, chair of the ICS working group on the Anthropocene, at a colloquium in Sydney, Australia, in March of this year.


Synthetic minerals, carbon fibers, plastics and concrete are some examples of new elements created by humans. Concrete, a material produced by mixing cement, sand, stone and water, has been spreading across our planet’s surface at a rate of 2 billion kilometers per year, the geologist notes. Below the surface, excavations in search of ores and oil have already opened more than 50 million kilometers of underground boreholes.

Beyond the physical changes, the excessive emission of carbon dioxide and other greenhouse gases resulting from human action causes chemical changes in the atmosphere, such as global warming, the melting of polar ice caps and ocean acidification. The biosphere is also under scrutiny, since changes resulting from habitat loss, predatory activities and species invasions likewise alter the chemical and physical composition of environments.

The evidence of human impact, consistently pointed out in climate studies, was reinforced by the 5th report of the Intergovernmental Panel on Climate Change (IPCC), published earlier this year, with a consensus of 97% of scientists. More recently, on September 30, a report published by the WWF (World Wildlife Fund), in partnership with the Zoological Society of London, further indicated that over the last 40 years 52% of the Earth’s vertebrate animal population has disappeared. Over the same period, human beings have doubled in number. “We are pushing the biosphere toward its 6th mass extinction,” warns Hans-Otto Pörtner of the Alfred Wegener Institute for Marine and Polar Research in Bremerhaven, Germany, co-author of the chapter on ecosystems in this year’s IPCC report. Pörtner is referring to the five great mass extinctions recorded over the last 540 million years, characterized by paleontologists as periods in which more than 75% of the planet’s species went extinct within a short geological interval.

“200 years ago, things began to change enough to visibly impact the planet: the population grew, and so did CO2 emissions,” notes Zalasiewicz. According to him, energy use grew 90-fold between 1800 and 2010, and we have already burned through roughly 200 million years’ worth of fossil deposits, between coal, oil and gas. “Humans account for 1/3 of all the Earth’s vertebrates. But this unprecedented domination over all other living beings is what makes this the human era,” he concludes.

Eileen Crist, a researcher in the Department of Science and Technology in Society at Virginia Tech, in the US, challenges the choice of term, arguing that Anthropocene discourse fails to question human sovereignty and instead proposes technological approaches that could make human dominance sustainable. “By affirming the centrality of man – both as a causal force and as an object of concern – the Anthropocene shrinks the discursive space for challenging the domination of the biosphere, offering instead a techno-scientific field for its rationalization and a pragmatic appeal to resign ourselves to its actuality,” the researcher argues in an article published in 2013.

The Anthropocene thus weaves together a series of themes in shaping its discourse: for example, the accelerating growth of a population that will come to exceed 10 billion inhabitants; economic growth and consumer culture as the dominant social model; technology as inescapable destiny and, at the same time, the salvation of human life on Earth; and, further, the assumption that human impact is natural and contingent on our condition as beings endowed with superior intelligence. Crist points out that this discourse masks the choice to rationalize humanity’s totalitarian regime over the planet. “As a cohesive discourse, it blocks alternative forms of human life on Earth,” she notes.



Donna Haraway, professor emerita at the University of California, Santa Cruz, in the US, commented, during the colloquium Os Mil Nomes de Gaia (The Thousand Names of Gaia) in September, that this discussion is one of the “ways of seeking words that sound very big yet are not big enough to comprehend the continuity and the precariousness of living and dying on this Earth.” Haraway is also one of the critics of the term Anthropocene. According to her, the Anthropocene implies an individual man who develops, and who develops a new world landscape, estranged from all other forms of life: a mistaken perception of a being that could exist without relating to the rest of the planet. “We must understand that to be one, we must be many. We become with other beings,” she comments.

For Haraway, it is necessary to problematize this perception and to locate responsibility for the changes, which lies precisely in the capitalist system we have created. It is this system that has driven humans’ exploitation of the Earth: “The whole story could be the Capitalocene, not the Anthropocene,” she says. Such a perception, according to the philosopher, allows us to resist the sense of inescapability present in this discourse, as Crist noted above. “We are surrounded by the danger of assuming that everything is over, that nothing can happen,” she says.

Haraway points out, however, that it is necessary to evoke a sense of ongoingness, drawing on other narrative and conceptual possibilities. One of them would be the Cthulhucene, the philosopher’s own coinage. The expression comes from an H.P. Lovecraft story, The Call of Cthulhu, about humans whose minds deteriorate when, in rituals to the god Cthulhu – a blend of man, dragon and octopus that lies sleeping beneath the waters of the South Pacific – they glimpse a reality different from the one they knew. At the story’s opening, the American author writes: “The most merciful thing in the world, I think, is the inability of the human mind to correlate all its contents.” From this context, Donna Haraway explains that it is necessary to “destabilize worlds of thought with worlds of thought.” The Cthulhucene is not about adopting a transcendence, an idea of life or death: “it is about embracing the sinuous ongoingness of the earthly world, in its past, present and future. Yet such ongoingness means accepting that there is a very big problem and that it must be confronted. We must mourn what has happened, for it should not have occurred. But we do not have to continue down the same path,” she suggests.

The cultural side of science communication (Northwestern University)


Hilary Hurd Anyaso

New research explores how culture affects our conceptions of nature

EVANSTON, Ill. — Do we think of nature as something that we enjoy when we visit a national park and something we need to “preserve?” Or do we think of ourselves as a part of nature? A bird’s nest is a part of nature, but what about a house?

The answers to these questions reflect different cultural orientations. They are also reflected in our actions, our speech and in cultural artifacts.

A new Northwestern University study, in partnership with the University of Washington, the American Indian Center of Chicago and the Menominee tribe of Wisconsin, focuses on science communication and how that discipline necessarily involves language and other media-related artifacts such as illustrations. The challenge is to identify effective ways of communicating information to culturally diverse groups in a way that avoids cultural polarization, say the authors.

“We suggest that trying to present science in a culturally neutral way is like trying to paint a picture without taking a perspective,” said Douglas Medin, lead author of the study and professor of psychology in the Weinberg College of Arts and Sciences and the School of Education and Social Policy at Northwestern.

This research builds on the broader research on cultural differences in the understanding of and engagement with science.

“We argue that science communication — for example, words, photographs and illustrations — necessarily makes use of artifacts, both physical and conceptual, and these artifacts commonly reflect the cultural orientations and assumptions of their creators,” write the authors.

“These cultural artifacts both reflect and reinforce ways of seeing the world and are correlated with cultural differences in ways of thinking about nature. Therefore, science communication must pay attention to culture and the corresponding different ways of looking at the world.”

Medin said their previous work reveals that Native Americans traditionally see themselves as a part of nature and tend to focus on ecological relationships. In contrast, European-Americans tend to see humans as apart from nature and focus more on taxonomic relationships.

“We show that these cultural differences are also reflected in media, such as children’s picture books,” said Medin, who co-authored the study with Megan Bang of the University of Washington. “Books authored and illustrated by Native Americans are more likely to have illustrations of scenes that are close-up, and the text is more likely to mention the plants, trees and other geographic features and relationships that are present compared with popular children’s books not done by Native Americans.

“The European-American cultural assumption that humans are not part of ecosystems is readily apparent in illustrations,” he said.

The authors went to Google images and entered “ecosystems,” and 98 percent of the images did not have humans present. A fair number of the remaining 2 percent had children outside the ecosystem, observing it through a magnifying glass and saying, “I spy an ecosystem.”

“These results suggest that formal and informal science communications are not culturally neutral but rather embody particular cultural assumptions that exclude people from nature,” Medin said.

Medin and his research team have developed a series of “urban ecology” programs at the American Indian Center of Chicago, and these programs suggest that children can learn about the rest of nature in urban settings and come to see humans as active players in the world ecosystems.

How to Talk About Climate Change So People Will Listen (The Atlantic)


Environmentalists warn us that apocalypse awaits. Economists tell us that minimal fixes will get us through. Here’s how we can move beyond the impasse. 


Not long ago, my newspaper informed me that glaciers in the western Antarctic, undermined by the warmer seas of a hotter world, were collapsing, and their disappearance “now appears to be unstoppable.” The melting of these great ice sheets would make seas rise by at least four feet—ultimately, possibly 12—more than enough to flood cities from New York to Tokyo to Mumbai. Because I am interested in science, I read the two journal articles that had inspired the story. How much time do we have, I wondered, before catastrophe hits?

One study, in Geophysical Research Letters, provided no guidance; the authors concluded only that the disappearing glaciers would “significantly contribute to sea level rise in decades to centuries to come.” But the other, in Science, offered more-precise estimates: during the next century, the oceans will surge by as much as a quarter of a millimeter a year. By 2100, that is, the calamity in Antarctica will have driven up sea levels by almost an inch. The process would get a bit faster, the researchers emphasized, “within centuries.”

How is one supposed to respond to this kind of news? On the one hand, the transformation of the Antarctic seems like an unfathomable disaster. On the other hand, the disaster will never affect me or anyone I know; nor, very probably, will it trouble my grandchildren. How much consideration do I owe the people it will affect, my 40-times-great-grandchildren, who, many climate researchers believe, will still be confronted by rising temperatures and seas? Americans don’t even save for their own retirement! How can we worry about such distant, hypothetical beings?

Worse, confronting climate change requires swearing off something that has been an extraordinary boon to humankind: cheap energy from fossil fuels. In the 3,600 years between 1800 B.C. and 1800 A.D., the economic historian Gregory Clark has calculated, there was “no sign of any improvement in material conditions” in Europe and Asia. Then came the Industrial Revolution. Driven by the explosive energy of coal, oil, and natural gas, it inaugurated an unprecedented three-century wave of prosperity. Artificial lighting, air-conditioning, and automobiles, all powered by fossil fuels, swaddle us in our giddy modernity. In our ergonomic chairs and acoustical-panel cubicles, we sit cozy as kings atop 300 years of flaming carbon.

In the best of times, this problem—given its apocalyptic stakes, bewildering scale, and vast potential cost—would be difficult to resolve. But we are not in the best of times. We are in a time of legislative paralysis. In an important step, the Obama administration announced in June its decision to cut power-plant emissions 30 percent by 2030. Otherwise, this country has seen strikingly little political action on climate change, despite three decades of increasingly high-pitched chatter by scientists, activists, economists, pundits, and legislators.

The chatter itself, I would argue, has done its share to stall progress. Rhetorical overreach, moral miscalculation, shouting at cross-purposes: this toxic blend is particularly evident when activists, who want to scare Americans into taking action, come up against economists, with their cool calculations of acceptable costs. Eco-advocates insist that only the radical transformation of society—the old order demolished, foundation to roof—can fend off the worst consequences of climate change. Economists argue for adapting to the most-likely consequences; cheerleaders for industrial capitalism, they propose quite different, much milder policies, and are ready to let nature take a bigger hit in the short and long terms alike. Both envelop themselves in the mantle of Science, emitting a fug of charts and graphs. (Actually, every side in the debate, including the minority who deny that humans can affect the climate at all, claims the backing of Science.) Bewildered and battered by the back-and-forth, the citizenry sits, for the most part, on its hands. For all the hot air expended on the subject, we still don’t know how to talk about climate change.

As an issue, climate change was unlucky: when nonspecialists first became aware of it, in the 1990s, environmental attitudes had already become tribal political markers. As the Yale historian Paul Sabin makes clear in The Bet, it wasn’t always this way. The votes for the 1970 Clean Air Act, for example, were 374–1 in the House, 73–0 in the Senate. Sabin’s book takes off from a single event: a bet between the ecologist Paul R. Ehrlich and the economist Julian Simon a decade later. Ehrlich’s The Population Bomb (1968), which decried humankind’s rising numbers, was a foundational text in the environmental movement. Simon’s The Ultimate Resource (1981) was its antimatter equivalent: a celebration of population growth, it awakened opposition to the same movement.

Activists led by Bill McKibben protest the building of the Keystone XL pipeline at the White House, February 2013. (AP)

Ehrlich was moderately liberal in his politics but unrestrained in his rhetoric. The second sentence of The Population Bomb promised that “hundreds of millions of people” would starve to death within two decades, no matter what “crash programs” the world launched to feed them. A year later, Ehrlich gave even odds that “England will not exist in the year 2000.” In 1974, he told Congress that “a billion or more people” could starve in the 1980s “at the latest.” When the predictions didn’t pan out, he attacked his critics as “incompetent” and “ignorant,” “morons” and “idiots.”

Simon, who died in 1998, argued that “human resourcefulness and enterprise” will extricate us from our ecological dilemma. Moderately conservative in his politics, he was exuberantly uninhibited in his scorn for eco-alarmists. Humankind faces no serious environmental problems, he asserted. “All long-run trends point in exactly the opposite direction from the projections of the doomsayers.” (All? Really?) “There is no convincing economic reason why these trends toward a better life should not continue indefinitely.” Relishing his role as a spoiler, he gave speeches while wearing red plastic devil horns. Unsurprisingly, he attracted disagreement, to which he responded with as much bluster as Ehrlich. Critics, motivated by “blatant intellectual dishonesty” and indifference to the poor, were “corrupt,” their ideas “ignorant and wrongheaded.”

In 1980, the two men wagered $1,000 on the prices of five metals 10 years hence. If the prices rose, as Ehrlich predicted, it would imply that these resources were growing scarcer, as Homo sapiens plundered the planet. If the prices fell, this would be a sign that markets and human cleverness had made the metals relatively less scarce: progress was continuing. Prices dropped. Ehrlich paid up, insisting disingenuously that he had been “schnookered.”

Schnookered, no; unlucky, yes. In 2010, three Holy Cross economists simulated the bet for every decade from 1900 to 2007. Ehrlich would have won 61 percent of the time. The results, Sabin says, do not prove that these resources have grown scarcer. Rather, metal prices crashed after the First World War and spent most of a century struggling back to their 1918 levels. Ecological issues were almost irrelevant.

The bet demonstrated little about the environment but much about environmental politics. The American landscape first became a source of widespread anxiety at the beginning of the 20th century. Initially, the fretting came from conservatives, both the rural hunters who established the licensing system that brought back white-tailed deer from near-extinction and the Ivy League patricians who created the national parks. So ineradicable was the conservative taint that decades later, the left still scoffed at ecological issues as right-wing distractions. At the University of Michigan, the radical Students for a Democratic Society protested the first Earth Day, in 1970, as elitist flimflam meant to divert public attention from class struggle and the Vietnam War; the left-wing journalist I. F. Stone called the nationwide marches a “snow job.” By the 1980s, businesses had realized that environmental issues had a price tag. Increasingly, they balked. Reflexively, the anticorporate left pivoted; Earth Day, erstwhile snow job, became an opportunity to denounce capitalist greed.

The result, as the Emory historian Patrick Allitt demonstrates in A Climate of Crisis, was a political back-and-forth that became ever less productive. Time and again, Allitt writes, activists and corporate executives railed against each other. Out of this clash emerged regulatory syntheses: rules for air, water, toxins. Often enough, businesspeople then discovered that following the new rules was less expensive than they had claimed it would be; environmentalists meanwhile found out that the problems were less dire than they had claimed.


Throughout the 1980s, for instance, activists charged that acid rain from midwestern power-plant emissions was destroying thousands of East Coast lakes. Utilities insisted that anti-pollution equipment would be hugely expensive and make homeowners’ electric bills balloon. One American Electric Power representative predicted that acid-rain control could lead to the “destruction of the Midwest economy.” A 1990 amendment to the Clean Air Act, backed by both the Republican administration and the Democratic Congress, set up a cap-and-trade mechanism that reduced acid rain at a fraction of the predicted cost; electric bills were barely affected. Today, most scientists have concluded that the effects of acid rain were overstated to begin with—fewer lakes were hurt than had been thought, and acid rain was not the only cause.

Rather than learning from this and other examples that, as Allitt puts it, “America’s environmental problems, though very real, were manageable,” each side stored up bitterness, like batteries taking on charge. The process that had led, however disagreeably, to successful environmental action in the 1970s and ’80s brought on political stasis in the ’90s. Environmental issues became ways for politicians to signal their clan identity to supporters. As symbols, the issues couldn’t be compromised. Standing up for your side telegraphed your commitment to take back America—either from tyrannical liberal elitism or right-wing greed and fecklessness. Nothing got done.

As an issue, climate change is perfect for symbolic battle, because it is as yet mostly invisible. Carbon dioxide, its main cause, is not emitted in billowing black clouds, like other pollutants; nor is it caustic, smelly, or poisonous. A side effect of modernity, it has for now a tiny practical impact on most people’s lives. To be sure, I remember winters as being colder in my childhood, but I also remember my home then as a vast castle and my parents as godlike beings.

In concrete terms, Americans encounter climate change mainly in the form of three graphs, staples of environmental articles. The first shows that atmospheric carbon dioxide has been steadily increasing. Almost nobody disputes this. The second graph shows rising global temperatures. This measurement is trickier: carbon dioxide is spread uniformly in the air, but temperatures are affected by a host of factors (clouds, rain, wind, altitude, the reflectivity of the ground) that differ greatly from place to place. Here the data are more subject to disagreement. A few critics argue that for the past 17 years warming has mostly stopped. Still, most scientists believe that in the past century the Earth’s average temperature has gone up by about 1.5 degrees Fahrenheit.

Rising temperatures per se are not the primary concern. What matters most is their future influence on other things: agricultural productivity, sea levels, storm frequency, infectious disease. As the philosopher Dale Jamieson points out in the unfortunately titled Reason in a Dark Time, most of these effects cannot be determined by traditional scientific experiments—white-coats in laboratories can’t melt a spare Arctic ice cap to see what happens. (Climate change has no lab rats.) Instead, thousands of researchers refine ever bigger and more complex mathematical models. The third graph typically shows the consequences such models predict, ranging from worrisome (mainly) to catastrophic (possibly).

Such charts are meaningful to the climatologists who make them. But for the typical citizen they are a muddle, too abstract—too much like 10th-grade homework—to be convincing, let alone to motivate action. In the history of our species, has any human heart ever been profoundly stirred by a graph? Some other approach, proselytizers have recognized, is needed.

To stoke concern, eco-campaigners like Bill McKibben still resort, Ehrlich-style, to waving a skeleton at the reader. Thus the first sentence of McKibben’s Oil and Honey, a memoir of his climate activism, describes 2011–12, the period covered by his book, as “a time when the planet began to come apart.” Already visible “in almost every corner of the earth,” climate “chaos” is inducing “an endless chain of disasters that will turn civilization into a never-ending emergency response drill.”


The only solution to our ecological woes, McKibben argues, is to live simpler, more local, less resource-intensive existences—something he believes is already occurring. “After a long era of getting big and distant,” he writes, “our economy, and maybe our culture, has started to make a halting turn toward the small and local.” Not only will this shift let us avoid the worst consequences of climate change, it will have the happy side effect of turning a lot of unpleasant multinational corporations to ash. As we “subside into a workable, even beautiful, civilization,” we will lead better lives. No longer hypnotized by the buzz and pop of consumer culture, narcotized couch potatoes will be transformed into robust, active citizens: spiritually engaged, connected to communities, appreciative of Earth’s abundance.

For McKibben, the engagement is full throttle: The Oil half of his memoir is about founding, a group that seeks to create a mass movement against climate change. (The 350 refers to the theoretical maximum safe level, in parts per million, of atmospheric carbon dioxide, a level we have already surpassed.) The Honey half is about buying 70 acres near his Vermont home to support an off-the-grid beekeeper named Kirk Webster, who is living out McKibben’s organic dream in a handcrafted, solar-powered cabin in the woods. Webster, McKibben believes, is the future. We must, he says, “start producing a nation of careful, small-scale farmers such as Kirk Webster, who can adapt to the crazed new world with care and grace, and who don’t do much more damage in the process.”

Poppycock, the French philosopher Pascal Bruckner in effect replies in The Fanaticism of the Apocalypse. A best-selling, telegenic public intellectual (a species that hardly exists in this country), Bruckner is mainly going after what he calls “ecologism,” of which McKibbenites are exemplars. At base, he says, ecologism seeks not to save nature but to purify humankind through self-flagellating asceticism.

To Bruckner, ecologism is both ethnocentric and counterproductive. Ethnocentric because eco-denunciations of capitalism simply give new, green garb to the long-standing Euro-American fear of losing dominance over the developing world (whose recent growth derives, irksomely, from fossil fuels). Counterproductive because ecologism induces indifference, or even hostility to environmental issues. In the quest to force humanity into a puritanical straitjacket of rural simplicity, ecologism employs what should be neutral, fact-based descriptions of a real-world problem (too much carbon dioxide raises temperatures) as bludgeons to compel people to accept modes of existence they would otherwise reject. Intuiting moral blackmail underlying the apparently objective charts and graphs, Bruckner argues, people react with suspicion, skepticism, and sighing apathy—the opposite of the reaction McKibbenites hope to evoke.

The ranchers and farmers in Tony Horwitz’s Boom, a deft and sometimes sobering e-book, suggest Bruckner may be on to something. Horwitz, possibly best known for his study of Civil War reenactors, Confederates in the Attic, travels along the proposed path of the Keystone XL, a controversial pipeline intended to take oil from Alberta’s tar-sands complex to refineries in Steele City, Nebraska—and the project has made its rallying cry. McKibben set off on his anti-Keystone crusade after the climatologist-provocateur James Hansen charged in 2011 that building the pipeline would be “game over” for the climate. If Keystone were built, Hansen later wrote, “civilization would be at risk.” Everyone Horwitz meets has heard this scenario. But nobody seems to have much appetite for giving up the perks of industrial civilization, Kirk Webster–style. “You want to go back to the Stone Age and use only wind, sun, and water?” one person asks. A truck driver in the tar-sands project tells Horwitz, “This industry is giving me a future, even if it’s a short one and we’re all about to toast together.” Given the scale of the forces involved, individual action seems futile. “It’s going to burn up anyhow at the end,” explains a Hutterite farmer, matter-of-factly. “The world will end in fire.”


Whereas McKibbenites see carbon dioxide as an emblem of a toxic way of life, economists like William Nordhaus of Yale tend to view it as simply a by-product of the good fortune brought by capitalism. Nordhaus, the president of the American Economic Association, has researched climate issues for four decades. His The Climate Casino has an even, unhurried tone; a classic Voice of Authority rumbles from the page. Our carbon-dioxide issues, he says, have a “simple answer,” one “firmly based in economic theory and history”:

The best approach is to use market mechanisms. And the single most important market mechanism that is missing today is a high price on CO2 emissions, or what is called “carbon prices” … The easiest way is simply to tax CO2 emissions: a “carbon tax” … The carbon price [from the tax] will be passed on to the consumer in the form of higher prices.

Nordhaus provides graphs (!) showing how a gradually increasing tax—or, possibly, a market in emissions permits—would slowly and steadily ratchet down global carbon-dioxide output. The problem, as he admits, is that the projected reduction “assumes full participation.” Translated from econo-speak, “full participation” means that the Earth’s rich and populous nations must simultaneously apply the tax. Brazil, China, France, India, Russia, the United States—all must move in concert, globally cooperating.


Alas, nothing like Nordhaus’s planetary carbon tax has ever been enacted. The sole precedent is the Montreal Protocol, the 1987 treaty banning substances that react with atmospheric ozone and reduce its ability to absorb the sun’s harmful ultraviolet radiation. Signed by every United Nations member and successfully updated 10 times, the protocol is a model of international eco-cooperation. But it involves outlawing chemicals in refrigerators and spray cans, not asking nations to revamp the base of their own prosperity. Nordhaus’s declaration that a global carbon tax is a simple answer is like arguing that the simple answer to death is repealing the Second Law of Thermodynamics.

Does climate change, as Nordhaus claims, truly slip into the silk glove of standard economic thought? The dispute is at the center of Jamieson’s Reason in a Dark Time. Parsing logic with the care of a raccoon washing a shiny stone, Jamieson maintains that economists’ discussions of climate change are almost as problematic as those of environmentalists and politicians, though for different reasons.

Remember how I was complaining that all discussions of climate change devolve into homework? Here, sadly, is proof. To critique economists’ claims, Jamieson must drag the reader through the mucky assumptions underlying cost-benefit analysis, a standard economic tool. In the case of climate change, the costs of cutting carbon dioxide are high. What are the benefits? If the level of carbon dioxide in the atmosphere rises only slightly above its current 400 parts per million, most climatologists believe, there is (roughly) a 90 percent chance that global temperatures will eventually rise between 3 and 8 degrees Fahrenheit, with the most likely jump being between 4 and 5 degrees. Nordhaus and most other economists conclude that humankind can slowly constrain this relatively modest rise in carbon without taking extraordinary, society-transforming measures, though neither decreasing the use of fossil fuels nor offsetting their emissions will be cheap or easy. But the same estimates show (again in rough terms) a 5 percent chance that letting carbon dioxide rise much above its current level would set off a domino-style reaction leading to global devastation. (No one pays much attention to the remaining 5 percent chance that the carbon rise would have very little effect on temperature.)

In our daily lives, we typically focus on the most likely result: I decide whether to jaywalk without considering the chance that I will trip in the street and get run over. But sometimes we focus on the extreme: I lock up my gun and hide the bullets in a separate place to minimize the chance that my kids will find and play with them. For climate change, should we focus on adapting to the most probable outcome or averting the most dangerous one? Cost-benefit analyses typically ignore the most radical outcomes: they assume that society has agreed to accept the small but real risk of catastrophe—something environmentalists, to take one particularly vehement section of society, have by no means done.

On top of this, Jamieson argues, there is a second problem in the models economists use to discuss climate change. Because the payoff from carbon-dioxide reduction will occur many decades from now, Nordhausian analysis suggests that we should do the bare minimum today, even if that means saddling our descendants with a warmer world. Doing the minimum is expensive enough already, economists say. Because people tomorrow will be richer than we are, as we are richer than our grandparents were, they will be better able to pay to clean up our emissions. Unfortunately, this is an ethically problematic stance. How can we weigh the interests of someone born in 2050 against those of someone born in 1950? In this kind of trade-off between generations, Jamieson argues, “there is no plausible value” for how much we owe the future.

Given their moral problems, he concludes, economic models are much less useful as guides than their proponents believe. For all their ostensible practicality—for all their attempts to skirt the paralysis-inducing specter of the apocalypse—economists, too, don’t have a good way to talk about climate change.

Years ago, a colleague and I spoke with the physicist Richard Feynman, later a national symbol of puckish wit and brash truth-telling. At the frontiers of science, he told us, hosts of unclear, mutually contradictory ideas are always swarming about. Researchers can never agree on how to proceed or even on what is important. In these circumstances, Feynman said, he always tried to figure out what would take him forward no matter which theory eventually turned out to be correct. In this agnostic spirit, let’s assume that rising carbon-dioxide levels will become a problem of some magnitude at some time and that we will want to do something practical about it. Is there something we should do, no matter what technical arcana underlie the cost-benefit analyses, no matter when we guess the bad effects from climate change will kick in, no matter how we value future generations, no matter what we think of global capitalism? Indeed, is there some course of action that makes sense even if we think that climate change isn’t much of a problem at all?

As my high-school math teacher used to say, let’s do the numbers. Roughly three-quarters of the world’s carbon-dioxide emissions come from burning fossil fuels, and roughly three-quarters of that comes from just two sources: coal in its various forms, and oil in its various forms, including gasoline. Different studies produce slightly different estimates, but they all agree that coal is responsible for more carbon dioxide than oil is—about 25 percent more. That number is likely to increase, because coal consumption is growing much faster than oil consumption.
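The rough fractions in the paragraph above can be turned into a back-of-envelope sketch. The figures below are the article's approximations (three-quarters of emissions from fossil fuels, three-quarters of that from coal plus oil, coal about 25 percent ahead of oil), treated here as assumptions rather than precise data:

```python
# Back-of-envelope shares from the text above.
fossil_share = 0.75                    # fraction of CO2 emissions from fossil fuels
coal_plus_oil = 0.75 * fossil_share    # coal + oil together: ~56% of all emissions

# If coal emits ~25% more CO2 than oil, then coal = 1.25 * oil,
# so oil = (coal + oil) / (1 + 1.25).
oil = coal_plus_oil / 2.25             # ~25% of all emissions
coal = 1.25 * oil                      # ~31% of all emissions

print(f"coal + oil: {coal_plus_oil:.0%} of global CO2 emissions")
print(f"coal alone: {coal:.0%}, oil alone: {oil:.0%}")
```

On these rough numbers, coal alone accounts for just under a third of global carbon-dioxide output, which is the arithmetic behind the essay's focus on coal in the passages that follow.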


Although coal and oil are both fossil fuels, they are used differently. In this country, for example, the great majority of oil—about three-quarters—is consumed by individuals, as they heat their homes and drive their cars. Almost all U.S. coal (93 percent) is burned not in homes but by electric-power plants; the rest is mainly used by industry, notably for making cement and steel. Cutting oil use, in other words, requires huge numbers of people to change their houses and automobiles—the United States alone has 254 million vehicles on the road. Reducing U.S. coal emissions, by contrast, means regulating 557 big power plants and 227 steel and cement factories. (Surprisingly, many smaller coal plants exist, some at hospitals and schools, but their contributions are negligible.) I’ve been whacking poor old Nordhaus for his ideas about who should pay for climate change, but he does make this point, and precisely: “The most cost-effective way to reduce CO2 emissions is to reduce the use of coal first and most sharply.” Note, too, that this policy comes with a public-health bonus: reining in coal pollution could ultimately avoid as many as 6,600 premature deaths and 150,000 children’s asthma attacks per year in the United States alone.
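The regulatory asymmetry described above is easy to make concrete. Using the article's approximate counts (254 million U.S. vehicles versus 557 big power plants and 227 steel and cement factories) as illustrative assumptions:

```python
# Point sources vs. dispersed sources, per the figures in the text.
vehicles = 254_000_000        # U.S. vehicles on the road (oil)
coal_facilities = 557 + 227   # big power plants + steel and cement factories (coal)

ratio = vehicles / coal_facilities
print(f"{coal_facilities} large coal facilities vs. {vehicles:,} vehicles")
print(f"~{ratio:,.0f} vehicles for every large coal facility")
```

Roughly 324,000 vehicles per regulated facility: cutting coal means dealing with hundreds of large, fixed installations, while cutting oil means changing the behavior behind hundreds of millions of individually owned machines.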


Different nations have different arrangements, but almost everywhere the basic point holds true: a relatively small number of industrial coal plants—perhaps 7,000 worldwide—put out an amazingly large amount of carbon dioxide, more than 40 percent of the global total. And that figure is rising; last year, coal’s share of energy production hit a 44-year high, because Asian nations are building coal plants at a fantastic rate (and, possibly, because demand for coal-fired electricity will soar as electric cars become popular). No matter what your views about the impact and import of climate change, you are primarily talking about coal. To my mind, at least, retrofitting 7,000 industrial facilities, however mind-boggling, is less mind-boggling than, say, transforming the United States into “a nation of careful, small-scale farmers” or enacting a global carbon tax with “full participation.” It is, at least, imaginable.

The Obama administration’s focus on reducing coal emissions suggests that it has followed this logic. If the pattern of the late 20th century still held, industry would reply with exaggerated estimates of the cost, and compromises would be worked out. But because the environment has become a proxy for a tribal battle, an exercise in power politics will surely ensue. I’ve given McKibben grief for his apocalyptic rhetoric, but he’s exactly correct that without a push from a popular movement—without something like—meaningful attempts to cut back coal emissions are much less likely to yield results.

Regrettably, has fixated on the Keystone pipeline, which the Congressional Research Service has calculated would raise this nation’s annual output of greenhouse gases by 0.05 to 0.3 percent. (James Hansen, in arguing that the pipeline would be “game over” for the climate, erroneously assumed that all of the tar-sands oil could be burned rapidly, instead of dribbling out in relatively small portions year by year, over decades.) None of this is to say that exploiting tar sands is a good idea, especially given the apparent violation of native treaties in Canada. But a popular movement focused on symbolic goals will have little ability to win practical battles in Washington.

If politics fail, the only recourse, says David Keith, a Harvard professor of public policy and applied physics, will be a technical fix. And soon—by mid-century. Keith is talking about geo-engineering: fighting climate change with more climate change. A Case for Climate Engineering is a short book arguing that we should study spraying the stratosphere with tiny glittering droplets of sulfuric acid that bounce sunlight back into space, reducing the Earth’s temperature. Physically speaking, the notion is feasible. The 1991 eruption of Mount Pinatubo, in the Philippines, created huge amounts of airborne sulfuric acid—and lowered the Earth’s average temperature that year by about 1 degree.

Keith is candid about the drawbacks. Not only does geo-engineering involve tinkering with planetary systems we only partially understand, it can’t cancel out, even in theory, greenhouse problems like altered rainfall patterns and increased ocean acidity. The sulfur would soon fall to the Earth, a toxic rain of pollution that could kill thousands of people every year. The carbon dioxide that was already in the air would remain. To continue to slow warming, sulfur would have to be lofted anew every year. Still, Keith points out, without this relatively crude repair, unimpeded climate change could be yet more deadly.

Planet-hacking does have an overarching advantage: it’s cheap. “The cost of geoengineering the entire planet for a decade,” Keith writes, “could be less than the $6 billion the Italian government is spending on dikes and movable barriers to protect a single city, Venice, from climate change–related sea level rise.”

That advantage is also dangerous, he points out. A single country could geo-engineer the whole planet by itself. Or one country’s geo-engineering could set off conflicts with another country—a Chinese program to increase its monsoon might reduce India’s monsoon. “Both are nuclear weapons states,” Keith reminds us. According to Forbes, the world has 1,645 billionaires, several hundred of them in nations threatened by climate change. If their businesses or homes were at risk, any one of them could single-handedly pay for a course of geo-engineering. Is anyone certain none of these people would pull the trigger?

Few experts think that relying on geo-engineering would be a good idea. But no one knows how soon reality will trump ideology, and so we may finally have hit on a useful form of alarmism. One of the virtues of Keith’s succinct, scary book is to convince the reader that unless we find a way to talk about climate change, planes full of sulfuric acid will soon be on the runway.

CONCLIMA 2013 – access videos of all the talks (Rede Clima)


Videos of all the presentations given during the 1st CONCLIMA – the National Conference of Rede CLIMA, of the INCT for Climate Change (INCT-MC) and of the FAPESP Research Program on Global Climate Change (PFPMCG), held September 9–13 in São Paulo – are now available on the Internet. Rede CLIMA has also produced a 30-minute synthesis of the entire conference.

The goal of CONCLIMA was to present the research results and the knowledge generated by these major programs and projects – an ambitious scientific undertaking created by the federal and São Paulo state governments to provide high-quality information for climate studies, the detection of climate variability and climate change, and their impacts on key sectors in Brazil.

Access the videos:

CONCLIMA video – 1st National Conference on Global Climate Change:

Presentations – PDF files

Full presentations – VIDEOS

Opening Session


Paulo Nobre – INPE

Iracema Cavalcanti – INPE

Léo Siqueira – INPE

Marcos Heil Costa – UFV

Sérgio Correa – UERJ


Tércio Ambrizzi – USP 

Eduardo Assad – Embrapa

Mercedes Bustamante – UnB


Agriculture – Hilton Silveira Pinto – Embrapa

Water Resources – Alfredo Ribeiro Neto – UFPE

Renewable Energy – Marcos Freitas – COPPE/UFRJ

Biodiversity and Ecosystems – Alexandre Aleixo – MPEG

Natural Disasters – Regina Rodrigues – UFSC

Coastal Zones – Carlos Garcia – FURG

Urbanization and Cities – Roberto do Carmo – Unicamp

Economics – Eduardo Haddad – USP

Health – Sandra Hacon – Fiocruz

Regional Development – Saulo Rodrigues Filho – UnB


The INCT for Climate Change – José Marengo – INPE

Detection, attribution and natural climate variability – Simone Ferraz – UFSM

Land-use change – Ana Paula Aguiar – INPE

Global Biogeochemical Cycles and Biodiversity – Mercedes Bustamante – UnB

Oceans – Regina Rodrigues – UFSC

REDD – Osvaldo Stella – IPAM

Future Climate Scenarios and Uncertainty Reduction – José Marengo – INPE

Greenhouse Gases – Plínio Alvalá – INPE

Science, technology and public policy studies – Myanna Lahsen – INPE

Biosphere–atmosphere interactions – Gilvan Sampaio – INPE

Amazonia – Gilberto Fisch – IAE/DCTA


Early Warning System for Emerging Infectious Diseases in Western Amazonia – Manuel Cesario – Unifran

Climate and population in a region of tension between high urbanization and high biodiversity: social and ecological dimensions of climate change – Lucia da Costa Ferreira – Unicamp

Scenarios of climate-change impacts on ethanol production for the definition of public policies – Jurandir Zullo – Unicamp

Hydrological fluxes and carbon fluxes – cases from the Amazon Basin and the reforestation of small watersheds – Humberto Rocha – USP

The role of rivers in the regional carbon balance – Maria Victoria Ballester – USP

Atmospheric aerosols, radiation balance, clouds and trace gases associated with land-use change in Amazonia – Paulo Artaxo – USP

Socio-economic impacts of climate change in Brazil: quantitative inputs for the design of public policies – Joaquim José Martins Guilhoto and Rafael Feltran Barbieri – USP

Carbon dioxide emissions from soils in sugarcane areas under different management strategies – Newton La Scala Jr – Unesp

Impact of the Southwest Atlantic Ocean on the climate of South America over the 20th and 21st centuries – Tércio Ambrizzi – USP


Presentation by Sergio Margulis – SAE – Presidency of the Republic

Presentation by Gustavo Luedemann (MCTI)

Presentation by Carlos Klink (SMCQ/MMA)

Presentation by Couto Silva (MMA): status of the drafting of the National Adaptation Plan; operation of the Adaptation Working Group and its thematic networks; proposed timeline; proposed structure of the plan.

Presentation by Alexandre Gross (FGV): thematic scopes of the National Adaptation Plan: presentation of the report on the temporal, spatial and thematic dimensions of adaptation to climate change (Deliverable 4), process and results of the Adaptation Working Group, collection of contributions, and discussion.

Round table: Climate change, extremes and natural disasters

Presentation by Rafael Schadeck – CENAD

Presentation by Marcos Airton de Sousa Freitas – ANA

Round table: The relationship between science and sectoral plans; public policy

Presentation by Carlos Nobre – SEPED/MCTI

Presentation by Luiz Pinguelli Rosa (COPPE UFRJ, FBMC)

Presentation by Eduardo Viola – UnB

Round table: Inventories and monitoring of GHG emissions and removals

Presentation by Gustavo Luedemann – MCTI


Presentation by Patrícia Pinho – IGBP/INPE

Presentation by Paulo Artaxo – USP

John Oliver Does Science Communication Right (I Fucking Love Science)

May 15, 2014 | by Stephen Luntz

photo credit: Last Week Tonight With John Oliver (HBO). Satirist John Oliver shows how scientific pseudo-debates should be covered

One of the most frustrating experiences scientists, science communicators and anyone who cares about science have is the sight of media outlets giving equal time to positions held by a tiny minority of researchers.

This sort of behavior turns up for all sorts of concocted “controversies”, satirized as “Opinions differ on the shape of the Earth”. However, the most egregious examples occur in reporting on climate change. Thousands of carefully researched, peer-reviewed papers are weighed in the balance and judged equal to a handful of shoddily written, numerically flaky publications whose flaws take less than a day to come to light.

That is, of course, if you ignore the places where the anti-science side pretty much gets free rein.

So it is a delight to see John Oliver show how it should be done.

We have only one problem with Oliver’s work. He repeats the claim that 97% of climate scientists agree that humans are warming the planet. In fact, the study he referred to found that 97.1% of peer-reviewed papers on climate change endorse this position. However, those papers were usually produced by large research teams, while the opposing minority were often cooked up by a couple of kooks in their garage. When you count the scientists involved, the split is actually 98.4% to 1.2%, with the rest undecided. That might not sound like a big difference, but it would make Oliver’s tame “skeptic” look even lonelier.
HT Vox, with a nice summary of the evidence


Brazil hosts conference on public communication of science (Fapesp)

Issues related to both the practice of and research on science communication are part of the program of the event, to be held in Salvador; discounted registration is open until April 7 (PCST)


Agência FAPESP – The International Network on Public Communication of Science and Technology (PCST) will hold its international conference in Latin America for the first time, from May 5 to 8. Among the conference’s main goals is to promote debate on public engagement with topics related to science and technology.

Made up of individuals from various parts of the world who work and do research in the public communication of science and technology, the PCST Network sponsors conferences, online discussions and other activities that promote new ideas and perspectives.

This year’s edition has as its central theme “Science communication for social inclusion and political engagement”. The idea is to show that, even with investment in science and technology, most of the world still faces social exclusion and uneven development, increasingly separating rich countries from poor ones. The role of science communication is to create possibilities for citizen action that can help reduce this gap.

In the plenary session “Science communication and social media”, participants will hear talks by Dominique Brossard, of the University of Wisconsin, United States, a member of the PCST Network’s scientific committee, and Mohammed Yahia, editor of Nature Middle East, from Egypt.

Besides the international PCST Network, the event is organized by the Museu da Vida of the Oswaldo Cruz Foundation (Fiocruz) and by the Laboratório de Estudos Avançados em Jornalismo (Labjor) of the University of Campinas (Unicamp).

Discounted registration is available until April 7. The sessions will take place at the Hotel Pestana Bahia, Rua Fonte do Boi, 216, in Salvador.

More information

Against storytelling of scientific results (Nature Methods)

Yarden Katz

Nature Methods 10, 1045 (2013) doi:10.1038/nmeth.2699 – Published online

30 October 2013

To the Editor:

Krzywinski and Cairo1 beautifully illustrate the widespread view that scientific writing should follow a journalistic ‘storytelling’ approach, wherein the choice of what data to plot, and how, is tailored to the message the authors want to deliver. However, they do not discuss the pitfalls of the approach, which often result in a distorted and unrepresentative display of data—one that does not do justice to experimental complexities and their myriad interpretations.

If we project the features of great storytellers onto a scientist, the result is a portrait of a scientist far from ideal. Great storytellers embellish and conceal information to evoke a response in their audience. Inconvenient truths are swept away, and marginalities are spun to make a point more spectacular. A storyteller would plot the data in the way most persuasive rather than most informative or representative.

Storytelling encourages the unrealistic view that scientific projects fit a singular narrative. Biological systems are difficult to measure and control, so nearly all experiments afford multiple interpretations—but storytelling actively denies this fact of science.

The ‘story-told’ scientific paper is a constrictive mapping between figures and text. Figures produced by masters of scientific storytelling are so tightly controlled to match the narrative that the reader is left with little to ponder or interpret. Critical reading of such papers becomes a detective’s game, in which one reads between the lines for clues of data relegated to a supplement for their deviance from ‘the story’.

Dissecting the structure of scientific papers, Bruno Latour explains the utility of the storytelling approach in giving readers the sense that they are evaluating the data along with the authors while simultaneously persuading them of the story. The storytelling way to achieve this is “to lay out the text so that wherever the reader is there is only one way to go”2—or as Krzywinski and Cairo put it, “Inviting readers to draw their own conclusions is risky”1. Authors prevent this by “carefully stacking more black boxes, less easily disputable arguments”2. This is consistent with the visualization advice that Krzywinski and Cairo give: the narrower and more processed the display of the data is to fit the story, the more black boxes are stacked, making it harder for the reader to access data raw enough to support alternative models or ‘stories’.

Readers and authors know that complex experiments afford multiple interpretations, and so such deviances from the singular narrative must be present somewhere. It would be better for both authors and readers if these could be discussed openly rather than obfuscated. For those who plan to follow up on the results, these discrepancies are often the most important. Storytelling therefore impedes communication of critical information by restricting the scope of the data to that agreeable with the story.

Problems arise when experiments are driven within a storytelling framework. In break rooms of biology research labs, one often hears: “It’d be a great story if X regulated Y by novel mechanism Z.” Experiments might be prioritized by asking, “Is it important for your story?” Storytelling poses a dizzying circularity: before your findings are established, you should decide whether these are the findings you would like to reach. Expectations of a story-like narrative can also be demoralizing to scientists, as most experimental data do not easily fold into this framing.

Finally, a great story in the journalistic sense is a complete one. Papers that make the unexplained observations transparent get penalized in the storytelling framework as incomplete. This prevents the communal puzzle-solving that arises by piecing together unexplained observations from multiple papers.

The alternative to storytelling is the usual language of evidence and arguments that are used—with varying degrees of certainty—to support models and theories. Speaking of models and their evidence goes back to the oldest of scientific discourse, and this framing is also standard in philosophy and law. This language allows authors to discuss evidence for alternative models without imposing a singular journalistic-like story.

There might be other roles for storytelling. Steven McKnight’s lab recently found, entirely unexpectedly, that a small molecule can be used to purify a complex of RNA-binding proteins in the cell, revealing a wide array of striking biological features3. It is that kind of story of discovery—what François Jacob called “night science”—that is often best suited for storytelling, though these narratives are often deemed by scientists as irrelevant ‘fluff’.

As practiced, storytelling shares more with journalism than with science. Journalists seek a great story, and the accompanying pressures sometimes lead to distortion in the portrayal of events in the press. When exerted on scientists, these pressures can yield similar results. Storytelling encourages scientists to design experiments according to what constitutes a ‘great story’, potentially closing off unforeseen avenues more exciting than any story imagined a priori. For the alternative framing to be adopted, editors, reviewers and authors (particularly at the higher-profile journals) will have to adjust their evaluation criteria and reward authors who choose representative displays while discussing alternative models to their own.


  1. Krzywinski, M. & Cairo, A. Nat. Methods 10, 687 (2013).
  2. Latour, B. Science in Action (Harvard Univ. Press, 1987).
  3. Baker, M. Nat. Methods 9, 639 (2012).

When journalists talk about science (Fapesp)

Martin Bauer was one of the speakers on the FAPESP Week London panel on scientific culture (photo: LSE)


By Fernando Cunha, from London

Agência FAPESP – The portrait journalists paint of science, the tension that arises when reporters try to explain to the public what science does, and the way science appears in fiction (sometimes in stereotyped form) were among the issues discussed in the panel on scientific culture, held on Friday (27 September), the last day of FAPESP Week London 2013.

Martin Bauer, of the London School of Economics, spoke about a research project he coordinates that will create indicators to gauge how mobilised the scientific world is to communicate science. Controversial topics such as transgenics and climate change, which provoke political mobilisation, are always accompanied by an educational moment, according to the researcher.

The project proposes building, by the end of 2015, a system of indicators based on parameters such as: public attention to scientific information, readers' aspirations in terms of well-being, the perceived contribution of science to public culture, and the relative distance between the person receiving scientific information and science itself.

"We are interested in content analysis to understand the effectiveness of the language used and, in particular, in the concept of the authority of science to convey research content to the public," said Bauer.

The planned indicators are grouped into three types: public perception of science; the quantitative presence of science in the press and in the media's cultural output, covering the different sections of newspapers as well as radio and TV programming (telenovelas included); and the scientific subjects addressed.

The research and data collection will be carried out mainly in European countries and in India, and will also include Brazil, in collaboration with the Laboratory for Advanced Studies in Journalism at the University of Campinas (Labjor/Unicamp) and the Oswaldo Cruz Foundation (Fiocruz).

Public interest

Marcelo Leite, a reporter for the newspaper Folha de S. Paulo, addressed science journalism and the public perception of science in Brazil based on an analysis of published articles on global climate change.

"I am convinced that science journalism can serve as a model that all forms of journalism could and should follow, because it goes beyond opinions, beliefs and ideologies to get as close as we can hope to the truth," said Leite. "Science journalism teaches readers about the process of scientific research, trying to dispel the mistaken perception that scientists hold eternal truths," he said.

For the journalist, the presence of climate change in the media and, in particular, press coverage of the contents of the fifth report of the United Nations (UN) Intergovernmental Panel on Climate Change (IPCC), released in late September, offer an interesting opportunity to analyse both the coverage and the public perception of these contents.

A survey conducted by Brazil's Ministry of Science, Technology and Innovation, completed in 2010, showed that the Brazilian public's attention to science has been rising, reaching 65% of the citizens polled.

"On the topic of the environment, interest rose from 58% in 2006 to 82% in 2010, probably owing to issues related to the Amazon and climate change, which seem to attract most people," he said.

Regarding press coverage of climate-related topics, the survey found a concentration of articles on the political side, focused on negotiations and on mitigating the consequences of greenhouse gas emissions (50% of the articles), and on deforestation (23%).

The panel also featured Philip Macnaghten, of Durham University and a visiting professor at Unicamp, who discussed scenarios for responsible research and innovation in Brazil, and Maria Immacolata Vassalo de Lopes, a professor at the School of Communications and Arts (ECA) of the University of São Paulo (USP), who spoke about the experience of creating Obitel, an Ibero-American network for the study of television fiction formed in 2005 in Colombia, with researchers from 12 countries, including Brazil.

The project received support from FAPESP, USP, the National Council for Scientific and Technological Development (CNPq), Globo Universidade (TV Globo) and Ibope.

Held by FAPESP in the British capital from 25 to 27 September, with support from the Royal Society and the British Council, FAPESP Week London promoted discussion of advanced research topics to expand opportunities for collaboration between Brazilian and European scientists in the fields of Biodiversity, Climate Change, Health Sciences, Bioenergy, Nanotechnology and Communication.