Monthly archive: October 2011

The world at seven billion (BBC)

27 October 2011 Last updated at 23:08 GMT

File photograph of newborn babies in Lucknow, India, in July 2009

As the world population reaches seven billion people, the BBC’s Mike Gallagher asks whether efforts to control population have been, as some critics claim, a form of authoritarian control over the world’s poorest citizens.

The temperature is some 30C. The humidity stifling, the noise unbearable. In a yard between two enormous tea-drying sheds, a number of dark-skinned women patiently sit, each accompanied by an unwieldy looking cloth sack. They are clad in colourful saris, but look tired and shabby. This is hardly surprising – they have spent most of the day in nearby plantation fields, picking tea that will net them around two cents a kilo – barely enough to feed their large families.

Vivek Baid thinks he knows how to help them. He runs the Mission for Population Control, a project in eastern India which aims to bring down high birth rates by encouraging local women to get sterilised after their second child.

As the world reaches an estimated seven billion people, people like Vivek say efforts to bring down the world’s population must continue if life on Earth is to be sustainable, and if poverty and even mass starvation are to be avoided.

There is no doubting their good intentions. Vivek, for instance, has spent his own money on the project, and is passionate about creating a brighter future for India.

But critics allege that campaigners like Vivek – a successful and wealthy male businessman – have tended to live very different lives from those they seek to help, who are mainly poor women.

These critics argue that rich people have imposed population control on the poor for decades. And, they say, such coercive attempts to control the world’s population often backfired and were sometimes harmful.

Population scare

Most historians of modern population control trace its roots back to the Reverend Thomas Malthus, an English clergyman born in the 18th Century who believed that humans would always reproduce faster than Earth’s capacity to feed them.

Giving succour to the resulting desperate masses would only imperil everyone else, he said. So the brutal reality was that it was better to let them starve.

‘Plenty is changed into scarcity’

Thomas Malthus

From Thomas Malthus’ Essay on Population, 1803 edition:

A man who is born into a world already possessed – if he cannot get subsistence from his parents on whom he has a just demand, and if the society do not want his labour, has no claim of right to the smallest portion of food.

At nature’s mighty feast there is no vacant cover for him. She tells him to be gone, and will quickly execute her own orders, if he does not work upon the compassion of some of her guests. If these guests get up and make room for him, other intruders immediately appear demanding the same favour. The plenty that before reigned is changed into scarcity; and the happiness of the guests is destroyed by the spectacle of misery and dependence in every part of the hall.

Rapid agricultural advances in the 19th Century proved his main premise wrong, because food production generally more than kept pace with the growing population.

But the idea that the rich are threatened by the desperately poor has cast a long shadow into the 20th Century.

From the 1960s, the World Bank, the UN and a host of independent American philanthropic foundations, such as the Ford and Rockefeller foundations, began to focus on what they saw as the problem of burgeoning Third World numbers.

They believed that overpopulation was the primary cause of environmental degradation, economic underdevelopment and political instability.

Massive populations in the Third World were seen as presenting a threat to Western capitalism and access to resources, says Professor Betsy Hartmann of Hampshire College, Massachusetts, in the US.

“The view of the south is very much put in this Malthusian framework. It becomes just this powerful ideology,” she says.

In 1966, President Lyndon Johnson warned that the US might be overwhelmed by desperate masses, and he made US foreign aid dependent on countries adopting family planning programmes.

Other wealthy countries such as Japan, Sweden and the UK also began to devote large amounts of money to reducing Third World birth rates.

‘Unmet need’

What virtually everyone agreed was that there was a massive demand for birth control among the world’s poorest people, and that if they could get their hands on reliable contraceptives, runaway population growth might be stopped.

But with the benefit of hindsight, some argue that this so-called unmet need theory put disproportionate emphasis on birth control and ignored other serious needs.

Graph of world population figures

“It was a top-down solution,” says Mohan Rao, a doctor and public health expert at Delhi’s Jawaharlal Nehru University.

“There was an unmet need for contraceptive services, of course. But there was also an unmet need for health services and all kinds of other services which did not get attention. The focus became contraception.”

Had the demographic experts worked at the grass-roots instead of imposing solutions from above, suggests Adrienne Germain, formerly of the Ford Foundation and then the International Women’s Health Coalition, they might have achieved a better picture of the dilemmas facing women in poor, rural communities.

“Not to have a full set of health services meant women were either unable to use family planning, or unwilling to – because they could still expect half their kids to die by the age of five,” she says.

India’s sterilisation ‘madness’

File photograph of Sanjay and Indira Gandhi in 1980

Indira Gandhi and her son Sanjay (above) presided over a mass sterilisation campaign. From the mid-1970s, Indian officials were set sterilisation quotas, and sought to ingratiate themselves with superiors by exceeding them. Stories abounded of men being accosted in the street and taken away for the operation. The head of the World Bank, Robert McNamara, congratulated the Indian government on “moving effectively” to deal with high birth rates. Funding was increased, and the sterilising went on.

In Delhi, some 700,000 slum dwellers were forcibly evicted, and given replacement housing plots far from the city centre, frequently on condition that they were either sterilised or produced someone else for the operation. In poorer agricultural areas, whole villages were rounded up for sterilisation. When residents of one village protested, an official is said to have threatened air strikes in retaliation.

“There was a certain madness,” recalls Nina Puri of the Family Planning Association of India. “All rationality was lost.”

Us and them

In 1968, the American biologist Paul Ehrlich caused a stir with his bestselling book, The Population Bomb, which suggested that it was already too late to save some countries from the dire effects of overpopulation, which would result in ecological disaster and the deaths of hundreds of millions of people in the 1970s.

Instead, governments should concentrate on drastically reducing population growth. He said financial assistance should be given only to those nations with a realistic chance of bringing birth rates down. Compulsory measures were not to be ruled out.

Western experts and local elites in the developing world soon imposed targets for reductions in family size, and used military analogies to drive home the urgency, says Matthew Connelly, a historian of population control at Columbia University in New York.

“They spoke of a war on population growth, fought with contraceptive weapons,” he says. “The war would entail sacrifices, and collateral damage.”

Such language betrayed a lack of empathy with their subjects, says Ms Germain: “People didn’t talk about people. They talked of acceptors and users of family planning.”

Emergency measures

Critics of population control had their say at the first ever UN population conference in 1974.

Karan Singh, India’s health minister at the time, declared that “development is the best contraceptive”.

But just a year later, Mr Singh’s government presided over one of the most notorious episodes in the history of population control.

In June 1975, the Indian premier, Indira Gandhi, declared a state of emergency after accusations of corruption threatened her government. Her son Sanjay used the measure to introduce radical population control measures targeted at the poor.

The Indian emergency lasted less than two years, but in 1975 alone, some eight million Indians – mainly poor men – were sterilised.

Yet, for all the official programmes and coercion, many poor women kept on having babies.

And where they did not, it arguably had less to do with coercive population control than with development, just as Karan Singh had argued in 1974, says historian Matthew Connelly.

For example, in India, a disparity in birth rates could already be observed between the impoverished northern states and more developed southern regions like Kerala, where women were more likely to be literate and educated, and their offspring more likely to be healthy.

Women there realised that they could have fewer births and still expect to see their children survive into adulthood.

China: ‘We will not allow your baby to live’

Steven Mosher was a Stanford University anthropologist working in rural China who witnessed some of the early, disturbing moments of Beijing’s One Child Policy.

“I remember very well the evening of 8 March, 1980. The local Communist Party official in charge of my village came over waving a government document. He said: ‘The Party has decided to impose a cap of 1% on population growth this year.’ He said: ‘We’re going to decide who’s going to be allowed to continue their pregnancy and who’s going to be forced to terminate their pregnancy.’ And that’s exactly what they did.”

“These were women in the late second and third trimester of pregnancy. There were several women just days away from giving birth. And in my hearing, a party official said: ‘Do not think that you can simply wait until you go into labour and give birth, because we will not allow your baby to live. You will go home alone’.”

Total control

By now, this phenomenon could be observed in another country too – one that would nevertheless go on to impose the most draconian population control of all.

The One Child Policy is credited with preventing some 400 million births in China, and remains in place to this day. In 1983 alone, more than 16 million women and four million men were sterilised, and 14 million women received abortions.

Assessed by numbers alone, it is said to be by far the most successful population control initiative. Yet it remains deeply controversial, not only because of the human suffering it has caused.

A few years after its inception, the policy was relaxed slightly to allow rural couples two children if their first was not a boy. Boy children are prized, especially in the countryside where they provide labour and care for parents in old age.

But modern technology allows parents to discover the sex of the foetus, and many choose to abort if they are carrying a girl. In some regions, there is now a serious imbalance between men and women.

Moreover, since Chinese fertility was already in decline at the time the policy was implemented, some argue that it bears less responsibility for China’s falling birth rate than its supporters claim.

“I don’t think they needed to bring it down further,” says Indian demographer AR Nanda. “It would have happened at its own slow pace in another 10 years.”

Backlash

In the early 1980s, objections to the population control movement began to grow, especially in the United States.

In Washington, the new Reagan administration removed financial support for any programmes that involved abortion or sterilisation.

“If you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society”

Adrienne Germain

The broad alliance to stem birth rates was beginning to dissolve, and the debate became more polarised along political lines.

While some on the political right had moral objections to population control, some on the left saw it as neo-colonialism.

Faith groups condemned it as a Western attack on religious values, but women’s groups feared changes would mean poor women would be even less well-served.

By the time of a major UN conference on population and development in Cairo in 1994, women’s groups were ready to strike a blow for women’s rights, and they won.

The conference adopted a 20-year plan of action, known as the Cairo consensus, which called on countries to recognise that ordinary women’s needs – rather than demographers’ plans – should be at the heart of population strategies.

After Cairo

Today’s record-breaking global population hides a marked long-term trend towards lower birth rates, as urbanisation, better health care, education and access to family planning all affect women’s choices.

With the exception of sub-Saharan Africa and some of the poorest parts of India, we are now having fewer children than we once did – in some cases, failing even to replace ourselves in the next generation. And although total numbers are set to rise still further, the peak is now in sight.

Chinese poster from the 1960s of mother and baby, captioned: "Practising birth control is beneficial for the protection of the health of mother and child." China promoted birth control before implementing its one-child policy.

Assuming that this trend continues, total numbers will one day level off, and even fall. As a result, some believe the sense of urgency that once surrounded population control has subsided.

The term population control itself has fallen out of fashion, as it was deemed to have authoritarian connotations. Post-Cairo, the talk is of women’s rights and reproductive rights, meaning the right to a free choice over whether or not to have children.

According to Adrienne Germain, that is the main lesson we should learn from the past 50 years.

“I have a profound conviction that if you give women the tools they need – education, employment, contraception, safe abortion – then they will make the choices that benefit society,” she says.

“If you don’t, then you’ll just be in an endless cycle of trying to exert control over fertility – to bring it up, to bring it down, to keep it stable. And it never comes out well. Never.”

Nevertheless, there remain to this day schemes to sterilise the less well-off, often in return for financial incentives. In effect, say critics, this amounts to coercion, since the very poor find it hard to reject cash.

“The people proposing this argue ‘Don’t worry, everything’s fine now we have voluntary programmes on the Cairo model’,” says Betsy Hartmann.

“But what they don’t understand is the profound difference in power between rich and poor. The people who provide many services in poor areas are already prejudiced against the people they serve.”

Work in progress

For Mohan Rao, it is an example of how even the Cairo consensus fails to take account of the developing world.

“Cairo had some good things,” he says. “However Cairo was driven largely by First World feminist agendas. Reproductive rights are all very well, but [there needs to be] a whole lot of other kinds of enabling rights before women can access reproductive rights. You need rights to food, employment, water, justice and fair wages. Without all these you cannot have reproductive rights.”

Perhaps, then, the humanitarian ideals of Cairo are still a work in progress.

Meanwhile, Paul Ehrlich has also amended his view of the issue.

If he were to write his book today, “I wouldn’t focus on the poverty-stricken masses”, he told the BBC.

“I would focus on there being too many rich people. It’s crystal clear that we can’t support seven billion people in the style of the wealthier Americans.”

Mike Gallagher is the producer of the radio programme Controlling People on BBC World Service

Where do you fit into 7 billion?

The world’s population is expected to hit seven billion in the next few weeks. After growing very slowly for most of human history, the number of people on Earth has more than doubled in the last 50 years.


The world’s population will reach 7 billion at the end of October. Don’t panic (The Economist)

Demography

A tale of three islands

Oct 22nd 2011 | from the print edition


IN 1950 the whole population of the earth—2.5 billion—could have squeezed, shoulder to shoulder, onto the Isle of Wight, a 381-square-kilometre rock off southern England. By 1968 John Brunner, a British novelist, observed that the earth’s people—by then 3.5 billion—would have required the Isle of Man, 572 square kilometres in the Irish Sea, for their standing room. Brunner forecast that by 2010 the world’s population would have reached 7 billion, and would need a bigger island. Hence the title of his 1968 novel about over-population, “Stand on Zanzibar” (1,554 square kilometres off east Africa).
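As a rough check on these standing-room comparisons, the implied crowd density on each island can be worked out from the areas quoted above (a back-of-envelope sketch; the figures are the ones given in the article, not independent measurements):

```python
# Back-of-envelope check of the "standing room" comparisons.
# Areas (km^2) and populations are the figures quoted in the article.
islands = [
    ("Isle of Wight", 381, 2.5e9),   # world population in 1950
    ("Isle of Man",   572, 3.5e9),   # world population in 1968
    ("Zanzibar",     1554, 7.0e9),   # world population in 2011
]

for name, area_km2, population in islands:
    area_m2 = area_km2 * 1_000_000       # 1 km^2 = 1,000,000 m^2
    density = population / area_m2        # people per square metre
    print(f"{name}: {density:.1f} people per square metre")
# Isle of Wight: 6.6 people per square metre
# Isle of Man: 6.1 people per square metre
# Zanzibar: 4.5 people per square metre
```

All three come out in the range of four to seven people per square metre, which is indeed a tightly packed, shoulder-to-shoulder crowd.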

Brunner’s prediction was only a year out. The United Nations’ population division now says the world will reach 7 billion on October 31st 2011 (America’s Census Bureau delays the date until March 2012). The UN will even identify someone born that day as the world’s 7 billionth living person. The 6 billionth, Adnan Nevic, was born on October 12th 1999 in Sarajevo, in Bosnia. He will be just past his 12th birthday when the next billion clicks over.

That makes the world’s population look as if it is rising as fast as ever. It took 250,000 years to reach 1 billion, around 1800; over a century more to reach 2 billion (in 1927); and 32 years more to reach 3 billion. But to rise from 5 billion (in 1987) to 6 billion took only 12 years; and now, another 12 years later, it is at 7 billion (see chart 1). By 2050, the UN thinks, there will be 9.3 billion people, requiring an island the size of Tenerife or Maui to stand on.
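The slowdown described below is easier to see when these milestone dates are turned into intervals. A short illustrative sketch, using only the years stated in the article (the 3 billion date is 1927 plus 32 years; the 8 and 9 billion dates apply the UN's projected intervals of 14 and 18 years):

```python
# Years in which each billion milestone was, or is projected to be, reached,
# taken from the figures quoted in the article. The 4 billion milestone is
# not dated in the text, so the 3 -> 5 step below spans two billions.
years = {1: 1800, 2: 1927, 3: 1959, 5: 1987, 6: 1999, 7: 2011, 8: 2025, 9: 2043}

billions = sorted(years)
for prev, nxt in zip(billions, billions[1:]):
    gap = years[nxt] - years[prev]
    print(f"{prev} -> {nxt} billion: {gap} years")
```

The intervals shrink from 127 years down to 12, then lengthen again to 14 and 18: the first time in modern history that a billion milestone is projected to take longer to reach than the one before.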

Odd though it seems, however, the growth in the world’s population is actually slowing. The peak of population growth was in the late 1960s, when the total was rising by almost 2% a year. Now the rate is half that. The last time it was so low was in 1950, when the death rate was much higher. The result is that the next billion people, according to the UN, will take 14 years to arrive, the first time that a billion milestone has taken longer to reach than the one before. The billion after that will take 18 years.

Once upon a time, the passing of population milestones might have been cause for celebration. Now it gives rise to jeremiads. As Hillary Clinton’s science adviser, Nina Fedoroff, told the BBC in 2009, “There are probably already too many people on the planet.” But the notion of “too many” is more flexible than it seems. The earth could certainly not support 10 billion hunter-gatherers, who used much more land per head than modern farm-fed people do. But it does not have to. The earth might well not be able to support 10 billion people if they had exactly the same impact per person as 7 billion do today. But that does not necessarily spell Malthusian doom, because the impact humans have on the earth and on each other can change.

For most people, the big questions about population are: can the world feed 9 billion mouths by 2050? Are so many people ruining the environment? And will those billions, living cheek-by-jowl, go to war more often? On all three counts, surprising as it seems, reducing population growth any more quickly than it is falling anyway may not make much difference.

Start with the link between population and violence. It seems plausible that the more young men there are, the more likely they will be to fight. This is especially true when groups are competing for scarce resources. Some argue that the genocidal conflict in Darfur, western Sudan, was caused partly by high population growth, which led to unsustainable farming and conflicts over land and water. Land pressure also influenced the Rwandan genocide of 1994, as migrants in search of a livelihood in one of the world’s most densely populated countries moved into already settled areas, with catastrophic results.

But there is a difference between local conflicts and what is happening on a global scale. Although the number of sovereign states has increased almost as dramatically as the world’s population over the past half-century, the number of wars between states fell fairly continuously during the period. The number of civil wars rose, then fell. The number of deaths in battle fell by roughly three-quarters. These patterns do not seem to be influenced either by the relentless upward pressure of population, or by the slackening of that pressure as growth decelerates. The difference seems to have been caused by fewer post-colonial wars, the ending of cold-war alliances (and proxy wars) and, possibly, the increase in international peacekeepers.

More people, more damage?

Human activity has caused profound changes to the climate, biodiversity, oceanic acidity and greenhouse-gas levels in the atmosphere. But it does not automatically follow that the more people there are, the worse the damage. In 2007 Americans and Australians emitted almost 20 tonnes of carbon dioxide each. In contrast, more than 60 countries—including the vast majority of African ones—emitted less than 1 tonne per person.

This implies that population growth in poorer countries (where it is concentrated) has had a smaller impact on the climate in recent years than the rise in the population of the United States (up by over 50% in 1970-2010). Most of the world’s population growth in the next 20 years will occur in countries that make the smallest contribution to greenhouse gases. Global pollution will be more affected by the pattern of economic growth—and especially whether emerging nations become as energy-intensive as America, Australia and China.

Population growth does make a bigger difference to food. All things being equal, it is harder to feed 7 billion people than 6 billion. According to the World Bank, between 2005 and 2055 agricultural productivity will have to increase by two-thirds to keep pace with rising population and changing diets. Moreover, according to the bank, if the population stayed at 2005 levels, farm productivity would have to rise by only a quarter, so more future demand comes from a growing population than from consumption per person.

Increasing farm productivity by a quarter would obviously be easier than boosting it by two-thirds. But even a rise of two-thirds is not as much as it sounds. From 1970-2010 farm productivity rose far more than this, by over three-and-a-half times. The big problem for agriculture is not the number of people, but signs that farm productivity may be levelling out. The growth in agricultural yields seems to be slowing down. There is little new farmland available. Water shortages are chronic and fertilisers are over-used. All these—plus the yield-reductions that may come from climate change, and wastefulness in getting food to markets—mean that the big problems are to do with supply, not demand.

None of this means that population does not matter. But the main impact comes from relative changes—the growth of one part of the population compared with another, for example, or shifts in the average age of the population—rather than the absolute number of people. Of these relative changes, falling fertility is most important. The fertility rate is the number of children a woman can expect to have. At the moment, almost half the world’s population—3.2 billion—lives in countries with a fertility rate of 2.1 or less. That number, the so-called replacement rate, is usually taken to be the level at which the population eventually stops growing.

The world’s decline in fertility has been staggering (see chart 2). In 1970 the total fertility rate was 4.45 and the typical family in the world had four or five children. It is now 2.45 worldwide, and lower in some surprising places. Bangladesh’s rate is 2.16, having halved in 20 years. Iran’s fertility fell from 7 in 1984 to just 1.9 in 2006. Countries with below-replacement fertility include supposedly teeming Brazil, Tunisia and Thailand. Much of Europe and East Asia have fertility rates far below replacement levels.

The fertility fall is releasing wave upon wave of demographic change. It is the main influence behind the decline of population growth and, perhaps even more important, is shifting the balance of age groups within a population.

When gold turns to silver

A fall in fertility sends a sort of generational bulge surging through a society. The generation in question is the one before the fertility fall really begins to bite, which in Europe and America was the baby-boom generation that is just retiring, and in China and East Asia the generation now reaching adulthood. To begin with, the favoured generation is in its childhood; countries have lots of children and fewer surviving grandparents (who were born at a time when life expectancy was lower). That was the situation in Europe in the 1950s and in East Asia in the 1970s.

But as the select generation enters the labour force, a country starts to benefit from a so-called “demographic dividend”. This happens when there are relatively few children (because of the fall in fertility), relatively few older people (because of higher mortality previously), and lots of economically active adults, including, often, many women, who enter the labour force in large numbers for the first time. It is a period of smaller families, rising income, rising life expectancy and big social change, including divorce, postponed marriage and single-person households. This was the situation in Europe between 1945 and 1975 (“les trente glorieuses”) and in much of East Asia in 1980-2010.

But there is a third stage. At some point, the gilded generation turns silver and retires. Now the dividend becomes a liability. There are disproportionately more old people depending upon a smaller generation behind them. Population growth stops or goes into reverse, parts of a country are abandoned by the young and the social concerns of the aged grow in significance. This situation already exists in Japan. It is arriving fast in Europe and America, and soon after that will reach East Asia.

A demographic dividend tends to boost economic growth because a large number of working-age adults increases the labour force, keeps wages relatively low, boosts savings and increases demand for goods and services. Part of China’s phenomenal growth has come from its unprecedentedly low dependency ratio—just 38 (this is the number of dependents, children and people over 65, per 100 working adults; it implies the working-age group is almost twice as large as the rest of the population put together). One study by Australia’s central bank calculated that a third of East Asia’s GDP growth in 1965-90 came from its favourable demography. About a third of America’s GDP growth in 2000-10 also came from its increasing population.
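The dependency ratio defined above reduces to a one-line formula, which also makes clear what China's figure of 38 implies about its working-age share (an illustrative sketch; the age boundaries and the figure of 38 are those quoted in the text):

```python
def dependency_ratio(children, over_65, working_age):
    """Dependents (children plus people over 65) per 100 working-age adults."""
    return 100 * (children + over_65) / working_age

# China's quoted ratio of 38 means 38 dependents per 100 workers, so the
# working-age share of the population is 100 / (100 + 38):
working_share = 100 / (100 + 38)
print(f"{working_share:.0%}")  # roughly 72%
```

On the same formula, the world's ratio falling from 75 dependents per 100 workers in 1970 to 52 in 2010 corresponds to the working-age share rising from about 57% to about 66% of the population.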

The world as a whole reaped a demographic dividend in the 40 years to 2010. In 1970 there were 75 dependents for every 100 adults of working age. In 2010 the number of dependents dropped to just 52. Huge improvements were registered not only in China but also in South-East Asia and north Africa, where dependency ratios fell by 40 points. Even “ageing” Europe and America ended the period with fewer dependents than at the beginning.

A demographic dividend does not automatically generate growth. It depends on whether the country can put its growing labour force to productive use. In the 1980s Latin America and East Asia had similar demographic patterns. But while East Asia experienced a long boom, Latin America endured its “lost decade”. One of the biggest questions for Arab countries, which are beginning to reap their own demographic dividends, is whether they will follow East Asia or Latin America.

But even if demography guarantees nothing, it can make growth harder or easier. National demographic inheritances therefore matter. And they differ a lot.

Where China loses

Hania Zlotnik, the head of the UN’s Population Division, divides the world into three categories, according to levels of fertility (see map). About a fifth of the world lives in countries with high fertility—3 or more. Most are Africans. Sub-Saharan Africa, for example, is one of the fastest-growing parts of the world. In 1975 it had half the population of Europe. It overtook Europe in 2004, and by 2050 there will be just under 2 billion people there compared with 720m Europeans. About half of the 2.3 billion increase in the world’s population over the next 40 years will be in Africa.

The rest of the world is more or less equally divided between countries with below-replacement fertility (less than 2.1) and those with intermediate fertility (between 2.1 and 3). The first group consists of Europe, China and the rest of East Asia. The second comprises South and South-East Asia, the Middle East and the Americas (including the United States).

The low-fertility countries face the biggest demographic problems. The elderly share of Japan’s population is already the highest in the world. By 2050 the country will have almost as many dependents as working-age adults, and half the population will be over 52. This will make Japan the oldest society the world has ever known. Europe faces similar trends, less acutely. It has roughly half as many dependent children and retired people as working-age adults now. By 2050 it will have three dependents for every four adults, so will shoulder a large burden of ageing, which even sustained increases in fertility would fail to reverse for decades. This has disturbing policy implications for the provision of pensions and health care, which rely on continuing healthy tax revenues from the working population.

At least these countries are rich enough to make such provision. Not so China. With its fertility artificially suppressed by the one-child policy, it is ageing at an unprecedented rate. In 1980 China’s median age (the point where half the population is older and half younger) was 22 years, a developing-country figure. China will be older than America as early as 2020 and older than Europe by 2030. This will bring an abrupt end to its cheap-labour manufacturing. Its dependency ratio will rise from 38 to 64 by 2050, the sharpest rise in the world. Add in the country’s sexual imbalances—after a decade of sex-selective abortions, China will have 96.5m men in their 20s in 2025 but only 80.3m young women—and demography may become the gravest problem the Communist Party has to face.

Many countries with intermediate fertility—South-East Asia, Latin America, the United States—are better off. Their dependency ratios are not deteriorating so fast and their societies are ageing more slowly. America’s demographic profile is slowly tugging it away from Europe. Though its fertility rate may have fallen recently, it is still slightly higher than Europe’s. In 2010 the two sides of the Atlantic had similar dependency rates. By 2050 America’s could be nearly ten points lower.

But the biggest potential beneficiaries are the two other areas with intermediate fertility—India and the Middle East—and the high-fertility continent of Africa. These places have long been regarded as demographic time-bombs, with youth bulges, poverty and low levels of education and health. But that is because they are moving only slowly out of the early stage of high fertility into the one in which lower fertility begins to make an impact.

At the moment, Africa has larger families and more dependent children than India or Arab countries and is a few years younger (its median age is 20 compared with their 25). But all three areas will see their dependency ratios fall in the next 40 years, the only parts of the world to do so. And they will keep their median ages low—below 38 in 2050. If they can make their public institutions less corrupt, keep their economic policies outward-looking and invest more in education, as East Asia did, then Africa, the Middle East and India could become the fastest-growing parts of the world economy within a decade or so.

Here’s looking at you

Demography, though, is not only about economics. Most emerging countries have benefited from the sort of dividend that changed Europe and America in the 1960s. They are catching up with the West in terms of income, family size and middle-class formation. Most say they want to keep their cultures unsullied by the social trends—divorce, illegitimacy and so on—that also affected the West. But the growing number of never-married women in urban Asia suggests that this will be hard.

If you look at the overall size of the world’s population, then, the picture is one of falling fertility, decelerating growth and a gradual return to the flat population level of the 18th century. But below the surface societies are being churned up in ways not seen in the much more static pre-industrial world. The earth’s population may never need a larger island than Maui to stand on. But the way it arranges itself will go on shifting for centuries to come.

Occupy Wall Street turns to pedal power (The Raw Story)

By Muriel Kane
Sunday, October 30, 2011

The Occupy Wall Street protesters who were left without power after their gas-fueled generators were confiscated by New York City authorities on Friday may have found the ideal solution in the form of a stationary bicycle hooked up to charge batteries.

Stephan Keegan of the non-profit environmental group Time’s Up showed off one of the bikes to The Daily News, explaining that OWS’s General Assembly has already authorized payment for additional bikes and that “soon we’ll have ten of these set up and we’ll be powering the whole park with batteries.”

Protester Lauren Minis told CBS New York, “We’ve got five bike-powered generator systems that are coming from Boston and we’ve got five more plus other ones that are going to supplement as well so we’re completely, completely off the grid.”

According to CBS, “Insiders at Occupy Wall Street say they expect to have their media center and the food service area fully powered and illuminated by Monday.”

“We need some exercise,” Keegan explained enthusiastically, “and we’ve got a lot of volunteers, so we should be able to power these, no problem. … We did an energy survey of the whole park, found out how much energy we were using. … Ten will give us twice as much power.”
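
Keegan's claim that ten bikes will give twice the power of five is a simple linear budget. The article reports no wattage figures, so every number in this back-of-envelope sketch (rider output, hours pedaled, charging efficiency) is an assumption of mine:

```python
# Back-of-envelope power budget for pedal generators charging batteries.
# All figures below are assumptions for illustration; the article gives none.

WATTS_PER_RIDER = 75        # rough sustained output for a casual rider
HOURS_PEDALED_PER_DAY = 8   # volunteers rotating through the day
CHARGE_EFFICIENCY = 0.7     # assumed generator plus battery losses

def daily_watt_hours(bikes: int) -> float:
    """Usable energy banked per day by `bikes` stationary bicycles."""
    return bikes * WATTS_PER_RIDER * HOURS_PEDALED_PER_DAY * CHARGE_EFFICIENCY

print(daily_watt_hours(5))   # 2100.0 Wh with five bikes
print(daily_watt_hours(10))  # 4200.0 Wh: ten bikes, twice the power
```

Whatever the true numbers from the protesters' energy survey, the doubling follows directly from the linearity of the budget.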

Keegan also boasted that the system is “very clean” and is environmentally superior not only to fossil fuel but even to solar panels, because it uses almost entirely recycled materials.

[Click the image to watch the video, or click here]

Teenage sweethearts use violence as a form of communication (Fapesp)

Pesquisa FAPESP
Issue 188 – October 2011
Humanities > Psychology

Times of cholera in love

The refrain of Belchior's song renews itself with each generation like a curse without an antidote: "My pain is to realize / That despite having done everything we did / We are still the same and live like our parents." That is what is revealed by the study Violência entre namorados adolescentes (Violence among teenage sweethearts, now published as the book Amor e violência by Editora Fiocruz), conducted between 2007 and 2010 at the request of the Jorge Careli Latin American Center for Studies on Violence and Health (Claves/Fiocruz) and coordinated by Kathie Njaine, a professor in the Department of Public Health at the Federal University of Santa Catarina (UFSC). The project brought together a group of 11 researchers from several universities to investigate violence in the affective-sexual relationships of "hooking up" ("ficar") or dating among young people aged 15 to 19, drawn from a universe of 3,200 students at public and private schools in 10 Brazilian state capitals. "Today's young people, at the same time as they create new forms and means of relating to each other, in which 'hooking up' and the use of the internet for romantic and sexual interaction are the novelty, repeat and reproduce traditional and conservative relationship models, such as machismo and the feeling of ownership, expressed in their speech and in how they treat their partners," says the researcher. Perhaps even more intensely than our parents did.

Practically nine out of every ten young people who date inflict or suffer various forms of violence; to mark their territory, young couples resort to violence to control their partners, and aggression has become synonymous with dominance in these adolescents' romantic relationships. "I believe violence is becoming a form of communication among many young people, who alternate between the roles of victim and perpetrator according to the moment and the environment they live in. These acts are becoming so trivialized that they are naturally incorporated into everyday life, without any reflection on what this may mean for their affective-sexual lives," observes Kathie. "Adolescents adopt violence in varying degrees at ever younger ages and begin to find it quite natural. They believe that to control the relationship and the partner it is necessary to use violence." Belchior remains prophetic in asserting "that the new always comes," though not always in a positive register. According to the study, girls are at once the main aggressors and the main victims of verbal violence, and in the category of physical aggression (slaps, hair-pulling, shoving, punches and kicks) the numbers show that the boys are more often victims than the girls: 28.5% of the girls reported physically assaulting their partner, while 16.8% of the boys admitted the same. In terms of sexual violence, the expected happens, but there are surprises: 49% of the boys report committing this type of aggression, while 32.8% of the girls admit to the behavior. Curiously, in the opinion of 22% of young people of both sexes, violence is the main problem in today's world, well ahead of hunger, poverty and destitution. Who said consistency is young people's strong suit?

This is equally reflected in practices that, at home, young people abhor in their parents, such as constant surveillance of habits and clothing. To dominate a partner, the adolescent seeks to control the other's behavior: the clothes they wear, the names in their phone's contact list, their access to social networks, the people they talk to. "As if that were not enough, a new element emerges: the threat of defaming the other by circulating intimate photos by cell phone or over the internet was a strategy cited by young people as a way of trying to prevent a breakup, especially by boys," says sociologist and Fiocruz researcher Maria Cecília de Souza Minayo, who organized the study alongside Kathie. Violence in the form of threats (instilling fear, threatening to hurt someone or destroy something of value) victimizes 24.2% of young people, a dirty game played by 29.2% of those interviewed. According to the data, 33.3% of the girls admit to threatening their partners, compared with 22.6% of the boys. "The numbers are close. Everything suggests a cycle of victimization and perpetration. Ongoing exposure to aggressive situations translates into an encouragement of conflict-ridden relationships and into learning to use violence to obtain power and intimidate others. This learned and accepted behavior interferes with the place the young person will occupy in their social network and with their performance in affective and sexual relationships," observes physician Simone Gonçalves de Assis, a researcher at Claves/Fiocruz and another of the project's organizers.

Affective – "What is complex is that there is an identity that cuts across regions and social classes when we observe the behavior of young people in these 10 capitals. There are also similarities between students in the public and private school systems. In young people's affective relationships, the similarities draw more attention than the occasional divergent aspects," notes Kathie. One aspect shared by all is the new format of contemporary romantic relationships. "They are more provisional, temporary. Since the 1980s the expression 'ficar' (to hook up) has been widely used among young people to describe a phase of attraction without major commitments, which may involve anything from kissing to sexual relations," observes Maria Cecília. In "hooking up," the researchers note, love is not a prerequisite; it implies a romantic apprenticeship, a kind of test for possible dating, a relationship seen as more "serious" and, above all, more public, symbolizing the young person's entry onto the adult stage through visits to the partner's parents, joint planning of time together and a sense of greater solidity in the relationship. "It is all very nebulous, however, and many young people say that after 'hooking up' they do not know whether they are dating or not," says the author. In both states there is jealousy and the desire to control the other. "Because of the imminence of being accused of jealousy, distrust and betrayal in dating relationships, many boys and girls justify their preference for 'hooking up,' a relationship in which there are supposedly no ties and less risk of falling in love and being disappointed," notes Kathie. Or, in the words of one interviewee: "I myself don't trust anyone. I can think: I won't cheat on her, but nobody knows what's going on with her."

"They are always antagonistic reactions: commitment versus non-commitment; long duration versus short duration; sexual intimacy versus sexual superficiality; emotional involvement versus non-involvement; exclusivity versus betrayal," the researcher assesses. "Nevertheless, if machismo persists as a long-lasting (anti-)value, there are changes driven by women, who position themselves as partners capable of questioning and proposing new forms of relationship. Many adopt so-called masculine behaviors, such as physical and verbal aggression," observes Maria Cecília. Including when it comes to sex. "Boys use romantic strategies to get their partners to have sex, arguing that it would be a 'proof of love.' Many girls reproduce values of subjugation, but a not insignificant number of them take the initiative and test the boys' sexuality, humiliating those who do not want to sleep with them," she adds. "Hooking up" has also brought novelties for homosexual and bisexual youths: 3% and 1% of the boys, respectively, reported the behavior. "For young people who engage in these relationships, 'hooking up' serves as experimentation and confirmation of their sexual orientation. Because they are less public, 'hooking-up' relationships raise less suspicion and minimize rejection, harassment and violence until the young person is sure of their sexual orientation," notes Simone. But despite the renewed discourse of young people who say they "love gay friends," reality maintains the prejudice of the old days, and it remains a source of bullying among classmates.

Another ally of "hooking up" is the internet, seen as a freer space with greater scope for communication when arranging encounters, expanding the possibility of experimenting with relationships and offering a way of getting to know a partner better, growing closer and forging friendships. But not even this modern tool can put an end to the natural fuel of fights: jealousy, considered by young people to be natural between people who love each other. This includes the celebrated "shouting": some adolescent girls use this strategy to avoid subjugation, adopting an aggressive posture before the boys do. The boys, in turn, contrary to what the girls think, believe that shouting does not solve relationship problems. Here there is a worrying statistic. "We observed that a young person who is the victim of a partner's verbal violence is 2.6 times more likely to have suffered this type of aggression from their parents, compared with those who suffered no form of violence," says Kathie. "The adolescents named the family as their main reference for affective-sexual matters. The data reveal, however, that adolescents rarely seek help in situations of relationship violence, and only 3.5% of them said they had sought professional support because of an assault by a partner." For Kathie, school professionals and friends need to be informed so they can help in the process.

Aggression – "A large proportion of boys and girls consider verbal and physical aggression normal in resolving their romantic conflicts. Breaking with these practices means questioning certain models of existence instituted in the social field. It is important to question the mechanical association of supposedly universal characteristics with 'being a man' and 'being a woman,' as well as to criticize the disqualification of one gender in favor of valuing the other," warns the researcher. Patterns of affective-sexual violence tend to reproduce themselves because they are both structural and structuring. "Very little is done about this violence among young people and adolescents. They tend to stay in their own worlds; schools generally do not get involved in the matter because they judge it outside their purview. Parents either have no time or do not truly follow their children's lives, and the tendency is to reproduce family and group patterns," analyzes Maria Cecília. According to her, there is an overvaluation of models of consumption, beauty, competitiveness and power, to the detriment of other models, driven in large part by the media, which provokes a crisis of values in society. "Young people reflect these values in many ways. But I tend to think that today's young people, in the midst of deep and accelerated changes, are no worse than those of our time, neither ideologically nor in terms of social commitment," the author believes. "On the contrary: as always, they are there to set a new direction for the world and to surprise us, as has been happening politically in several countries around the world." Running, fortunately, against the current of our parents.

Brazil is already researching the effects of climate change (Valor Econômico)

JC e-mail 4373, October 27, 2011.

Climate change research in Brazil is beginning to change course. Whereas a few years ago the focus was on efforts to reduce greenhouse gas emissions, it now targets adaptation to the phenomenon.

"We know that in the next five or ten years there is no prospect of an international agreement on large-scale reductions in greenhouse gas emissions, with cuts of 70% to 80%," says physicist Paulo Artaxo of USP, a longtime student of the Amazon. "That scenario is increasingly remote. It is therefore essential to study adaptation strategies."

In other words, research should turn to the effects of climate change on ecosystems, in urban environments and in social contexts. "It is not a question of money, but of the direction of the studies," says Artaxo, a member of the governing board of the Brazilian Panel on Climate Change, a scientific body linked to the ministries of Science and Technology and of the Environment. "The country needs to prepare itself more adequately for climate change."

"We need to study, for example, changes in the hydrological cycle," says Reynaldo Victoria, coordinator of the FAPESP Research Program on Global Climate Change. "To know where it will rain more and where it will rain less," he explains. This is one arm of Artaxo's research in the Amazon. "Because you don't want to build a hydroelectric plant where it will rain much less in the coming decades," the physicist illustrates.

FAPESP's climate change program already has US$30 million invested in projects in the area. It is one of the foundation's newest arms, but it is already gaining muscle. It has 21 projects under way, 14 new contracts, and two more in partnership with foreign institutions such as Britain's Natural Environment Research Council (Nerc) and France's Agence Nationale de la Recherche (ANR). Over ten years, investments are expected to exceed R$100 million.

Research is beginning to turn to little-studied fields. "We are going to analyze questions that are critical for Brazil," says Artaxo. He cites, for example, the carbon cycle in the Amazon, something far more complex than studying plant photosynthesis and respiration.

Victoria, who is also a professor at the Center for Nuclear Energy in Agriculture (Cena-USP), says the program intends to target new fields, such as understanding the role of the South Atlantic in the climate of southern Brazil and northern Argentina. Another example is obtaining historical records in the field of paleoclimate.

Health impacts will also receive more study. It is already known that climate change causes diseases that did not previously exist in a given place to begin occurring there. Dengue, for example, finds a propitious environment in warmer regions. Among the new studies of emerging diseases is one on a type of leishmaniasis, common in Bolivia and Peru, which did not exist in Brazil and now threatens to appear in Acre. Transmitted by a mosquito, the disease causes a skin infection and can be fatal.

The researchers spoke about their projects during FAPESP Week, an event marking the 50th anniversary of the Fundação de Amparo à Pesquisa do Estado de São Paulo, which ended yesterday in Washington.

Terra, que Tempo é Esse? (Earth, What Time Is This?) (PUC)

By Gabriela Caesar – From the Portal, 10/28/2011. Photos: Eduardo de Holanda.

Although "national sovereignty and the market create a scenario of conflict," the population is aware that its lifestyle needs to change, believes anthropologist Roberto da Matta. Journalist Sônia Bridi, for her part, argues that "there is no point in arguing over or blaming whoever started it"; what matters is changing the production model. Gathered at PUC-Rio on Monday (the 24th) for the debate "Terra, que tempo é esse?" (watch parts 1 and 2 below), moderated by Professor Paulo Ferracioli of the Department of Economics, they stressed the importance of development more closely aligned with environmental demands.

The state's Environment Secretary, Carlos Minc (PT-RJ), added that negotiations with large companies, such as Companhia Siderúrgica Nacional (CSN), should include tracking technologies that can not only reduce environmental damage but also protect workers' health. Still on the subject of "environmentally correct" technologies, Sônia Bridi said the state of Rio "errs in opting for buses instead of light rail."

Before the roughly one hundred students following the debate in the RDC auditorium, Roberto da Matta stressed that the shift to a healthier, more environmentally committed lifestyle is equally important for fighting another problem that, in his view, has been aggravated by globalization: morbid obesity, which has given rise to the neologism "globesity." To curb the advance of the disease, which has grown by a third in China, the anthropologist is categorical in proposing a less consumerist social standard.

A powerhouse of contrasts and one of the main lubricants of global consumption, China faces the challenge of reducing its environmental bills, a recurring target of criticism in international forums, as well as its health bills. For Sônia Bridi, the locomotive of the global economy is investing for the long term:

– By 2020, China will have 20,000 kilometers of high-speed rail. They are worried about this, because the quality of their health is deteriorating sharply.

The track of responsible development does not necessarily run through large investments. The director of the Interdisciplinary Environmental Center (Nima), Luiz Felipe Guanaes, recalled that initiatives such as selective waste collection, introduced on the PUC-Rio campus in June of this year, also bring citizens closer to greater environmental and social commitment. Another opportunity for the "community to engage in the cause," he noted, will be the gathering of researchers and specialists at the university in 2012, for Rio+20, in partnership with the UN.

Sônia also shared behind-the-scenes stories from the reporting series "Terra, que país é esse?", which documented the advance of global warming and gave the debate its name. In Peru, she and cameraman Paulo Zero saw the impact on daily life, even on rituals.

– On a certain day, close to the feast of Corpus Christi, brotherhoods from all over the country climb a particular mountain and harvest blocks of ice. They had to change the ritual, which dates from the time of the Incas and was absorbed into Christianity. They stopped taking ice.

Paulo Zero admits that journalistic production, tied to "tight" deadlines, makes the subject hard to cover. Another barrier, he says, can be logistics. For the report in Greenland, for example, he and Sônia sailed for six hours to reach the island. If the journey got in the way, luck was an ally.

– We reached the glacier and, within five minutes, a large block of ice calved. We stayed another three hours and not one more piece of ice fell. In other words, we were in the right place at the right time, the cameraman recounted.

Part 1 (click the image)

Part 2 (click the image)

Approaching the limit (Fapesp)

The Amazon is very close to a point of no return for its survival, says Thomas Lovejoy of George Mason University at the FAPESP Week international symposium (photo: JVInfante Photography/Wilson Center)

27/10/2011

Agência FAPESP – The Amazon is very close to a point of no return for its survival, owing to a combination of factors that includes global warming, deforestation and fires that undermine its hydrogeological system.

The warning came from Thomas Lovejoy, currently a professor at George Mason University in Virginia, USA, on Monday, the first day of the FAPESP Week international symposium in Washington.

Lovejoy, a biologist and one of the world's foremost specialists on the Amazon, began working in the Brazilian forest in 1965, "just three years after FAPESP was founded," he recalled.

Although much that is positive has happened over these 47 years ("when I first set foot in Belém, there was only one national forest and one demarcated indigenous area, and almost no Brazilian scientist was interested in studying the Amazon; today that situation has been completely reversed"), several worrying factors have also emerged over the period.

Lovejoy believes five years remain to reverse these trends in time to avoid more serious problems. The rise in the planet's average temperature already stands at around 0.8 degrees Celsius. He believes the acceptable limit is 2 degrees Celsius, and that it could be reached by 2016 if nothing is done to effectively curb it.

The goal set at the most recent climate meetings in Cancún and Copenhagen, of limiting the average rise in global mean temperature to 2 degrees Celsius, may be insufficient, in Lovejoy's opinion, because of this combination of factors.

Similarly, Lovejoy believes that deforestation of 20% of the Amazon's original extent is the maximum it can withstand, and the current figure already stands at 17% (in 1965 the rate was 3%).

The good news, the biologist says, is that there is a great deal of abandoned land in the Amazon with no prospect of economic use, which could somehow be reforested, providing a certain margin of safety.

In his talk, Lovejoy hailed several Brazilian scientists as exemplary in the excellence of their research, among them Eneas Salati, Carlos Nobre and Carlos Joly.

Attracting the public's attention is the great challenge for satisfied science journalists (Fapesp)

Pesquisa FAPESP
Issue 188 – October 2011
S&T Policy > Scientific culture
Elusive readers

Mariluce Moura

Two Brazilian studies on science communication, first presented at the 2011 World Conference of Science Journalists in Doha, Qatar, at the end of June, together paint a curiously disjointed picture of the field in Brazil: on one hand, science journalists report a high degree of satisfaction with their professional work; on the other, a high proportion of a representative sample of São Paulo's population (76%) reports never reading science news in newspapers, magazines or online. Now the most surprising part: among the respondents interviewed in the state of São Paulo for the second survey, 52.5% declared "great admiration" for journalists and 49.2% for scientists, despite few of them reading the news produced by the former about the work of the latter. These and other findings raise many questions for students of Brazil's scientific culture. One, just to start: is science journalists' professional satisfaction independent of whether their output reaches its targets, namely readers, viewers, listeners or, more generally, the public?

The World Conference, moved at the last minute from Cairo to Doha because of the political unrest in Egypt that began in January, brought together 726 journalists from 81 countries who, over four days, debated everything from the central concept of science journalism, through the many ways of practicing it and its difficulties, to the varied problems of organizing these professionals in Asia, Africa, Europe, North America and Latin America, in the most democratic countries and the most authoritarian ones. One idea that ran through all these debates was that doing science journalism is not translating scientific information for the public; it is rather finding effective means of narrating, in journalistic language, what within scientific output can be identified as news of interest to society. The next World Conference will be held in Finland in 2013.

Presented by one of FAPESP's representatives at the conference, the study that brought to light the worrying measure of disinterest in science news is called "Public perception of science and technology in the state of São Paulo" (see the PDF) and constitutes the 12th chapter of Indicadores de ciência, tecnologia e inovação em São Paulo – 2010, published by FAPESP last August. Prepared by the team of the Laboratory for Advanced Studies in Journalism at the State University of Campinas (Labjor-Unicamp) under the coordination of its director, the linguist Carlos Vogt, the survey was based empirically on a 44-question questionnaire administered in 2007 to 1,076 people in the city of São Paulo and another 749 in the interior and on the coast of the state. In all, 1,825 people were interviewed in 35 municipalities across the 15 administrative regions (RAs).

It is worth noting that this was the second direct survey of a population sample's perception of science carried out by Labjor, and both were part of an Ibero-American effort to construct indicators capable of reflecting the region's scientific culture. The first survey, conducted between 2002 and 2003, included samples from the cities of Campinas, Buenos Aires and Montevideo, as well as Salamanca and Valladolid in Spain, and its results were presented in Indicadores de C,T&I em São Paulo – 2004, also published by FAPESP. By 2007, the survey, with a more refined methodology and an expanded sample, covered seven countries: besides Brazil, Colombia, Argentina, Chile, Venezuela, Panama and Spain. The common core of the questionnaire consisted of 39 questions, and each region could add other questions of its own choosing.

The other Brazilian study presented in Doha is called "Science journalism in Latin America: getting to know the region's science journalists better" and is, strictly speaking, still under way. The preliminary results presented were based on responses, received by June 21, to a 44-question survey developed by the London School of Economics and Political Science (LSE). By that point, more than 250 journalists had answered the questionnaire, roughly 80 of them Brazilian, according to its coordinator, the journalist Luisa Massarani, director of the Ibero-American Network for Monitoring and Training in Science Journalism, the institution responsible for the study in partnership with the LSE. The survey also has the support of science journalism associations and other science communication institutions in Argentina, Bolivia, Brazil, Chile, Colombia, Costa Rica, Ecuador, Mexico, Panama and Venezuela.

At the heart of this study, as its title indicates, is a concern to find out how many there are, who they are and what view of science is held by the journalists systematically covering the field in Latin America. "We have no idea about this; we don't even know how many science journalists there are in Brazil, or whether they are representative within the profession," says Luisa Massarani, who is also director of the Museu da Vida at the Oswaldo Cruz Foundation (Fiocruz) and Latin America coordinator of the Science and Development Network (SciDev.Net). Until some time ago, she recalls, "the Brazilian Association of Science Journalism (ABJC), based on its membership records, put the number at around 500, but that actually included scientists and other professionals interested in science communication." Incidentally, next month the ABJC will begin re-registering its members, along with a call for new ones, which may contribute to this census of science journalists in Brazil.

Belief in science – With 46 graphs and 55 appended tables that can be cross-referenced according to each researcher's specific interest, the perception-of-science study funded by FAPESP and coordinated by Vogt permits countless conclusions and new hypotheses about how society absorbs science through the media, or how the various social and economic classes in the state of São Paulo react to exposure to science news. For the coordinator himself, one of the most striking findings was the inverse relationship the survey establishes between belief in science and information about science. "The axiom would be: the more information, the less belief in science," he says. Thus, consulting the graph of self-declared consumption of scientific information versus attitudes toward the risks and benefits of science (graph 12.11), one finds that 57% of respondents who declared high consumption believe science and technology can offer many risks and many benefits simultaneously, and 6.3% believe they can bring many risks and few benefits. Among those who declared zero consumption of scientific information, 42.9% see many risks and many benefits at the same time, and 25.5% see many risks and few benefits. "In other words, among the best informed, the proportion of those who see both risks and benefits in science at the same time is quite high," notes Vogt, president of FAPESP from 2002 to 2007 and now coordinator of the Virtual University of the State of São Paulo (Univesp), indicating that this would be a realistic view. Note that the degree of pessimism is much higher among those who declared zero consumption of scientific information: 8.1% of them said science brings no risk and no benefit, a figure that falls to 5.8% among those who declared low consumption, 2.3% in the medium-low band, 0.7% in the medium-high band and zero among high consumers of scientific information.

In the part of the study on general interest in S&T, it is striking that respondents rank the subject in the middle, in fifth place, after sport and ahead of cinema, art and culture, among 10 subjects usually covered by the media (graph 12.1). But while 30.5% declare themselves very interested in sport and 34.9% interested, for science and technology 16.3% are very interested and 47.1% interested; that is, the intensity of interest is lower. It is also worth observing how the different degrees of interest in S&T bring the city of São Paulo close to Madrid and set it immensely apart from Bogotá (graph 12.2). Respectively, 15.4% of respondents in São Paulo and 16.7% in Madrid declared themselves very interested in S&T; for the interested category, the figures were 49.6% and 52.7%; for not very interested, 25.5% and 24.8%; and for not at all interested, 9.4% and 5.9%, respectively. In Bogotá, by contrast, no fewer than 47.5% declared themselves very interested. Why, no one knows. The interested there total 33.2%, the not very interested 15.3% and the not at all interested 4%.
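
The city comparisons above can be collapsed into a single "engaged share" (very interested plus interested). A minimal sketch; the percentages are the article's own (graph 12.2), while the grouping into one score is my illustration, not part of the study:

```python
# Comparing the survey's interest levels across cities with one summary score.
# The percentages come from the article (graph 12.2); the 'engaged share'
# grouping is an illustrative choice of mine, not the study's metric.

survey = {  # city -> % of respondents in each interest category
    "São Paulo": {"very": 15.4, "interested": 49.6, "little": 25.5, "none": 9.4},
    "Madrid":    {"very": 16.7, "interested": 52.7, "little": 24.8, "none": 5.9},
    "Bogotá":    {"very": 47.5, "interested": 33.2, "little": 15.3, "none": 4.0},
}

def engaged_share(city: str) -> float:
    """Share declaring at least some interest ('very' + 'interested')."""
    c = survey[city]
    return c["very"] + c["interested"]

for city in survey:
    print(city, round(engaged_share(city), 1))
# São Paulo 65.0, Madrid 69.4, Bogotá 80.7
```

The summary makes the article's point visible at a glance: São Paulo and Madrid sit within a few points of each other, while Bogotá stands well apart.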

There is little difference in level of interest by age: younger and older people are spread evenly across the various degrees considered (graph 12.6a). With schooling, exactly the opposite holds. Among those very interested in science and technology, 21.9% hold undergraduate or postgraduate degrees, 53.9% completed secondary education, 21.5% primary education, 1.7% early-childhood education, and 1% had no schooling at all. In the not-at-all-interested category, by contrast, only 1.2% hold undergraduate or postgraduate degrees, while 26.3% have secondary education, 47.4% primary education, 8.8% early-childhood education, and 16.4% no schooling of any kind (graph 12.5).

Beyond all the inferences the tabulated and interpreted questionnaire results allow, Vogt stresses that although most of the population does not read science news, it is nonetheless exposed, more or less passively, to the scientific information in circulation. "Every time Jornal Nacional or Globo Repórter talks about, say, a functional food, practically the whole of society starts discussing it over the following days," he says. He believes that media studies, and surveys of how often science appears in the press, could provide benchmarks for research to complement what has been built so far on public perception of science.

Satisfied professionals – Luisa Massarani observes that while audience studies have advanced in many fields, especially for telenovelas in Brazil, science journalism still has no studies capable of showing what happens, in terms of perception, when a person sees and hears a science story on Jornal Nacional. "Do people understand it well? Does the information arouse distrust? We don't know." In any case, what it means to do science journalism, in terms of both production and reception, remains in her view a major open question.

So far, the study she coordinates has found that women form the majority among science journalists in Latin America, 61% against 39% men, and that this is a young person's specialty: almost 30% of the sample is aged 31 to 40 and 23% is between 21 and 30. Consistent with that, 39% of respondents have worked in science journalism for less than 5 years and 23% for between 6 and 10 years. And, strikingly, 62% are satisfied with their work in science journalism and a further 9% very satisfied. This may be related to the fact that 60% hold formal full-time jobs in the field.

On the other hand, although Latin America's science journalists have few official sources giving them feedback on their work, 40% of them are sure their role is to inform the public, 26% think their function is to translate complex material, 13% to educate, and 9% to mobilize the public. Assessing the results of that work, 50% believe the science journalism produced in Brazil is average, 21% rate it good, and only 2% rate it very good.

The best indication of how much science journalists like what they do lies in their answers to the question of whether they would recommend the career to others. No less than half answered yes, certainly, while another 40% answered probably yes. Even so, there is still ground to cover in defining the role journalists play among the actors who say what science is and does. "Who are these actors?" Vogt asks. "Scientists thought it was them. Governments believed it was them. Today we say it is society. But in what way?"

Weathering Fights – Science: What’s It Up To? (The Daily Show with Jon Stewart)

http://media.mtvnservices.com/mgid:cms:video:thedailyshow.com:400760

Science claims it’s working to cure disease, save the planet and solve the greatest human mysteries, but Aasif Mandvi finds out what it’s really up to. (05:47) – Comedy Central

Global Warming May Worsen Effects of El Niño, La Niña Events (Climate Central)

Published: October 12th, 2011

By Michael D. Lemonick

Does this mean Texas is toast?

As just about everyone knows, El Niño is a periodic unusual warming of the surface water in the eastern and central tropical Pacific Ocean. Actually, that’s pretty much a lie. Most people don’t know the definition of El Niño or its mirror image, La Niña, and truthfully, most people don’t much care.

What you do care about if you’re a Texan suffering through the worst one-year drought on record, or a New Yorker who had to dig out from massive snowstorms last winter (tied in part to La Niña), or a Californian who has ever had to deal with the torrential rains that trigger catastrophic mudslides (linked to El Niño), is that these natural climate cycles can elevate the odds of natural disasters where you live.

At the moment, we’re now entering the second year of the La Niña part of the cycle. La Niña is one key reason why the Southwest was so dry last winter and through the spring and summer, and since La Niña is projected to continue through the coming winter, Texas and nearby states aren’t likely to get much relief.

Precipitation outlook for winter 2011-12, showing the likelihood of below average precipitation in Texas and other drought-stricken states.

But Niñas and Niños (the broader cycle, for you weather/climate geeks, is known as the “El Niño-Southern Oscillation,” or “ENSO”) don’t just operate in isolation. They’re part of the broader climate system, which means that climate change could theoretically change how they operate — make them develop more frequently, for example, or less frequently, or be more or less pronounced. Climate change could also intensify the effects of El Niño and La Niña events.

Climate scientists have been wrestling with the first question for a while now, and they still don’t really have a definitive answer. Some climate models have suggested that global warming has already begun to cause subtle changes in ENSO cycles, and that the changes will become more pronounced later this century. But a new study, published in the Journal of Climate, doesn’t find much evidence for that.

But on the second question, the new study is a lot more definitive. “Due to a warmer and moister atmosphere,” said co-author Baylor Fox-Kemper, of the University of Colorado in a press release, “the impacts of El Niño are changing even though El Niño itself doesn’t change.”

That’s because global warming has begun to change the playing field on which El Niño and La Niña operate, just as it’s changing the background conditions that give rise to our everyday weather. The Texas drought is a prime example. Its most likely cause is reduced rainfall from La Niña-related weather patterns. But however dry Texas and Oklahoma might have been otherwise, the killer heat wave that plagued the region this past summer — the sort of heat wave global warming is already making more commonplace — baked much of the remaining moisture out of both the soil and vegetation. No wonder large parts of the Lone Star State have gone up in smoke.

A map of sea surface temperature anomalies, showing a swath of cooler than average waters in the central and eastern tropical Pacific Ocean – a telltale sign of La Niña conditions.

When the next El Niño occurs in a year or two, it will probably bring heavy rains to places like Southern California, whose unstable hillsides tend to slide when soggy. Except now, thanks to global warming, the typical El Niño-related storms that roll in off the Pacific may well be turbocharged, since a warmer atmosphere can hold more water. This is the reason, say many climate scientists, that downpours have become heavier in recent decades across broad geographical areas.
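The "warmer atmosphere can hold more water" point follows from the Clausius-Clapeyron relation: saturation vapor pressure rises by roughly 6–7% per degree Celsius of warming near typical surface temperatures. As a rough illustration (not from the article; this sketch uses the standard Magnus approximation, and the baseline temperature is an arbitrary choice):

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Saturation vapor pressure in hPa, via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# Relative increase in the atmosphere's moisture-holding capacity per
# 1 C of warming, evaluated around a typical surface temperature of 15 C.
e0 = saturation_vapor_pressure(15.0)
e1 = saturation_vapor_pressure(16.0)
increase = (e1 / e0 - 1.0) * 100
print(f"~{increase:.1f}% more water vapor per degree C")  # roughly 6-7%
```

That few-percent-per-degree scaling is why many climate scientists expect storms riding on El Niño moisture to deliver heavier downpours in a warmer world.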

La Niña, plus the added moisture in the air from global warming, has also been partially implicated in the massive snowstorms that struck the Northeast and Mid-Atlantic states during the last two winters. Those could get worse as well, suggests the new analysis. “What we see,” says Fox-Kemper, “is that certain atmospheric patterns, such as the blocking high pressure south of Alaska typical of La Niña winters, strengthen…so, the cooling of North America expected in a La Niña winter would be stronger in future climates.” So to pre-answer the question that will inevitably be asked next winter: no, more snow does NOT contradict the idea that the planet is warming. Quite the contrary.

Finally, for those who really do want to know what El Niño and La Niña actually are, as opposed to what they do, you can go to NOAA’s El Niño page. But be warned: there will be a quiz, and the word “thermocline” will appear.


Vital Details of Global Warming Are Eluding Forecasters (Science)

Science 14 October 2011:
Vol. 334 no. 6053 pp. 173-174
DOI: 10.1126/science.334.6053.173

PREDICTING CLIMATE CHANGE

Richard A. Kerr

Decision-makers need to know how to prepare for inevitable climate change, but climate researchers are still struggling to sharpen their fuzzy picture of what the future holds.

Seattle Public Utilities officials had a question for meteorologist Clifford Mass. They were planning to install a quarter-billion dollars’ worth of storm-drain pipes that would serve the city for up to 75 years. “Their question was, what diameter should the pipe be? How will the intensity of extreme precipitation change?” Mass says. If global warming means that the past century’s rain records are no guide to how heavy future rains will be, he was asked, what could climate modeling say about adapting to future climate change? “I told them I couldn’t give them an answer,” says the University of Washington (UW), Seattle, researcher.

Climate researchers are quite comfortable with their projections for the world under a strengthening greenhouse, at least on the broadest scales. Relying heavily on climate modeling, they find that on average the globe will continue warming, more at high northern latitudes than elsewhere. Precipitation will tend to increase at high latitudes and decrease at low latitudes.

But ask researchers what’s in store for the Seattle area, the Pacific Northwest, or even the western half of the United States, and they’ll often demur. As Mass notes, “there’s tremendous uncertainty here,” and he’s not just talking about the Pacific Northwest. Switching from global models to models focusing on a single region creates a more detailed forecast, but it also “piles uncertainty on top of uncertainty,” says meteorologist David Battisti of UW Seattle.

First of all, there are the uncertainties inherent in the regional model itself. Then there are the global model’s uncertainties at the regional scale, which it feeds into the regional model. As the saying goes, if the global model gives you garbage, regional modeling will only give you more detailed garbage. And still more uncertainties are created as data are transferred from the global to the regional model.

Although uncertainties abound, “uncertainty tends to be downplayed in a lot of [regional] modeling for adaptation,” says global modeler Christopher Bretherton of UW Seattle. But help is on the way. Regional modelers are well into their first extensive comparison of global-regional model combinations to sort out the uncertainties, although that won’t help Seattle’s storm-drain builders.

Most humble origins

Policymakers have long asked for regional forecasts to help them adapt to climate change, some of which is now unavoidable. Even immediate, rather drastic action to curb emissions of greenhouse gases would not likely limit warming globally to 2°C, generally considered the threshold above which “dangerous” effects set in. And nothing at all can be done to reduce the global warming effects expected in the next several decades. They are already locked into climate change.

Sharp but true? Feeding a global climate model’s prediction for midcentury (top) into a regional model gives more details (bottom), but modelers aren’t sure how accurate the details are. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

So scientists have been doing what they can for decision-makers. Early on, it wasn’t much. A U.S. government assessment released in 2000, Climate Change Impacts on the United States, relied on the most rudimentary regional forecasting technique (Science, 23 June 2000, p. 2113). Expert committee members divided the country into eight regions and then considered what two of their best global climate models had to say about each region over the next century. The two models were somewhat consistent in the far southwest, where the report’s authors found it was likely that warmer and drier conditions would eliminate alpine ecosystems and shorten the ski season.

But elsewhere, there was far less consistency. Over the eastern two-thirds of the contiguous 48 states, for example, the two models couldn’t agree on how much moisture soils would hold in the summer. Kansas corn would either suffer severe droughts more frequently, as one model had it, or enjoy even more moisture than it currently does, as the other indicated. But at least the uncertainties were plain for all to see.

The uncertainties of regional projections nearly faded from view in the next U.S. effort, Global Climate Change Impacts in the United States. The 2009 study drew on not two but 15 global models melded into single projections. In a technique called statistical downscaling, its authors assumed that local changes would be proportional to changes on the larger scales. And they adjusted regional projections of future climate according to how well model simulations of past climate matched actual climate.
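The statistical-downscaling assumption described above, that local change is proportional to the large-scale change and is anchored to the observed local baseline, can be illustrated with a simple "delta method" calculation. This is only a toy sketch of the general technique, not the assessment's actual procedure; every number and variable name here is made up:

```python
# Toy "delta method" downscaling sketch (all values hypothetical, in deg C).
model_past_large = 14.2    # global model's simulation of the historical period
model_future_large = 16.1  # global model's projection for midcentury
obs_past_local = 13.5      # observed local (e.g. station) historical climate

# 1. The projected large-scale change, the "delta":
delta = model_future_large - model_past_large

# 2. Assume the local change matches the large-scale change, and anchor
#    the projection to the observed local baseline. Starting from the
#    observations rather than the model's own (biased) historical value
#    is what corrects for how well the model matched past climate.
projected_local = obs_past_local + delta
print(projected_local)  # 15.4
```

The footnote-level uncertainty the article complains about lives precisely in step 2: the proportionality assumption can fail where local climate is shaped by terrain or coastlines the coarse model never sees.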

Statistical downscaling yielded a broad warming across the lower 48 states with less warming across the southeast and up the West Coast. Precipitation was mostly down, especially in the southwest. But discussion of uncertainties in the modeling fell largely to a footnote (number 110), in which the authors cite a half-dozen papers to support their assertion that statistical downscaling techniques are “well-documented” and thoroughly corroborated.

The other sort of downscaling, known as dynamical downscaling or regional modeling, has yet to be fully incorporated into a U.S. national assessment. But an example of state-of-the-art regional modeling appeared 30 June in Environmental Research Letters. To investigate what will happen in the U.S. wine industry, regional modeler Noah Diffenbaugh of Purdue University in West Lafayette, Indiana, and his colleagues embedded a detailed model that spanned the lower 48 states in a climate model that spanned the globe. The global model’s relatively fuzzy simulation of evolving climate from 1950 to 2039—calculated at points about 150 kilometers apart—then fed into the embedded regional model, which calculated a sharper picture of climate change at points only 25 kilometers apart.
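The nesting just described, a coarse global field driving a finer regional grid, depends on regridding: the global model's values at widely spaced points must be interpolated onto the regional model's much denser boundary points at each step. A bare-bones, hypothetical illustration of that regridding step (real regional models use more sophisticated interpolation and blend the fields through a relaxation zone):

```python
def interpolate_boundary(coarse_values, coarse_spacing_km, fine_spacing_km):
    """Linearly interpolate values given on a coarse grid (e.g. ~150 km
    apart) onto a finer set of points (e.g. ~25 km apart) along one
    boundary of a nested regional domain."""
    ratio = coarse_spacing_km // fine_spacing_km  # fine points per coarse cell
    fine = []
    for i in range(len(coarse_values) - 1):
        a, b = coarse_values[i], coarse_values[i + 1]
        for j in range(ratio):
            fine.append(a + (b - a) * j / ratio)
    fine.append(coarse_values[-1])  # include the far endpoint
    return fine

# Coarse global-model temperatures along one edge of the regional domain:
coarse = [15.0, 18.0, 12.0]           # three points, 150 km apart
fine = interpolate_boundary(coarse, 150, 25)
print(len(fine))  # 13 points at 25-km spacing spanning the same 300 km
```

Note what the interpolation cannot do: it manufactures no new information between the coarse points, which is exactly the critics' complaint that detail added this way is precision, not necessarily accuracy.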

Closely analyzing the regional model’s temperature projections on the West Coast, the group found that the projected warming would decrease the area suitable for production of premium wine grapes by 30% to 50% in parts of central and northern California. The loss in Washington state’s Columbia Valley would be more than 30%. But adaptation to the warming, such as the introduction of heat-tolerant varieties of grapes, could sharply reduce the losses in California and turn the Washington loss into a 150% gain.

Not so fast

A rapidly growing community of regional modelers is turning out increasingly detailed projections of future climate, but many researchers, mostly outside the downscaling community, have serious reservations. “Many regional modelers don’t do an adequate job of quantifying issues of uncertainty,” says Bretherton, who is chairing a National Academy of Sciences study committee on a national strategy for advancing climate modeling. “We’re not confident predicting the very things people are most interested in being predicted,” such as changes in precipitation.

Regional models produce strikingly detailed maps of changed climate, but they might be far off base. “The problem is that precision is often mistaken for accuracy,” Bretherton says. Battisti just doesn’t see the point of downscaling. “I would never use one of these products,” he says.

The problems start with the global models, as critics see it. Regional models must fill in the detail in the fuzzy picture of climate provided by global models, notes atmospheric scientist Edward Sarachik, professor emeritus at UW Seattle. But if the fuzzy picture of the region is wrong, the details will be wrong as well. And global models aren’t very good at painting regional pictures, he says. A glaring example, according to Sarachik, is the way global models place the cooler waters of the tropical Pacific farther west than they are in reality. Such ocean temperature differences drive weather and climate shifts in specific regions halfway around the world, but with the cold water in the wrong place, the global models drive climate change in the wrong regions.

Gregory Tripoli’s complaint about the global models is that they can’t create the medium-size weather systems that they should be sending into any embedded regional model. Tripoli, a meteorologist and modeler at the University of Wisconsin, Madison, cites the case of summertime weather disturbances that churn down off the Rocky Mountains and account for 80% of the Midwest’s summer rainfall. If a regional model forecasting for Wisconsin doesn’t extend to the Rockies, Wisconsin won’t get the major weather events that add up to be climate. And some atmospheric disturbances travel from as far away as Thailand to wreak havoc in the Midwest, he says, so they could never be included in the regional model.

A tougher nut. Predicting the details of precipitation using a regional model (bottom) fed by a global model (top) is even more uncertain than projecting regional temperature change. CREDIT: NORTH AMERICAN REGIONAL CLIMATE CHANGE ASSESSMENT PROGRAM

Even the things the global models get right have a hard time getting into regional models, critics say. “There are a lot of problems matching regional and global models,” Tripoli says. In one problem area, global and regional models usually have different ways of accounting for atmospheric processes such as individual cloud development that neither model can simulate directly, creating further clashes. Even the different philosophies involved in building global models and regional models can lead to mismatches that create phantom atmospheric circulations, Tripoli says. “It’s not straightforward you’re going to get anything realistic,” he says.

Redeeming regional modeling

“You could say all the global and regional models are wrong; some people do say that,” notes regional modeler Filippo Giorgi of the Abdus Salam International Centre for Theoretical Physics in Trieste, Italy. “My personal opinion is we do know something now. A few reports ago, it was really very, very difficult to say anything about regional climate change.”

But Giorgi says that in recent years he has been seeing increasingly consistent regional projections coming from combinations of many different models and from successive generations of models. “This means the projections are more and more reliable,” he says. “I would be confident saying the Mediterranean area will see a general decrease in precipitation in the next decades. I’ve seen this in several generations of models, and we understand the processes underlying this phenomenon. This is fairly reliable information, qualitatively. Saying whether the decrease will be 10% or 50% is a different issue.”

The skill of regional climate forecasting also varies from region to region and with what is being forecast. “Temperature is much, much easier” than precipitation, Giorgi notes. Precipitation depends on processes like atmospheric convection that operate on scales too small for any model to render in detail. Trouble simulating convection also means that higher-latitude climate is easier to project than that of the tropics, where convection dominates.

Regional modeling does have a clear advantage in areas with complex terrain such as mountainous regions, notes UW’s Mass, who does regional forecasting of both weather and climate. In the Pacific Northwest, the mountains running parallel to the coast direct onshore winds upward, predictably wringing rain and snow from the air without much difficult-to-simulate convection.

The downscaling of climate projections should be getting a boost as the Coordinated Regional Climate Downscaling Experiment (CORDEX) gets up to speed. Begun in 2009, CORDEX “is really the first time we’ll get a handle on all these uncertainties,” Giorgi says. Various groups will take on each of the world’s continent-size regions. Multiple global models will be matched with multiple regional models and run multiple times to tease out the uncertainties in each. “It’s a landmark for the regional climate modeling community,” Giorgi says.

 

Science 23 June 2000:
Vol. 288 no. 5474 p. 2113
DOI: 10.1126/science.288.5474.2113

GREENHOUSE WARMING

Dueling Models: Future U.S. Climate Uncertain

Richard A. Kerr

When Congress started funding a global climate change research program in 1990, it wanted to know what all this talk about greenhouse warming would mean for United States voters. Ten years later, a U.S. national assessment, drawing on the best available climate model predictions, concludes that the United States will indeed warm, affecting everything from the western snowpacks that supply California with water to New England’s fall foliage. But on a more detailed level, the assessment often draws a blank. Whether the cornfields of Kansas will be gripped by frequent, severe droughts, as one climate model has it, or blessed with more moisture than they now enjoy, as another predicts, the report can’t say. As much as policy-makers would like to know exactly what’s in store for Americans, the rudimentary state of regional climate science will not soon allow it, and the results of this 3-year effort brought the point home.

“This is the first time we’ve tried to take the physical [climate] system and see what effect it might have on ecosystems and socioeconomic systems,” says Thomas Karl, director of the National Oceanic and Atmospheric Administration’s (NOAA’s) National Climatic Data Center in Asheville, North Carolina, and a co-chair of the committee of experts that pulled together the assessment report “Climate Change Impacts on the United States” (available at http://www.nacc.usgcrp.gov/). “We don’t say we know there’s going to be catastrophic drought in Kansas,” he says. “What we do say is, ‘Here’s the range of our uncertainties.’ This document should get people to think.” If anything is certain, Karl says, it’s that “the past isn’t going to be a very good guide to future climate.”

By chance, the assessment had a handy way to convey the range of uncertainty that regional modeling serves up. The report, which divides the country into eight regions, is based on a pair of state-of-the-art climate models—one from the Canadian Climate Center and one from the U.K. Hadley Center for Climate Research and Prediction—that couple a simulated atmosphere and ocean. The two models solved the problems of simplifying a complex world in different ways, leading to very different predicted U.S. climates. “In terms of temperature, the Canadian model is at the upper end of the warming by 2100” predicted by a range of models, says modeler Eric Barron of Pennsylvania State University, University Park, and a member of the assessment team. “The Hadley model is toward the lower end. The Canadian model is on the dry side, and the Hadley model is on the wet side. We’re capturing a substantial portion of the range of simulations. We tried hard to convey that uncertainty.”

On a broad scale, the report can conclude: “Overall productivity of American agriculture will likely remain high, and is projected to increase throughout the 21st century,” although there will be winners and losers from place to place, and adapting agricultural practice to climate change will be key. Where the models are somewhat consistent, as in the far southwest, the report ventures what could be construed as predictions: “It is likely that some ecosystems, such as alpine ecosystems, will disappear entirely from the region,” or “Higher temperatures are likely to mean … a shorter season for winter activities, such as skiing.” Where the models clash, as on summer soil moisture over the eastern two-thirds of the lower 48 states, it explains the alternatives and suggests ways to adapt, such as switching crops.

The range of possible climate impacts laid out by the models “fairly reflects where we are in the science,” says Karl. But he notes that the effort did lack one important input: Congress mandated the assessment without funding it. “You get what you pay for,” says climatologist Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado. “A lot of it was done hastily.” Karl concedes that everyone involved would have liked to have had more funding delivered more reliably.

Even given more time and money, however, the assessment may not have come up with much better small-scale predictions, given the inherent limitations of the science. Even the best models today can say little that’s reliable about climate change at the regional level, never mind at the scale of a congressional district. Their picture of future climate is fuzzy—they might lump together San Francisco and Los Angeles because the models have such coarse geographic resolution—and the realism of such meteorological phenomena as clouds and precipitation is compromised by the inevitable simplifications of simulating the world in a computer.

“For the most part, these sorts of models give a warming,” says modeler Filippo Giorgi, “but they tend to give very different predictions, especially at the regional level, and there’s no way to say one should be believed over another.” Giorgi and his colleague Raquel Francisco of the Abdus Salam International Center for Theoretical Physics in Trieste, Italy, recently evaluated the uncertainties in five coupled climate models—including the two used in the national assessment—within 23 regions, the continental United States comprising roughly three regions. Giorgi concludes that as the scale of prediction shrinks, reliability drops until for small regions “the model data are not believable at all.”

Add in uncertainties external to the models, such as population and economic growth rates, says modeler Jerry D. Mahlman, director of NOAA’s Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, and the details of future climate recede toward unintelligibility. Some people in Congress and the policy community had “almost silly expectations there would be enormously useful, small-scale specifics, if you just got the right model. But the right model doesn’t exist,” says Mahlman.

Still, even though the national assessment does not offer the list of region-by-region impacts that Congress might have hoped for, it does show “where we are adaptable and where we are vulnerable,” says global change researcher Stephen Schneider of Stanford University. In 10 years, modelers say, they’ll do better.

The Post-Normal Seduction of Climate Science (Forbes)

By William Pentland, 10/14/2011 @ 12:22AM

In early 2002, during a televised press conference, former U.S. Defense Secretary Donald Rumsfeld explained why the lack of evidence linking Saddam Hussein with terrorist groups did not mean there was no connection.

“[T]here are known ‘knowns’ – there are things we know we know,” said Rumsfeld. “We also know there are known ‘unknowns’ – that is to say we know there are some things we do not know. But there are also unknown ‘unknowns’ – the ones we don’t know we don’t know . . . it is the latter category that tend to be the difficult ones.”

Rumsfeld turned out to be wrong about Hussein, but what if he had been talking about global warming?  Well, he probably would have been on to something there.  Unknowns of any ilk are a real pickle in climate science.

Indeed, uncertainty in climate science has induced a state of severe political paralysis. The trouble is that nobody really knows why. A rash of recent surveys and studies has exonerated most of the usual suspects – scientific illiteracy, industry distortions, skewed media coverage.

Now, the climate-science community is scrambling to crack the code on the “uncertainty” conundrum. Exhibit A: the October 2011 issue of the journal Climatic Change, the closest thing in climate science to gospel truth, which is devoted entirely to the subject of uncertainty.

While I have yet to digest all of the dozen or so essays, I suspect they are only the opening salvo in what will soon become a robust debate about the significance of uncertainty in climate-change science. The first item up on the chopping block is called post-normal science (PNS).

PNS is a model of the scientific process pioneered by Jerome Ravetz and Silvio Funtowicz, which describes the peculiar challenges science encounters where “facts are uncertain, values in dispute, stakes high and decisions urgent.” Unlike “normal” science in the sense described by the philosopher of science Thomas Kuhn, post-normal science commonly crosses disciplinary lines and involves new methods, instruments and experimental systems.

Judith Curry, a professor at Georgia Tech, weighs the wisdom of taking the plunge on PNS in an excellent piece called “Reasoning about climate uncertainty.” Drawing on the work of Dutch wunderkind, Jeroen van der Sluijs, Curry calls on the Intergovernmental Panel on Climate Change to stop marginalizing uncertainty and get real about bias in the consensus building process. Curry writes:

The consensus approach being used by the IPCC has failed to produce a thorough portrayal of the complexities of the problem and the associated uncertainties in our understanding . . . Better characterization of uncertainty and ignorance and a more realistic portrayal of confidence levels could go a long way towards reducing the “noise” and animosity portrayed in the media that fuels the public distrust of climate science and acts to stymie the policy process.

PNS is especially seductive in the context of uncertainty. Not surprisingly, Curry suggests that instituting PNS-like strategies at the IPCC “could go a long way towards reducing the ‘noise’ and animosity” surrounding climate-change science.

While I personally find PNS persuasive, the model provokes something closer to revulsion in many people. Last year, members of the U.S. House of Representatives who filed a petition challenging the U.S. Environmental Protection Agency‘s greenhouse gas endangerment finding seemed less sanguine about post-normal science:

. . . the conclusions of organizing bodies, especially the IPCC, cannot be said to reflect scientific “consensus” in any meaningful sense of that word. Instead, they reflect a political movement that has commandeered science to the service of its agenda. This is “post-normal science”: the long-dreaded arrival of deconstructionism to the natural sciences, according to which scientific quality is determined not by its fidelity to truth, but by its fidelity to the political agenda.

It seems unlikely that taking the PNS plunge would appreciably improve the U.S. public’s perception of the credibility, legitimacy and salience of climate-change assessments. This probably says more about Americans than it does about the analytic force of the PNS model.

Let’s face it. Americans do not agree on a whole hell of a lot. And they never have. Many U.S. institutions were deliberately designed to tolerate the coexistence of free states and slave-owning states. Ironically, Americans appear to agree more on climate-change science than other high-profile scientific controversies like the safety of genetically-modified organisms.

National Science Foundation

While it pains me to admit this, I am increasingly convinced that the IPCC’s role in assessing the science of climate change needs to be scaled back. The IPCC was an overly optimistic experiment in international governance designed for a world that never materialized. The U.N. General Assembly established the IPCC in the months immediately preceding the fall of the Berlin Wall. Only a few years later, the IPCC’s first assessment report and the creation of the U.N. Framework Convention on Climate Change coincided with the collapse of the Soviet Union and the end of the Cold War.

A new world order seemed to be dawning in those days, which is probably why it seemed like a good idea to ask scientists to tell us what constitutes “dangerous climate change.”   Two decades and two world trade towers later, the world is a decidedly less hospitable place for institutions like the IPCC.

The proof is in the pudding – or, in this case, the atmosphere.

Climate Change Tumbles Down Europe’s Political Agenda as Economic Worries Take the Stage (N.Y. Times)

By JEREMY LOVELL of ClimateWire. Published: October 13, 2011

LONDON — Climate change has all but fallen off the political agenda across Europe as the resurging economic crisis empties national coffers and shakes economic confidence, and the public and the press turn their attention to more immediate issues of rising fuel bills and joblessness, analysts say.

Sputtering economies, a shift of attention to looming elections and the prospect of little or no movement in the December climate talks in Durban, South Africa, have combined to drain the political momentum from an issue that was once a major cause in Europe.

“It is way down the agenda and will not feature in elections,” said Edward Cameron, director of the World Resources Institute think tank’s international climate initiative, on the sidelines of a meeting on climate change at London’s Chatham House think tank. “At a time of joblessness and fiscal crises, it is very difficult to advance the climate change issue.”

That is as true for next year’s presidential elections in the United States as it will be in France, despite the fact that there has been a series of environmental disasters, from the Texas drought this year to Russia’s heat wave and consequent steep rise in wheat prices last year.

According to acclaimed NASA scientist James Hansen, who has been warning of impending climatic doom for decades, the lack of focus on these events is in no small part due to the fact that scientists are poor communicators while the climate change skeptics have mounted a smoothly run campaign to capitalize on any mistakes and admissions of uncertainty.

“There is a strong campaign by those people who want to continue the fossil fuel business as usual. Climate contrarians … have managed in the public’s eye to muddy the waters enough that there is uncertainty: why should we do anything yet,” he said on a visit to London’s Royal Society for a meeting on lessons to be learned from past climate change battles.

“They have been winning the argument in the last several years, even though the science has become clearer,” he added.

Nuclear power issue distracts Berlin

In Germany, where a generous feed-in tariff scheme has produced some 28 gigawatts of wind power capacity and more than 18 GW of solar photovoltaic capacity, Chancellor Angela Merkel’s coalition government was forced into an abrupt U-turn on a controversial move to extend the lives of the country’s fleet of nuclear power plants. There was a political revolt after the March 11 nuclear disaster at Fukushima in Japan.

The oldest seven of Germany’s nuclear plants were closed immediately after Fukushima and will now never reopen, while the remainder will close by 2022.

This has had the perverse effect, in a country proud of its renewable energy efforts, of increasing the use of coal-fired power plants and raising the likelihood that new coal- or gas-fired plants will be built. The price tag will include higher carbon emissions at exactly the time when Germany, along with the rest of the European Union, is pledged to cut them.

While political observers believe the climate change issue will come back to the fore at some point in Germany — a country where the Greens have played a pivotal political role — the nuclear power issue is so politically charged that it is off the agenda for now.

Even in the United Kingdom, which has a huge wind energy program and where the Conservative-Liberal Democrat coalition came to power 15 months ago pledging to be the “greenest government ever,” there are major signs of backsliding. A long-awaited energy bill has been shelved, and renewable energy support costs and carbon emission reduction targets are either under review or about to be.

At the Conservative Party’s annual conference earlier this month, climate change was consigned to a brief debate on the opening Sunday, when delegates were mostly just arriving and finding their way around or still traveling to get there.

Damned by faint praise in London

Prime Minister David Cameron did not mention the issue in his speech to the conference — a performance that usually sets the broad agenda for the following year — and Chancellor of the Exchequer George Osborne caused outrage among environmentalists but satisfied the party’s right wing by pledging that the United Kingdom would not go any faster than its E.U. neighbors on emission cuts.

This is despite the fact that the United Kingdom has a legal target to cut its carbon emissions by at least 80 percent below 1990 levels by 2050, with cuts of 35 percent by 2022 and 50 percent by 2025, whereas the European Union’s goal is 20 percent by 2020.

It was widely reported that the 2022 target was only agreed to after a major battle in the Cabinet between supporters of Conservative Osborne and those of Liberal Democrat Energy and Climate Change Minister Chris Huhne. It has since been announced that the carbon targets will be reviewed in 2014.

Even in London, where charismatic Conservative Mayor Boris Johnson came to power in 2008 in part on a green ticket, the issue has largely been parked and replaced by transport in the run-up to next year’s mayoral elections. The city’s aging transport system is feared likely to come under massive strain during the 2012 Olympic Games.

Then there is the strange case of a strategic plan on adapting London to climate change, the draft of which was launched with great fanfare and declarations of urgency in February 2010. It was on the brink of publication in September 2010, but after that, it appeared to have vanished without trace.

At the same time, most members of City Hall’s climate change team, set up under the previous Labour administration, have been moved to other jobs.

‘Too difficult — and not a vote winner’

“Political leaders get it, but the treasuries don’t. The men with the money don’t want to be first movers,” said Nick Mabey, co-founder of environmental think tank E3G. “But the political froth has gone. It has become too difficult — and not a vote winner.”

Compounding that problem, at least in the United Kingdom, has been a series of reports underscoring the likely high cost to households of green energy policies at a time when the prices of domestic electricity and gas are already rising sharply.

A recent opinion poll found that the climate change issue has been replaced by concerns over rising fuel bills and energy security.

But Mabey is not too concerned. While the subject may be off the immediate political agenda, behind the scenes, the more enlightened corporate leaders and investment fund managers have been making their own calculations. They are moving their money into the low-carbon economic transformation that in some cases is already profitable and in many eyes essential and inevitable.

The main danger, analysts say, is that if climate change as a driver of action is allowed to languish too long and become too invisible while energy becomes the main motivator, it will be far harder to bring the issue back.

For Mabey and WRI’s Cameron, while the deep and seemingly returning global economic crisis has proved a serious distraction internationally as well as domestically, all is not lost.

For a number of reasons, including the rise of a new and major climate player — China — and a series of new scientific reports on climate change due over the next two or three years, 2015 will be the next pivotal moment for the world to take collective action, they say.

“Climate change doesn’t keep people awake at night. Our task for the next few years is to move it back up the political agenda again,” said WRI’s Cameron.

Copyright 2011 E&E Publishing. All Rights Reserved.

Group Urges Research Into Aggressive Efforts to Fight Climate Change (N.Y. Times)

By CORNELIA DEAN, Published: October 4, 2011

With political action on curbing greenhouse gases stalled, a bipartisan panel of scientists, former government officials and national security experts is recommending that the government begin researching a radical fix: directly manipulating the Earth’s climate to lower the temperature.

Members said they hoped that such extreme engineering techniques, which include scattering particles in the air to mimic the cooling effect of volcanoes or stationing orbiting mirrors in space to reflect sunlight, would never be needed. But in its report, to be released on Tuesday, the panel said it is time to begin researching and testing such ideas in case “the climate system reaches a ‘tipping point’ and swift remedial action is required.”

The 18-member panel was convened by the Bipartisan Policy Center, a research organization based in Washington founded by four senators — Democrats and Republicans — to offer policy advice to the government. In interviews, some of the panel members said they hoped that the mere discussion of such drastic steps would jolt the public and policy makers into meaningful action in reducing greenhouse gas emissions, which they called the highest priority.

The idea of engineering the planet is “fundamentally shocking,” David Keith, an energy expert at Harvard and the University of Calgary and a member of the panel, said. “It should be shocking.”

In fact, it is an idea that many environmental groups have rejected as misguided and potentially dangerous.

Jane Long, an associate director of the Lawrence Livermore National Laboratory and the panel’s co-chairwoman, said that by spewing greenhouse gases into the atmosphere, human activity was already engaged in climate modification. “We are doing it accidentally, but the Earth doesn’t know that,” she said, adding, “Going forward in ignorance is not an option.”

The panel, the Task Force on Climate Remediation Research, suggests that the White House Office of Science and Technology Policy begin coordinating research and estimates that a valuable effort could begin with a few million dollars in financing over the next few years.

One reason that the United States should embrace such research, the report suggests, is the threat of unilateral action by another country. Members say research is already under way in Britain, Germany and possibly other countries, as well as in the private sector.

“A conversation about this is going to go on with us or without us,” said David Goldston, a panel member who directs government affairs at the Natural Resources Defense Council and is a former chief of staff of the House Committee on Science. “We have to understand what is at stake.”

In interviews, panelists said again and again that the continuing focus of policy makers and experts should be on reducing emissions of carbon dioxide and other greenhouse gases. But several acknowledged that significant action remained a political nonstarter. Last month, for example, the Obama administration told the federal Environmental Protection Agency to hold off on tightening ozone standards, citing complications related to the weak economy.

According to the United Nations Intergovernmental Panel on Climate Change, greenhouse gas emissions have contributed to raising global average surface temperatures by about 1.3 degrees Fahrenheit in the past 100 years.

It is impossible to predict how much impact the report will have. But given the panelists’ varied political and professional backgrounds, they seem likely to achieve one major goal: starting a broader conversation on the issue. Some climate experts have been working on it for years, but they have largely kept their discussions to themselves, saying they feared giving the impression that there might be quick fixes for climate change.

“Climate adaptation went through the same period of concern,” Mr. Goldston said, referring to the onetime reluctance of some researchers to discuss ways in which people, plants and animals might adjust to climate change. Now, he said, similar reluctance to discuss geoengineering is giving way, at least in part because “it’s possible we may have to do this no matter what.”

Although the techniques, which fall into two broad groups, are more widely known as geoengineering, the panel prefers “climate remediation.”

The first is carbon dioxide removal, in which the gas is absorbed by plants, trapped and stored underground or otherwise removed from the atmosphere. The methods are “generally uncontroversial and don’t introduce new global risks,” said Ken Caldeira, a climate expert at Stanford University and a panel member. “It’s mostly a question of how much do these things cost.”

Controversy arises more with the second group of techniques, solar radiation management, which involves increasing the amount of solar energy that bounces back into space before it can be absorbed by the Earth. They include seeding the atmosphere with reflective particles, launching giant mirrors above the earth or spewing ocean water into the air to form clouds.

These techniques are thought to pose a risk of upsetting earth’s natural rhythms. With them, Dr. Caldeira said, “the real question is what are the unknown unknowns: Are you creating more risk than you are alleviating?”

At the influential blog Climate Progress, Joe Romm, a fellow at the Center for American Progress, has made a similar point, likening geo-engineering to a dangerous course of chemotherapy and radiation to treat a condition curable through diet and exercise — or, in this case, emissions reduction.

The panel rejected any immediate application of climate remediation techniques, saying too little is known about them. In 2009, the Royal Society in Britain said much the same, assessing geoengineering technologies as “technically feasible” but adding that their potential costs, effectiveness and risks were unknown.

Similarly, in a 2010 review of federal research that might be relevant to climate remediation, the federal Government Accountability Office noted that “major uncertainties remain on the efficacy and potential consequences” of the approach. Its report also recommended that the White House Office of Science and Technology Policy “establish a clear strategy for geoengineering research.”

John P. Holdren, who heads that office, declined interview requests. He issued a statement reiterating the Obama administration’s focus on “taking steps to sensibly reduce pollution that is contributing to climate change.”

Yet in an interview with The Associated Press in 2009, Dr. Holdren said the possible risks and benefits of geoengineering should be studied very carefully because “we might get desperate enough to want to use it.”

In a draft plan made public on Friday, the U.S. Global Change Research Program, a coordinating effort administered by his office, outlined its own climate change research agenda, including studies of the impacts of rapid climate change.

The plan said that climate-related projections would be crucial to future studies of the “feasibility, effectiveness and unintended consequences of strategies for deliberate, large-scale manipulations of Earth’s environment,” including carbon dioxide removal and solar radiation management.

Many countries fault the United States for government inaction on climate change, especially given its longtime role as a chief contributor to the problem.

Frank Loy, a panelist and former chief climate negotiator for the United States, suggested that people around the world would see past those issues if the United States embraced geoengineering studies, provided that it was “very clear about what kind of research is undertaken and what the safeguards are.”

This article has been revised to reflect the following correction:

Correction: October 4, 2011

An earlier version of this article mistakenly referred to Frank Loy as the nation’s chief climate negotiator; he is a former chief climate negotiator. It also misstated the name of a federal agency that reported on the potential effectiveness of climate remediation. It is the Government Accountability Office, not the General Accountability Office.

NSF seeks cyber infrastructure to make sense of scientific data (Federal Computer Week)

By Camille Tuutti, Oct 04, 2011

The National Science Foundation has tapped a research team at the University of North Carolina-Chapel Hill to develop a national data infrastructure that would help future scientists and researchers manage the data deluge, share information and fuel innovation in the scientific community.

The UNC group will lead the DataNet Federation Consortium, which includes seven universities. The infrastructure that the consortium will try to create would support collaborative multidisciplinary research and will “democratize access to information among researchers and citizen scientists alike,” said Rob Pennington, program director in NSF’s Office of Cyberinfrastructure.

“It means researchers on the cutting edge have access to new, more extensive, multidisciplinary datasets that will enable breakthroughs and the creation of new fields of science and engineering,” he added.

The effort would be a “significant step in the right direction” in solving some of the key problems researchers run into, said Stan Ahalt, director at the Renaissance Computing Institute at UNC-Chapel Hill, which federates the consortium’s data repositories to enable cross-disciplinary research. One of the issues researchers today grapple with is how to best manage data in a way that maximizes its utility to the scientific community, he said. Storing massive quantities of data and the lack of well-designed methods that allow researchers to use unstructured and structured data simultaneously are additional obstacles for researchers, Ahalt added.

The national data infrastructure may not solve everything immediately, he said, “but it will give us a platform to start working meticulously on more long-term rugged solutions or robust solutions.”

DFC will use iRODS, the integrated Rule Oriented Data System, to implement a data management infrastructure. Multiple federal agencies are already using the technology: the NASA Center for Climate Simulation, for example, imported a Moderate Resolution Imaging Spectroradiometer satellite image dataset onto the environment so academic researchers would have access, said Reagan Moore, principal investigator for the Data Intensive Cyber Environments research group at UNC-Chapel Hill that leads the consortium.

It’s very typical for a scientific community to develop a set of practices around a particular methodology of collecting data, Ahalt explained. For example, hydrologists know where their sensors are and what the readings mean from a geographical perspective. Those hydrologists put their data in a certain format that may not be obvious to someone who is, for example, doing atmospheric studies, he said.

“The long-term goal of this effort is to improve the ability to do research,” Moore said. “If I’m a researcher in any given area, I’d like to be able to access data from other people working in the same area, collaborate with them, and then build a new collection that represents the new research results that are found. To do that, I need access to the old research results, to the observational data, to simulations or analyze what happens using computers, etc. These environments then greatly minimize the effort required to manage and distribute a collection and make it available to research.”
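
The kind of cross-disciplinary federation Moore describes can be illustrated with a toy sketch (this is not the actual iRODS or DFC software, just an illustrative model): each community keeps its own field names, a per-repository mapping translates them into a shared vocabulary, and a federated search then queries every repository with the same terms.

```python
# Toy model of a federated metadata catalog. Illustrative only:
# the real DFC infrastructure is built on iRODS, not this code.

class Repository:
    def __init__(self, name, field_map):
        self.name = name            # e.g. "hydrology"
        self.field_map = field_map  # local field name -> shared vocabulary term
        self.records = []

    def add(self, record):
        # Translate each community's local field names into the shared vocabulary.
        self.records.append({self.field_map.get(k, k): v for k, v in record.items()})

class Federation:
    def __init__(self, repos):
        self.repos = repos

    def search(self, **criteria):
        # Query every member repository using the same shared-vocabulary terms.
        hits = []
        for repo in self.repos:
            for rec in repo.records:
                if all(rec.get(k) == v for k, v in criteria.items()):
                    hits.append((repo.name, rec))
        return hits

# Two communities describe the same location with different field names.
hydro = Repository("hydrology", {"gauge_lat": "lat", "gauge_lon": "lon", "obs": "variable"})
atmos = Repository("atmosphere", {"station_lat": "lat", "station_lon": "lon", "param": "variable"})
hydro.add({"gauge_lat": 35.9, "gauge_lon": -79.0, "obs": "streamflow"})
atmos.add({"station_lat": 35.9, "station_lon": -79.0, "param": "temperature"})

fed = Federation([hydro, atmos])
print(fed.search(lat=35.9, lon=-79.0))  # records from both communities
```

The point of the sketch is the one Ahalt makes: the hydrologists' format need not change, only the mapping into the shared vocabulary, and a researcher from another field can then discover their collections.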

For science research as a whole, Ahalt said the infrastructure could mean a lot more than just managing the data deluge or sharing information within the different research communities.

“Data is the currency of the knowledge economy,” he said. “Right now, a lot of what we do collectively and globally from an economic standpoint is highly dependent on our ability to manipulate and analyze data. Data is also the currency of science; it’s our ability to have a national infrastructure that will allow us to share those scientific assets.”

The bottom line: “We’ll be more efficient at producing new science, new innovation and new innovation knowledge,” he said.

About the Author

Camille Tuutti is a staff writer covering the federal workforce.

Little Ice Age Shrank Europeans, Sparked Wars (NatGeo)

Study aims to scientifically link climate change to societal upheaval.

London’s River Thames, frozen over in 1677. Painting by Abraham Hondius via Heritage Images/Corbis

Brian Handwerk, for National Geographic News

Published October 3, 2011

Pockmarked with wars, inflation, famines and shrinking humans, the 1600s in Europe came to be called the General Crisis.

But whereas historians have blamed those tumultuous decades on growing pains between feudalism and capitalism, a new study points to another culprit: the coldest stretch of the climate change period known as the Little Ice Age.

(Also see “Sun Oddly Quiet—Hints at Next ‘Little Ice Age’?”)

The Little Ice Age curbed agricultural production and eventually led to the European crisis, according to the authors of the study—said to be the first to scientifically verify cause-and-effect between climate change and large-scale human crises.

Prior to the industrial revolution, all European countries were by and large agrarian, and as study co-author David Zhang pointed out, “In agricultural societies, the economy is controlled by climate,” since it dictates growing conditions.

A team led by Zhang, of the University of Hong Kong, pored over data from Europe and other Northern Hemisphere regions from A.D. 1500 to 1800.

The team compared climate data, such as temperatures, with other variables, including population sizes, growth rates, wars and other social disturbances, agricultural production figures and famines, grain prices, and wages.

The authors say some effects, such as food shortages and health problems, showed up almost immediately between 1560 and 1660—the Little Ice Age’s harshest period—during which growing seasons shortened and cultivated land shrank.

As arable land contracted, so too did Europeans themselves, the study notes. Average height followed the temperature line, dipping nearly an inch (two centimeters) during the late 1500s, as malnourishment spread, and rising again only as temperatures climbed after 1650, the authors found.

(Related: “British Have Changed Little Since Ice Age, Gene Study Says.”)

Other effects, such as famines, the Thirty Years’ War (1618-48) or the 1644 Manchu conquest of China, took decades to manifest. “Temperature is not a direct cause of war and social disturbance,” Zhang said. “The direct cause of war and social disturbance is the grain price. That is why we say climate change is the ultimate cause.”

The new study is both history lesson and warning, the researchers added.

As our climate changes due to global warming, Zhang said, “developing countries will suffer more, because large populations in these countries [directly] rely on agricultural production.”


Indigenous protesters occupy Belo Monte construction site and block the Trans-Amazonian highway (FSP)

27/10/2011 – 14h21

AGUIRRE TALENTO
in Belém

The construction site of the Belo Monte hydroelectric dam, in the municipality of Vitória do Xingu (western Pará, 945 km from Belém), was occupied on Thursday morning (27th) in a protest by indigenous people, fishermen and residents of the region.

The protesters also blocked the Trans-Amazonian highway at kilometre 52, where the entrance to the dam’s construction site is located.

The protest, which began at 5 a.m., was organized during a seminar held this week in Altamira (also in western Pará, 900 km from Belém) to discuss the impacts of building hydroelectric dams in the region.

Security guards allowed the demonstrators in without resistance, and the company’s employees did not show up for work. As a result, construction has come to a halt.

“We believe the company learned of our demonstration and did not want a confrontation,” said Eden Magalhães, executive secretary of Cimi (the Indigenist Missionary Council), one of the organizations taking part in the protest.

The Federal Highway Police confirmed the protest but could not yet estimate the number of people present. According to Magalhães, there are about 600 people at the site, including indigenous people, fishermen, riverside residents and even students.

The demonstrators are demanding the presence of a federal government representative at the site and calling for construction to be halted.

Yesterday, the Federal Court ruling on the licensing of the Belo Monte dam was postponed once again. The case is tied, with one vote in favour of building the dam and one against. A tie-breaking vote is still pending, and there is no indication of when the case will return to the docket.


The new Maracanã is already born old (O Globo)

André Trigueiro

O Globo, 27/10/11

The design of the new Maracanã confirms the exclusion of an item absolutely essential for any engineering project of this kind to be called “modern and sustainable”. Despite the wide array of football stadiums around the world that harness solar energy, the extremely expensive reconstruction of the Maracanã (nearly 1 billion reais) ignored this possibility.

It is strange that this happened in a country where the sun shines an average of 280 days a year. Stranger still that it happened in the city that hosted Rio-92, that will host Rio+20, and that lies in the same band of solar exposure as Sydney, Australia, which made history by staging the first Green Games, entirely powered by solar energy.

I covered the Sydney Games in 2000 as a journalist, and I remember the immense structures of photovoltaic panels that captured solar energy to light the competitions in the Olympic stadium, the Superdome and all the sports venues. The Olympic Village, with 665 houses, became the largest solar-powered neighbourhood on the planet. The International Olympic Committee’s spokesman, the Australian Michael Bland, justified the investment in solar energy this way: “We want to make solar energy popular in every country. It is ridiculous that, in Australia, every house does not use a solar collector. We have the roofs, we have the sun, and we waste them. It is a stupid way to live.”

What a stupidity of ours to waste the immense area of the new Maracanã’s canopies (nearly 29,000 square metres), which could house an impressive array of photovoltaic panels capable of generating electricity for up to 3,000 homes. The cost ranges from ten to twenty million reais, depending on the technology used. Some will say: “It is far too expensive! It is not worth it.” But is the usual way of buying energy worth it?

We live in a country where, according to IBGE, electricity rates have risen more than twice the official inflation rate over the past 15 years. The solar option, though more expensive, offers the advantage of paying back the investment within a few years.

Some will say that the new canopy, being lighter, might not support conventional photovoltaic panels. Then a compatible structure should have been designed. What is at stake is the possibility of making the stadium useful even on days without football matches. The Maracanã could be a power plant, however modest in capacity, that beyond the direct benefit of generating electricity would also spur further research and investment in solar energy in Brazil.

And who said a project like this could only be financed with public money? If there were political will to promote technological innovation in the energy sector using the new Maracanã as a poster child, it would be perfectly possible to sound out large companies with solar-energy know-how willing to install the photovoltaic equipment at zero cost, with no burden on the government. And what would such a company get in return? The right to exploit the image of the Maracanã as a “solar stadium” thanks to the technology it provided.

Does anyone doubt that aerial images of the stadium, both at the 2014 World Cup and at the 2016 Olympics, will reach billions of viewers around the world? That is free media, hugely positive exposure, everything a good negotiator would need no more than a few minutes to use to convince an investor to reach into his pocket and back the idea.

With public or private money, the right thing to do was to build it. Installing a few solar collectors to heat the athletes’ shower water in the changing rooms is not enough. If those responsible for the Maracanã project scored an own goal by scorning the sun, the Pituaçu stadium in Salvador and the Mineirão in Belo Horizonte will have solar energy as an ally in producing electricity. Wake up, Rio! A Maracanã without solar energy is like Rio without a beach. Unfortunately, cariocas will go on using the sun only to tan. A symbol of sustainability for its natural beauty and for hosting major U.N. environmental conferences, Rio de Janeiro is left with a Maracanã that falls short of what it deserves.
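
The column’s claim that roughly 29,000 square metres of canopy could power up to 3,000 homes can be sanity-checked with back-of-the-envelope photovoltaic arithmetic. The insolation, module efficiency, performance ratio and household consumption below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope PV estimate for the Maracanã canopy.
# All parameters except the canopy area are illustrative assumptions.

area_m2 = 29_000           # canopy area cited in the article
insolation = 4.5           # kWh per m2 per day, a plausible value for Rio (assumed)
efficiency = 0.15          # photovoltaic module efficiency (assumed)
performance_ratio = 0.75   # losses from wiring, inverters and heat (assumed)

daily_kwh = area_m2 * insolation * efficiency * performance_ratio
annual_mwh = daily_kwh * 365 / 1000

household_kwh_month = 160  # assumed average residential consumption
households = daily_kwh * 30 / household_kwh_month

print(f"~{annual_mwh:,.0f} MWh/year, enough for ~{households:,.0f} homes")
```

Under these assumptions the canopy would yield on the order of 5,000 MWh a year, serving a little under 3,000 homes, which is consistent with the figure the column cites.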


Acre: In defence of life and the integrity of the peoples and their territories against REDD and the commodification of nature

Letter from the State of Acre

In defence of life and the integrity of the peoples and their territories against REDD and the commodification of nature

We gathered in Rio Branco, in the State of Acre, on 3-7 October 2011 for the workshop “Serviços Ambientais, REDD e Fundos Verdes do BNDES: Salvação da Amazônia ou Armadilha do Capitalismo Verde?” (Environmental Services, REDD and BNDES Green Funds: The Amazon’s Salvation or a Green Capitalism Trap?)

The participants included socio-environmental organizations, family agriculture associations, Extractive Reserve (RESEX) and Extractive Settlement organizations, human rights organizations (national and international), social pastoral organizations, professors, students, and members of civil society committed to the struggle of “the underdogs”.

We saw the emergence of a consensus around the belief that, since 1999 and the election of the Popular Front of Acre (FPA) government, initiatives have been adopted to establish a “new model” of development in the state. Since then, this model has been praised as a prime example of harmony between economic development and the preservation of forests, their natural resources and the way of life of their inhabitants. With strong support from the media, trade unions, NGOs that promote green capitalism in the Amazon region, multilateral banks, local oligarchies and international organizations, it is presented as a “successful model” to be emulated by other regions of Brazil and the world.

Over these past few days we have had the opportunity to learn first hand, in the field, about some of the initiatives in Acre that are considered as exemplary. We saw for ourselves the social and environmental impacts of the “sustainable development” underway in the state. We visited the Chico Mendes Agro-Extractive Settlement Project, the NATEX condom factory, and the Fazendas Ranchão I and II Sustainable Forest Management Project in Seringal São Bernardo (the São Bernardo rubber plantation). These field visits presented us with a reality that is rather far removed from the image portrayed nationally and internationally.

In Seringal São Bernardo, we were able to observe the priority placed on the interests of timber companies, to the detriment of the interests of local communities and nature conservation. Even the questionable rules of the forest management plans are not respected, and according to the local inhabitants, these violations are committed in collusion with the responsible state authorities. In the case of the Chico Mendes Agro-Extractive Settlement Project in Xapuri, we saw that the local population remains subjugated to monopoly control: they currently sell their timber to the company Laminados Triunfo at a rate of R$90 per cubic metre, when this same amount of wood can be sold for as much as R$1200 in the city. This is why we support the demands of various communities for the suspension of these famous forest management projects. We call for the investigation of all of the irregularities revealed, and we demand punishment for those guilty of the criminal destruction of natural resources.

During the course of the workshop we also analyzed the issues of environmental services, REDD and the BNDES (Brazilian Development Bank) Green Funds. We gained a greater understanding of the role of banks (World Bank, IMF, IDB and BNDES), of NGOs that promote green capitalism (e.g. WWF, TNC and CI) and other institutions such as the ITTO, FSC and USAID, and also sectors of civil society and the state and federal governments who have allied with international capital for the commodification of the natural heritage of the Amazon region.

It was stressed that, in addition to being unconstitutional, Law Nº 2.308 of 22 October 2010, which regulates the State System of Incentives for Environmental Services, was created without due debate with the sectors of society directly impacted by it, that is, the men and women of the countryside and the forests. Slavishly repeating the arguments of the powerful countries, local state authorities present it as an effective means of contributing to climate equilibrium, protecting the forests and improving the quality of life of those who live in them. It should be noted, however, that this legislation generates “environmental assets” in order to trade natural resources on the “environmental services” market, such as the carbon market. It represents a reinforcement of the current phase of capitalism, whose defenders, in order to ensure its expanded reproduction, use an environmental discourse to commodify life, privatize nature and plunder the inhabitants of the countryside and the cities. Under this law, the beauty of nature, pollination by insects, the regulation of rainfall, culture, spiritual values, traditional knowledge, water, plants and even the popular imagination are converted into merchandise.

The current proposal to reform the Forest Code complements this new strategy of capital accumulation by authorizing the trading of forests on the financial market through the issuing of “green bonds”, the so-called “Environmental Reserve Quota Certificates” (CCRAs). In this way, everything is placed in the sphere of the market, to be administered by banks and private corporations.

Although it is presented as a solution for global warming and climate change, the REDD proposal allows the powerful capitalist countries to maintain their current levels of production, consumption and, therefore, pollution. They will continue to consume energy generated by sources that produce more and more carbon emissions. Historically responsible for the creation of the problem, they now propose a “solution” that primarily serves their own interests. While making it possible to purchase the “right to pollute”, mechanisms like REDD strip “traditional” communities (riverine, indigenous and Afro-Brazilian communities, rubber tappers, women coconut gatherers, etc.) of their autonomy in the management of their territories.

As a result, roles are turned upside down. Capitalism, the most predatory civilization in the history of humankind, would not pose a danger; on the contrary, it would be the “solution”. The destroyers would now pass for the great defenders of nature, while those who have historically ensured its preservation are viewed as predators, and are therefore criminalized. It comes as no surprise, then, that the state has recently become more open in its repression, persecution and even expulsion of local populations from their territories – all to ensure the free expansion of the natural resources market.

With undisguised state support, through this and other projects, capital is now promoting and combining two forms of re-territorialization in the Amazon region. On one hand, it is evicting peoples and communities from their territories (as in the case of megaprojects like hydroelectric dams), stripping them of their means of survival. On the other hand, it is stripping those who remain on their territories of their relative autonomy, as in the case of environmental conservation areas. These populations may be allowed to remain on their land, but they are no longer able to use it in accordance with their ways of life. Their survival is no longer guaranteed by subsistence farming – which has been recast as a “threat” to the earth’s climate stability – but by a “bolsa verde” or “green allowance”, which, besides being insufficient, is paid in order to keep the oil-based civilization running.

Because we are fully aware of the risks posed by projects like these, we oppose the REDD agreement between California, Chiapas and Acre, which has already caused serious problems for indigenous and traditional communities such as those in the Amador Hernández region of Chiapas, Mexico. This is why we share our solidarity with the poor communities of California and Chiapas, who have already suffered from its consequences. We also share our solidarity with the indigenous peoples of the Isiboro Sécure National Park and Indigenous Territory (TIPNIS) in Bolivia, who are facing the threat of the violation of their territory by a highway linking Cochabamba and Beni, financed by the BNDES.

We are in a state which, in the 1970s and 1980s, was the stage for historical struggles against the predatory expansion of capital and in defence of territories inhabited by indigenous peoples and peasant communities of the forests. These struggles inspired many others in Brazil and around the world. In the late 1990s, however, Acre was converted into a laboratory for the IDB’s and World Bank’s experiments in the commodification and privatization of nature, and is now a state “intoxicated” by environmental discourse and victimized by the practice of “green capitalism”. Among the mechanisms used to legitimize this state of affairs, one of the most striking is the manipulation of the figure of Chico Mendes. To judge by what they present us with, we would have to consider him the patron saint of green capitalism. The name of this rubber tapper and environmental activist is used to defend oil exploitation, monoculture sugar cane plantations, large-scale logging activity and the sale of the air we breathe.

In view of this situation, we would have to ask if there is anything that could not be made to fit within this “sustainable development” model. Perhaps at no other time have cattle ranchers and logging companies met with a more favourable scenario. This is why we believe it is necessary and urgent to fight it, because under the guise of something new and virtuous, it merely reproduces the old and perverse strategies of the domination and exploitation of humans and nature.

Finally, we want to express here our support for the following demands: agrarian reform, official demarcation of indigenous lands, investments in agroecology and the solidarity economy, autonomous territorial management, health and education for all, and democratization of the media. In defence of the Amazon, of life, of the integrity of the peoples and their territories, and against REDD and the commodification of nature. Our struggle continues.

Rio Branco, Acre, 7 October 2011

Signed:

Assentamento de Produção Agro-Extrativista Limoeiro-Floresta Pública do Antimary (APAEPL)

Amazonlink

Cáritas – Manaus

Centro de Defesa dos Direitos Humanos e Educação Popular do Acre (CDDHEP/AC)

Centro de Estudos e Pesquisas para o Desenvolvimento do Extremo Sul da Bahia (CEPEDES)

Comissão Pastoral da Terra – CPT Acre

Conselho Indigenista Missionário – CIMI Regional Amazônia Ocidental

Conselho de Missão entre Índios – COMIN Assessoria Acre e Sul do Amazonas

Coordenação da União dos Povos Indígenas de Rondônia, Sul do Amazonas e Noroeste do Mato Grosso – CUNPIR

FERN

Fórum da Amazônia Ocidental (FAOC)

Global Justice Ecology Project

Grupo de Estudo sobre Fronteira e Identidade – Universidade Federal do Acre

Instituto Madeira Vivo (IMV-Rondônia)

Instituto Mais Democracia

Movimento Anticapitalista Amazônico – MACA

Movimento de Mulheres Camponesas (MMC – Roraima)

Nós Existimos – Roraima

Núcleo Amigos da Terra Brasil

Núcleo de Pesquisa Estado, Sociedade e Desenvolvimento na Amazônia Ocidental – Universidade Federal do Acre

Oposição Sindical do STTR de Brasiléia

Rede Alerta Contra o Deserto Verde

Rede Brasil sobre Instituições Financeiras Multilaterais

Sindicato dos Trabalhadores Rurais de Bujarí (STTR – Bujarí)

Sindicato dos Trabalhadores Rurais de Xapuri (STTR – Xapuri)

Terra de Direitos

União de Mulheres Indígenas da Amazonia Brasileira

World Rainforest Movement (WRM)

Forgetting Is Part of Remembering (Science Daily)

ScienceDaily (Oct. 18, 2011) — It’s time for forgetting to get some respect, says Ben Storm, author of a new article on memory in Current Directions in Psychological Science, a journal of the Association for Psychological Science. “We need to rethink how we’re talking about forgetting and realize that under some conditions it actually does play an important role in the function of memory,” says Storm, who is a professor at the University of Illinois at Chicago.

“Memory is difficult. Thinking is difficult,” Storm says. Memories and associations accumulate rapidly. “These things could completely overrun our life and make it impossible to learn and retrieve new things if they were left alone, and could just overpower the rest of memory,” he says.

But, fortunately, that isn’t what happens. “We’re able to get around these strong competing inappropriate memories to remember the ones we want to recall.” Storm and other psychological scientists are trying to understand how our minds select the right things to recall — if someone’s talking about beaches near Omaha, Nebraska, for example, you will naturally suppress any knowledge you’ve collected about Omaha Beach in Normandy.

In one kind of experiment, participants are given a list of words that have some sort of relation to each other. They might be asked to memorize a list of birds, for example. In the next part of the test, they have to do a task that requires remembering half the birds. “That’s going to make you forget the other half of the birds in that list,” Storm says. That might seem bad — it’s forgetting. “But what the research shows is that this forgetting is actually a good thing.”

People who are good at forgetting information they don’t need are also good at problem solving and at remembering something when they’re being distracted with other information. This shows that forgetting plays an important role in problem solving and memory, Storm says.

There are plenty of times when forgetting makes sense in daily life. “Say you get a new cell phone and you have to get a new phone number, do you really want to remember your old phone number every time someone asks what your number is?” Storm asks. Or where you parked your car this morning – important information today, but best forgotten by tomorrow afternoon, when you need to find the new spot. “We need to be able to update our memory so we can remember and think about the things that are currently relevant.”

Questioning Privacy Protections in Research (New York Times)

Dr. John Cutler, center, during the Tuskegee syphilis experiment. Abuses in that study led to ethics rules for researchers.

By PATRICIA COHEN
Published: October 23, 2011

Hoping to protect privacy in an age when a fingernail clipping can reveal a person’s identity, federal officials are planning to overhaul the rules that regulate research involving human subjects. But critics outside the biomedical arena warn that the proposed revisions may unintentionally create a more serious problem: sealing off vast collections of publicly available information from inspection, including census data, market research, oral histories and labor statistics.

Organizations that represent tens of thousands of scholars in the humanities and social sciences are scrambling to register their concerns before the Wednesday deadline for public comment on the proposals.

The rules were initially created in the 1970s after shocking revelations that poor African-American men infected with syphilis in Tuskegee, Ala., were left untreated by the United States Public Health Service so that doctors could study the course of the disease. Now every institution that receives money from any one of 18 federal agencies must create an ethics panel, called an institutional review board, or I.R.B.

More than 5,875 boards have to sign off on research involving human participants to ensure that subjects are fully informed, that their physical and emotional health is protected, and that their privacy is respected. Although only projects with federal financing are covered by what is known as the Common Rule, many institutions routinely subject all research with a human factor to review.

The changes in the ethical guidelines — the first comprehensive revisions in more than 30 years — were prompted by a surge of health-related research and technological advances.

Researchers in the humanities and social sciences are pleased that the reforms would address repeated complaints that medically oriented regulations have choked off research in their fields with irrelevant and cumbersome requirements. But they were dismayed to discover that the desire to protect individuals’ privacy in the genomics age resulted in rules that they say could also restrict access to basic data, like public-opinion polls.

Jerry Menikoff, director of the federal Office for Human Research Protections, which oversees the Common Rule, cautions that any alarm is premature, saying that federal officials do not intend to pose tougher restrictions on information that is already public. “If the technical rules end up doing that, we’ll try to come up with a result that’s appropriate,” he said.

Critics welcomed the assurance but remained skeptical. Zachary Schrag, a historian at George Mason University who wrote a book about the review process, said, “For decades, scholars in the social sciences and humanities have suffered because of rules that were well intended but poorly considered and drafted and whose unintended consequences restricted research.”

The American Historical Association, with 15,000 members, and the Oral History Association, with 900 members, warn that under the proposed revisions, for example, new revelations that Public Health Service doctors deliberately infected Guatemalan prisoners, soldiers and mental patients with syphilis in the 1940s might never have come to light. The abuses were uncovered by a historian who by chance came across notes in the archives of the University of Pittsburgh. That kind of undirected research could be forbidden under guidelines designed to prevent “data collected for one purpose” from being “used for a new purpose to which the subjects never consented,” said Linda Shopes, who helped draft the historians’ statement.

The suggested changes, she said, “really threaten access to information in a democratic society.”

Numerous organizations including the Consortium of Social Science Associations, which represents dozens of colleges, universities and research centers, expressed particular concern that the new standards might be modeled on federal privacy rules relating to health insurance and restrict use of the broadest of identifying information, like a person’s ZIP code, county or city.

The 11,000-member American Anthropological Association declared in a statement that any process that is based on the health insurance act’s privacy protections “would be disastrous for social and humanities research.” The 45,000-member American Association of University Professors warned that such restrictions “threaten mayhem” and “render impossible a great deal of social-science research, ranging from ethnographic community studies to demographic analysis that relies on census tracts to traffic models based on ZIP code to political polls that report by precinct.”

Dr. Menikoff said references to the statutes governing health insurance information were meant to serve as a starting point, not a blueprint. “Nothing is ruled out,” he said, though he wondered how the review system could be severed from the issue of privacy protection, as the consortium has discussed, “if the major risk for most of these studies is that you’re going to disclose information inadvertently.” If there is confidential information on a laptop, he said, requiring a password may be reasonable.

Ms. Shopes, Mr. Schrag and other critics emphasized that despite their worries they were happy with the broader effort to fix some longstanding problems with institutional review boards that held, say, an undergraduate interviewing Grandma for an oral history project to the same guidelines as a doctor doing experimental research on cancer patients.

“The system has been sliding into chaos in recent years,” said Alice Kessler-Harris, president of the 9,000-member Organization of American Historians. “No one can even agree on what is supposed to be covered in the humanities and social sciences.”

Vague rules designed to give the thousands of review boards flexibility when dealing with nonmedical subjects have instead resulted in higgledy-piggledy enforcement and layers of red tape even when no one is at risk, she said.

For example Columbia University, where Ms. Kessler-Harris teaches, exempts oral history projects from review, while boards at the University of Illinois in Urbana-Champaign and the University of California, San Diego, have raised lengthy objections to similar interview projects proposed by undergraduate and master’s students, according to professors there.

Brown University has been sued by an associate professor of education who said the institutional review board overstepped its powers by barring her from using three years’ worth of research on how the parents of Chinese-American children made use of educational testing.

Ms. Shopes said board members at one university had suggested at one point that even using recorded interviews deposited at the Ronald Reagan Presidential Foundation and Library would have needed Reagan’s specific approval when he was alive.

Many nonmedical researchers praised the idea that scholars in fields like history, literature, journalism, languages and classics who use traditional methods of research should not have to submit to board review. They would like the office of human protections to go further and lift restrictions on research that may cause participants embarrassment or emotional distress. “Our job is to hold people accountable,” Ms. Kessler-Harris said.

Dr. Menikoff said, “We want to hear all these comments.” But he maintained that when the final language is published, critics may find themselves saying, “Wow, this is reasonable stuff.”

This article has been revised to reflect the following correction:

Correction: October 26, 2011

An article on Monday about federal officials’ plans to overhaul privacy rules that regulate research involving human subjects, and concerns raised by scholars, paraphrased incorrectly from comments by Linda Shopes, who helped draft a statement by historians about possible changes. She said that board members at a university (which she did not name) — not board members at the University of Chicago — suggested at one point that using recorded interviews deposited at the Ronald Reagan Presidential Foundation and Library would have needed Reagan’s specific approval when he was alive.