Tag archive: Prediction
Anthropologists' and sociologists' reading of the future of the Amazon (Jornal da Ciência)
JC e-mail 4549, 27 July 2012.
The weakening of multilateral international cooperation agencies is beginning to threaten conservation policies for the Legal Amazon. So says Alfredo Wagner de Almeida, president of the Nova Cartografia Social programme, who delivered a lecture yesterday (26 July) at the 64th Annual Meeting of the Brazilian Society for the Advancement of Science (SBPC), held at the Federal University of Maranhão (UFMA) in São Luís.
Speaking on the theme "Traditional peoples and communities affected by military projects", the anthropologist warned of moves by seven states, with bills now before the legislature, to shrink the Legal Amazon. Mato Grosso intends to remove its territory from the Legal Amazon, as does Rondônia, while states such as Maranhão and Tocantins want the designation lifted from all of their areas currently classified as Legal Amazon.
The region covers roughly 5,217,423 km², equivalent to about 61% of Brazil's territory. It was created to define the geographical boundaries of the political region eligible for tax incentives aimed at promoting regional development.
"This is a first attempt to shrink the Legal Amazon, because these states no longer enjoy the benefits granted by the multilateral international agencies," said Almeida, who is also an SBPC councillor and a professor at the Amazonas State University (UEA).
According to the researcher, international bodies had until then been sources of funding for Amazon protection programmes, such as the Integrated Project for the Protection of Indigenous Populations and Lands of the Legal Amazon (PPTAL), devoted to the demarcation of indigenous lands and financed mainly by the German government, and the PPG7 (Pilot Programme for the Protection of Brazil's Tropical Forests). Those policies underpinned the creation of the Ministry of the Environment. "Without the support of the multilateral agencies, policies for the Amazon have shrunk," he said, without citing figures.
For the states that want to leave the Legal Amazon, the anthropologist said, the decision means freeing up more land that they consider productive, at the expense of forest conservation.
The anthropologist's remarks draw on the dossier "Amazônia: sociedade, fronteiras e políticas", produced by Edna Maria Ramos de Castro, a sociologist at the Centre for Advanced Amazonian Studies of the Federal University of Pará (UFPA) and an SBPC director, who chaired the session. The full document was recently published in the Bahia journal Caderno CRH.
Indigenous lands – In the view of the dossier's author, these states' legal measures threaten indigenous lands, whose peoples play a leading role in conserving biodiversity and depend on nature to survive. "These are legal provisions, clearly set out in the Constitution, but this practice can lead society to an impasse," she said. Castro cited the controversial Belo Monte hydroelectric dam, which has become an emblem of resistance within Brazilian society.
Paradigm shift – The anthropologist offered a reading of Brazil's current political and administrative model. He sees a shift from a policy "of protection" to an "idea of protectionism". "The distinction between protection and protectionism reveals, first of all, the weakening of the multilateral international agencies," he said. Protectionism, in his words, arises outside the sphere of protection.
In Alfredo Wagner's view, the signs of change mainly reflect the disagreements at the World Trade Organization (WTO) meeting held in Geneva in December 2011. On that occasion there were signs of a rupture in international agreements until then framed as a common market. One example is the "shelving" of the Doha Round over disagreements between the parties about agricultural subsidies granted by developed countries.
Expansion of the military sphere and infrastructure – The anthropologist recalls that at the height of the multilateral bodies the security sector, that is, the military, was not funded because it was not part of a single-market policy. He notes, however, a change from 2009 onwards, when the model shifted and problems with the military began to appear as projects for militarised borders were revived. "From then on, a chapter of conflicts begins."
Retreat from international funds and regulatory bodies – What stands out most in the "idea of protectionism", he says, is the identification of strategic natural resources, such as agricultural commodities and minerals, which, under the banner of sustainable development, can be channelled into large infrastructure projects.
"Everything comes to be interpreted as a national interest. The idea of the bloc loses force, which may explain the tensions within Mercosur itself, with Venezuela brought into the bloc at a moment of crisis. These national interests come to be articulated in a disciplined way without passing through the multilateral bodies," the anthropologist argues.
According to him, the Brazilian state's current actions bypass the multilateral bodies. One sign is its distancing from the International Monetary Fund (IMF) and from two international legal frameworks, one of them the inter-American human rights system of the OAS (Organization of American States). He recalls that Brazil stopped investing "in that court" once the Belo Monte dam was condemned by the body. "Brazil now takes a unilateral position, similar to that of the Americans in the Gulf War," the anthropologist observes. "The idea of protectionism comes through very strongly."
Alfredo Wagner also sees signs of a retreat from ILO Convention 169, which requires prior consultation of communities harmed by large infrastructure projects. According to him, Brazil has been condemned for six violations involving military projects. One concerns the construction of the Alcântara Launch Centre (CLA) on quilombola community lands in Maranhão, carried out without environmental licensing and without consulting the "affected" communities.
He also warns of four worrying measures under way that provide for the emergency construction of hydroelectric dams. One example is Provisional Measure 558 of 18 January 2012, which provides for shrinking protected areas and forest conservation units on development grounds. According to him, Ibama approved in just five days a draft terms of reference from Eletronorte for the construction of a hydroelectric dam at São Luiz do Tapajós; in practice, the work plan submitted for assessing the project was approved. "With the emergency pace of these works, rights seem to be placed on hold."
Constitutional challenges – The provisional measure was challenged by the Prosecutor General's Office through an ADIN (direct action of unconstitutionality). The Federal Public Prosecutor's Office held that conservation units in the areas of hydroelectric projects are essential to minimise the projects' environmental impacts, and argued that any discussion about reducing these forest areas should take place in the National Congress rather than by provisional measure. "Brazil today lives under the empire of provisional measures, which prevent broad discussion by society. That gives an idea of authoritarian capitalism," the anthropologist said.
Privatisation of land in the Amazon – He also warns of the privatisation of public lands in the Amazon under the "euphemism" of land-tenure regularisation, through the Terra Legal programme created by Law 11,952 of July 2009. Sent to Congress by the Presidency, the measure provides for privatising 70 million hectares of public land, a considerable share of the roughly 850 million hectares that make up Brazil, according to the anthropologist. Alfredo Wagner warns that the measure speeds up land titling for large properties at the expense of small landholders.
The measure was initially challenged by the Public Prosecutor's Office through an ADIN on the grounds that it establishes "unjustifiable privileges" in favour of land-grabbers who benefited from public lands in the past and drove land concentration. "This measure is as cruel as the Sarney Land Law of 1969," the anthropologist said.
Judicialisation of the state – Seeking to reassure a packed audience of students, researchers and scientists, estimated at about 140 people, some of whom feared a return of the military dictatorship, the anthropologist said of the current model: "It is not the same as the military dictatorship." He attributed it instead to a "judicialisation of the state" and to "something strange".
To describe the crisis, the anthropologist borrowed a phrase from sociologists: "The old has not yet died and the new has not yet been born. But a transformation is under way."
(Viviane Monteiro – Jornal da Ciência)
Stop bullying the ‘soft’ sciences (L.A.Times)
OP-ED
The social sciences are just that — sciences.
By Timothy D. Wilson
July 12, 2012
A student is seen at the UC Irvine archive doing research for her sociology dissertation. (Los Angeles Times / July 9, 2009)
Once, during a meeting at my university, a biologist mentioned that he was the only faculty member present from a science department. When I corrected him, noting that I was from the Department of Psychology, he waved his hand dismissively, as if I were a Little Leaguer telling a member of the New York Yankees that I too played baseball.
There has long been snobbery in the sciences, with the “hard” ones (physics, chemistry, biology) considering themselves to be more legitimate than the “soft” ones (psychology, sociology). It is thus no surprise that many members of the general public feel the same way. But of late, skepticism about the rigors of social science has reached absurd heights.
The U.S. House of Representatives recently voted to eliminate funding for political science research through the National Science Foundation. In the wake of that action, an opinion writer for the Washington Post suggested that the House didn’t go far enough. The NSF should not fund any research in the social sciences, wrote Charles Lane, because “unlike hypotheses in the hard sciences, hypotheses about society usually can’t be proven or disproven by experimentation.”
Lane’s comments echoed ones by Gary Gutting in the Opinionator blog of the New York Times. “While the physical sciences produce many detailed and precise predictions,” wrote Gutting, “the social sciences do not. The reason is that such predictions almost always require randomized controlled experiments, which are seldom possible when people are involved.”
This is news to me and the many other social scientists who have spent their careers doing carefully controlled experiments on human behavior, inside and outside the laboratory. What makes the criticism so galling is that those who voice it, or members of their families, have undoubtedly benefited from research in the disciplines they dismiss.
Most of us know someone who has suffered from depression and sought psychotherapy. He or she probably benefited from therapies such as cognitive behavioral therapy that have been shown to work in randomized clinical trials.
Problems such as child abuse and teenage pregnancy take a huge toll on society. Interventions developed by research psychologists, tested with the experimental method, have been found to lower the incidence of child abuse and reduce the rate of teenage pregnancies.
Ever hear of stereotype threat? It is the double jeopardy that people face when they are at risk of confirming a negative stereotype of their group. When African American students take a difficult test, for example, they are concerned not only about how well they will do but also about the possibility that performing poorly will reflect badly on their entire group. This added worry has been shown time and again, in carefully controlled experiments, to lower academic performance. But fortunately, experiments have also shown promising ways to reduce this threat. One intervention, for example, conducted in a middle school, reduced the achievement gap by 40%.
If you know someone who was unlucky enough to be arrested for a crime he didn’t commit, he may have benefited from social psychological experiments that have resulted in fairer lineups and interrogations, making it less likely that innocent people are convicted.
An often-overlooked advantage of the experimental method is that it can demonstrate what doesn’t work. Consider three popular programs that research psychologists have debunked: Critical Incident Stress Debriefing, used to prevent post-traumatic stress disorders in first responders and others who have witnessed horrific events; the D.A.R.E. anti-drug program, used in many schools throughout America; and Scared Straight programs designed to prevent at-risk teens from engaging in criminal behavior.
All three of these programs have been shown, with well-designed experimental studies, to be ineffective or, in some cases, to make matters worse. And as a result, the programs have become less popular or have changed their methods. By discovering what doesn’t work, social scientists have saved the public billions of dollars.
To be fair to the critics, social scientists have not always taken advantage of the experimental method as much as they could. Too often, for example, educational programs have been implemented widely without being adequately tested. But increasingly, educational researchers are employing better methodologies. For example, in a recent study, researchers randomly assigned teachers to a program called My Teaching Partner, which is designed to improve teaching skills, or to a control group. Students taught by the teachers who participated in the program did significantly better on achievement tests than did students taught by teachers in the control group.
Are the social sciences perfect? Of course not. Human behavior is complex, and it is not possible to conduct experiments to test all aspects of what people do or why. There are entire disciplines devoted to the experimental study of human behavior, however, in tightly controlled, ethically acceptable ways. Many people benefit from the results, including those who, in their ignorance, believe that science is limited to the study of molecules.
Timothy D. Wilson is a professor of psychology at the University of Virginia and the author of “Redirect: The Surprising New Science of Psychological Change.”
World Bank’s Jim Yong Kim: ‘I want to eradicate poverty’ (The Guardian)
Sarah Boseley, health editor, in Washington
guardian.co.uk, Wednesday 25 July 2012 13.48 BST
Jim Yong Kim, president of the World Bank, speaks at the opening session of the International Aids Conference in Washington on 22 July. Photograph: Jacquelyn Martin/AP
The new president of the World Bank is determined to eradicate global poverty through goals, targets and measuring success in the same way that he masterminded an Aids drugs campaign for poor people nearly a decade ago.
Jim Yong Kim, in an exclusive interview with the Guardian, said he was passionately committed to ending absolute poverty, which threatens survival and makes progress impossible for the 1.3 billion people living on less than $1.25 a day.
“I want to eradicate poverty,” he said. “I think that there’s a tremendous passion for that inside the World Bank.”
Kim, who took over at the World Bank three weeks ago and is not only the first doctor and scientist (he is also an anthropologist) to be president but the first with development experience, will set “a clear, simple goal” in the eradication of absolute poverty. Getting there, however, needs progress on multiple, but integrated, fronts.
“The evidence suggests that you’ve got to do a lot of good, good things in unison, to be able to make that happen,” said Kim. “The private sector has to grow, you have to have social protection mechanisms, you have to have a functioning health and education system. The scientific evidence strongly suggests that it has to be green – you have to do it in a way that is sustainable both for the environment and financially. All the great themes that we’ve been dealing with here have to come together to eradicate poverty from the face of the Earth.”
Kim, who was previously head of the Ivy League Dartmouth College, is probably best known for his stint at the World Health Organisation (WHO), where he challenged the system to move faster in making Aids drugs available to people with HIV in the developing world who were dying in large numbers. In 2003, he set a target of 3 million people being on treatment by 2005 – thereafter known as “3 by 5”. The target was not met on time, but it did focus minds and rapidly speed up the pace of the rollout, which included setting up clinics and training healthcare staff.
Now, he says, he thinks he can do the same for poverty. “What 3 by 5 did that we just didn’t expect was to set a tempo to the response; it created a sense of urgency. There was pace and rhythm in the way we did things. We think we can do something similar for poverty,” he said.
Asked if he would set a date this time, he said he was sorely tempted, but would not yet. “We don’t know what they will be yet, but [there will be] goals, and counting. We need to keep up and say where we are making successes and why, and when are we going to be held to account next for the level of poverty. If we can build that kind of pace and rhythm into the movement, we think we can make a lot more progress,” he said in his office at the Bank in Washington.
Kim was seen by many as a surprise choice for president. During the election, critics argued there should be an economist at the helm. Some said that, as a doctor, he would focus too much on health.
But Kim, who co-founded Partners In Health, which pioneered sustainable, high-quality healthcare for poor people, first in Haiti and later in Africa, said his three years at the WHO were the only ones of his career solely devoted to health.
“It’s always been about poverty, so for me, making the switch to being here at the Bank is really not that much of a stretch. I’ve been doing this all my life and we’re in a bit of the spotlight because of the stuff we did in healthcare but it was really always about poverty,” he said.
Partners in Health offered HIV and tuberculosis treatment to poor people in Haiti for the first time. “We were trying to make a point. And the point we were trying to make was that just because people are poor shouldn’t mean that they shouldn’t have access to high quality healthcare. It was always based in social justice, it was always based in the notion that people had a right to live a dignified life. The good news is that this place – the Bank – is just full of people like that.”
Kim, who has spent his first weeks talking to Bank staff with expertise in a huge range of areas, strongly believes in the integration of all aspects of development, and says the staff do too. He cites a new hospital Partners built in Rwanda, which led to the building of a road to get there and then the expansion of mobile phone networks in the area. “In a very real sense, we’ve always believed that investing in health means investing in the wellbeing and development of that entire community,” he said.
Speaking to the International Aids Conference in Washington this week – the first World Bank president to do so – Kim told activists and scientists that the end of Aids no longer looked as far-fetched as the 3 by 5 plan had appeared in 2003. Science has delivered tools, such as drugs that not only treat but prevent infection.
But the cost of drugs for life for 15 million or more people is not sustainable, he says. Donors are unlikely to foot the bill. Hard-hit developing countries have to be helped to grow so they can pay for the drugs and healthcare systems they need.
Kim would like the highly active HIV community to broaden its focus. “We’ve had Aids exceptionalism for a long time and Aids exceptionalism has been incredibly important. It has been so productive for all of us,” he said. “But I think that as we go beyond the emergency response and think about the long-term sustainable response, conversations such as how do we spur growth in the private sector have to be part of the discussion.”
Every country wants economic growth, he says, and people want jobs. “If I care about poverty, I have to care a lot about investments in the private sector. The private sector creates the vast majority of jobs in the world and social protection only goes so far,” he said.
Nevertheless, he is a big proponent of social protection policies. “I’ve always been engaged in social protection programmes. But now it is really a signature of the World Bank. We’re very good at helping people look at their public expenditures and we say to them things like, fuel subsidies really aren’t very helpful to the poor – what you really need is to remove fuel subsidies and focus on things like conditional cash transfer plans. The Bank is great at that.”
New to him are climate change and sustainability, he says. “We are watching things happen with one degree changes in ocean temperature that we thought wouldn’t happen until there were two or three degree changes in ocean temperature. These are facts. These are things that have actually happened … I think we now have plenty of evidence that should push us into thinking that this is disturbing data and should spur us to think ever more seriously about clean energy and how can we move our focus more towards clean energy.”
But poor countries are saying they need more energy and we must respect that, he says. “It’s hard to say to them we still do it but you can’t … I think our role is to say the science suggests strongly to us that we should help you looking for clean energy solutions.”
Local Weather Patterns Affect Beliefs About Global Warming (Science Daily)
People living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming. (Credit: © Rafael Ben-Ari / Fotolia)
ScienceDaily (July 25, 2012) — Local weather patterns temporarily influence people’s beliefs about evidence for global warming, according to research by political scientists at New York University and Temple University. Their study, which appears in the Journal of Politics, found that those living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming.
“Global climate change is one of the most important public policy challenges of our time, but it is a complex issue with which Americans have little direct experience,” wrote the study’s co-authors, Patrick Egan of New York University and Megan Mullin of Temple University. “As they try to make sense of this difficult issue, many people use fluctuations in local temperature to reassess their beliefs about the existence of global warming.”
Their study examined five national surveys of American adults sponsored by the Pew Research Center: June, July, and August 2006, January 2007, and April 2008. In each survey, respondents were asked the following question: “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” On average over the five surveys, 73 percent of respondents agreed that Earth is getting warmer.
Egan and Mullin wondered about variation in attitudes among the survey’s respondents, and hypothesized that local temperatures could influence perceptions. To measure the potential impact of temperature on individuals’ opinions, they looked at zip codes from respondents in the Pew surveys and matched weather data to each person surveyed at the time of each poll. They used local weather data to determine if the temperature in the location of each respondent was significantly higher or lower than normal for that area at that time of year.
Their results showed that an abnormal shift in local temperature is associated with a significant shift in beliefs about evidence for global warming. Specifically, for every three degrees Fahrenheit that local temperatures in the past week have risen above normal, Americans become one percentage point more likely to agree that there is “solid evidence” that Earth is getting warmer. The researchers found cooler-than-normal temperatures have similar effects on attitudes, but in the opposite direction.
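In rough terms, the estimation described here amounts to merging each survey response with the respondent's local temperature anomaly and regressing belief on that anomaly. The sketch below illustrates the idea only; the column names, the toy data and the use of a simple linear probability model are assumptions for illustration, not the authors' actual data or code.

    # Hypothetical sketch of the merge-and-estimate idea described above.
    # The toy data and column names are invented; this is not the study's code.
    import pandas as pd
    import statsmodels.api as sm

    # One row per respondent: whether they agreed there is solid evidence of
    # warming, and how far (degrees F) their local temperature in the week
    # before the interview was above or below normal.
    respondents = pd.DataFrame({
        "agrees_warming": [1, 1, 0, 1, 0, 1, 0, 1],
        "temp_anomaly_f": [6.0, 3.0, -2.0, 9.0, -5.0, 1.0, -8.0, 4.0],
    })

    # Linear probability model: probability of agreeing as a function of the anomaly.
    X = sm.add_constant(respondents["temp_anomaly_f"])
    fit = sm.OLS(respondents["agrees_warming"], X).fit()

    # The slope is the change in probability per extra degree F above normal;
    # the article's figure of about 1 percentage point per 3 degrees F would
    # correspond to a slope of roughly 0.003.
    print(fit.params["temp_anomaly_f"])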
The study took into account other variables that may explain the results — such as existing political attitudes and geography — and found the results still held.
The researchers also wondered whether heat waves, that is, prolonged spells of higher-than-normal temperatures, intensified this effect. To find out, they looked at respondents living in areas that experienced at least seven days of temperatures 10°F or more above normal in the three weeks before their interview and compared their views with those of respondents who experienced the same number of hot days but not a sustained heat wave.
Their estimates showed that the effect of a heat wave on opinion is even greater, increasing the share of Americans believing in global warming by 5.0 to 5.9 percentage points.
However, Egan and Mullin found the effects of temperature changes to be short-lived — even in the wake of heat waves. Americans who had been interviewed after 12 or more days had elapsed since a heat wave were estimated to have attitudes that were no different than those who had not been exposed to a heat wave.
“Under typical circumstances, the effects of temperature fluctuations on opinion are swiftly wiped out by new weather patterns,” they wrote. “More sustained periods of unusual weather cause attitudes to change both to a greater extent and for a longer period of time. However, even these effects eventually decay, leaving no long-term impact of weather on public opinion.”
The findings make an important contribution to the political science research on the relationship between personal experience and opinion on a larger issue, which has long been studied with varying results.
“On issues such as crime, the economy, education, health care, public infrastructure, and taxation, large shares of the public are exposed to experiences that could logically be linked to attitude formation,” the researchers wrote. “But findings from research examining how these experiences affect opinion have been mixed. Although direct experience — whether it be as a victim of crime, a worker who has lost a job or health insurance, or a parent with children in public schools — can influence attitudes, the impact of these experiences tends to be weak or nonexistent after accounting for typical predictors such as party identification and liberal-conservative ideology.”
“Our research suggests that personal experience has substantial effects on political attitudes,” Egan and Mullin concluded. “Rich discoveries await those who can explore these questions in ways that permit clean identification of these effects.”
Egan is an assistant professor in the Wilf Family Department of Politics at NYU, and Mullin is an associate professor in the Department of Political Science at Temple University.
Concerns Over Accuracy of Tools to Predict Risk of Repeat Offending (Science Daily)
ScienceDaily (July 24, 2012) — The study: “Use of risk assessment instruments to predict violence and antisocial behavior in 73 samples involving 24,827 people: systematic review and meta-analysis.”
Tools designed to predict an individual’s risk of repeat offending are not sufficient on their own to inform sentencing and release or discharge decisions, concludes a study published on the British Medical Journal website.
Although they appear to identify low risk individuals with high levels of accuracy, the authors say “their use as sole determinants of detention, sentencing, and release is not supported by the current evidence.”
Risk assessment tools are widely used in psychiatric hospitals and criminal justice systems around the world to help predict violent behavior and inform sentencing and release decisions. Yet their predictive accuracy remains uncertain and expert opinion is divided.
So an international research team, led by Seena Fazel at the University of Oxford, set out to investigate the predictive validity of tools commonly used to assess the risk of violence, sexual, and criminal behavior.
They analyzed risk assessments conducted on 24,827 people from 13 countries including the UK and the US. Of these, 5,879 (24%) offended over an average of 50 months.
Differences in study quality were taken into account to identify and minimize bias.
Their results show that risk assessment tools produce high rates of false positives (individuals wrongly identified as being at high risk of repeat offending) and predictive accuracy at around chance levels when identifying risky persons. For example, 41% of individuals judged to be at moderate or high risk by violence risk assessment tools went on to violently offend, while 23% of those judged to be at moderate or high risk by sexual risk assessment tools went on to sexually offend.
Of those judged to be at moderate or high risk of committing any offense, just over half (52%) did. However, of those predicted not to violently offend, 91% did not, suggesting that these tools are more effective at screening out individuals at low risk of future offending.
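For readers wanting to see where figures like the 41% and 91% come from, they are simply positive and negative predictive values read off a two-by-two table of predictions against outcomes. The counts below are made up to match the percentages and are not the study's data.

    # Illustrative counts only; not the study's actual numbers.
    high_risk_total = 1000        # people a tool judged moderate or high risk
    high_risk_offended = 410      # of those, how many went on to offend

    low_risk_total = 1000         # people judged low risk
    low_risk_offended = 90        # of those, how many offended anyway

    # Positive predictive value: of those flagged as risky, the share who offended.
    ppv = high_risk_offended / high_risk_total                    # 0.41 -> 41%

    # Negative predictive value: of those judged low risk, the share who did not offend.
    npv = (low_risk_total - low_risk_offended) / low_risk_total   # 0.91 -> 91%

    print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")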
Factors such as gender, ethnicity, age or type of tool used did not appear to be associated with differences in predictive accuracy.
Although risk assessment tools are widely used in clinical and criminal justice settings, their predictive accuracy varies depending on how they are used, say the authors.
“Our review would suggest that risk assessment tools, in their current form, can only be used to roughly classify individuals at the group level, not to safely determine criminal prognosis in an individual case,” they conclude. The extent to which these instruments improve clinical outcomes and reduce repeat offending needs further research, they add.
What is a carbon price and why do we need one? (The Guardian)
This Q&A is part of the Guardian’s Ultimate climate change FAQ
A carbon price is a cost applied to carbon pollution to encourage polluters to reduce the amount of greenhouse gas they emit into the atmosphere. Economists widely agree that introducing a carbon price is the single most effective way for countries to reduce their emissions.
Climate change is considered a market failure by economists, because it imposes huge costs and risks on future generations who will suffer the consequences of climate change, without these costs and risks normally being reflected in market prices. To overcome this market failure, they argue, we need to internalise the costs of future environmental damage by putting a price on the thing that causes it – namely carbon emissions.
A carbon price not only has the effect of encouraging lower-carbon behaviour (eg using a bike rather than driving a car), but also raises money that can be used in part to finance a clean-up of “dirty” activities (eg investment in research into fuel cells to help cars pollute less). With a carbon price in place, the costs of stopping climate change are distributed across generations rather than being borne overwhelmingly by future generations.
There are two main ways to establish a carbon price. First, a government can levy a carbon tax on the distribution, sale or use of fossil fuels, based on their carbon content. This has the effect of increasing the cost of those fuels and the goods or services created with them, encouraging business and people to switch to greener production and consumption. Typically the government will decide how to use the revenue, though in one version, the so-called fee-and-dividend model, the tax revenues are distributed in their entirety directly back to the population.
The second approach is a quota system called cap-and-trade. In this model, the total allowable emissions in a country or region are set in advance (“capped”). Permits to pollute are created for the allowable emissions budget and either allocated or auctioned to companies. The companies can trade permits between one another, introducing a market for pollution that should ensure that the carbon savings are made as cheaply as possible.
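As a rough numerical illustration of the two mechanisms just described, the sketch below compares a polluter's bill under a straight carbon tax with its bill under cap-and-trade, where permits above a free allocation must be bought at the going market price. All figures are invented for the example and do not describe any real scheme.

    # Toy comparison of the two carbon-pricing mechanisms described above.
    # All numbers are invented for illustration.

    def carbon_tax_cost(emissions_tonnes, tax_per_tonne):
        """Cost to a polluter under a straight carbon tax."""
        return emissions_tonnes * tax_per_tonne

    def cap_and_trade_cost(emissions_tonnes, free_allocation_tonnes, permit_price):
        """Cost under cap-and-trade: buy permits for emissions above the free
        allocation; a negative result means surplus permits can be sold."""
        return (emissions_tonnes - free_allocation_tonnes) * permit_price

    # A hypothetical firm emitting 100,000 tonnes of CO2 a year:
    print(carbon_tax_cost(100_000, 16.0))              # tax of 16 per tonne
    print(cap_and_trade_cost(100_000, 80_000, 16.0))   # buys 20,000 permits at 16 each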
To serve its purpose, the carbon price set by a tax or cap-and-trade scheme must be sufficiently high to encourage polluters to change behaviour and reduce pollution in accordance with national targets. For example, the UK has a target to reduce carbon emissions by 80% by 2050, compared with 1990 levels, with various intermediate targets along the way. The government’s independent advisers, the Committee on Climate Change, estimates that a carbon price of £30 per tonne of carbon dioxide in 2020 and £70 in 2030 would be required to meet these goals.
Currently, many large UK companies pay a price for the carbon they emit through the EU’s emissions trading scheme. However, the price of carbon through the scheme is considered by many economists to be too low to help the UK to meet its targets, so the Treasury plans to make all companies covered by the scheme pay a minimum of £16 per tonne of carbon emitted from April 2013.
Ideally, there should be a uniform carbon price across the world, reflecting the fact that a tonne of carbon dioxide does the same amount of damage over time wherever it is emitted. Uniform pricing would also remove the risk that polluting businesses flee to so-called “pollution havens” – countries where a lack of environmental regulation enables them to continue to pollute unrestrained. At the moment, carbon pricing is far from uniform but a growing number of countries and regions have, or plan to have, carbon pricing schemes in place, whether through cap-and-trade or carbon taxes. These include the European Union, Australia, South Korea, South Africa, parts of China and California.
• This article was written by Alex Bowen of the Grantham Research Institute on Climate Change and the Environment at LSE in collaboration with the Guardian
A Century Of Weather Control (POP SCI)
Posted 7.19.12 at 6:20 pm – http://www.popsci.com

Keeping Pilots Updated, November 1930
It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spends annually for all of its work.”
About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype”–an early fax machine, basically, that let a typed message be reproduced. Pilots were then radioed with the information.
From the article “Weather Man Makes the Air Safe.”

Battling Hail, July 1947
We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)
The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.
From the article “The War Against Hail.”

Hunting for a Tornado “Cure,” March 1958
1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”
To try to stop them, researchers wanted to learn more. Meteorologists asked Congress for $5 million more a year to study tornadoes whirling through the Midwest’s Tornado Alley and then, hopefully, learn what they needed to do to stop them.
From the article “What We’re Learning About Tornadoes.”

Spotting Clouds With Nimbus, November 1963
Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was spotted from orbit two days before anything else detected it, leaving space engineers “justifiably proud.” The next satellite in line was Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.
Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.
From the article “The Weather Eye That Never Blinks.”

Saving Money Globally With Forecasts, November 1970
Optimism for weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters predicted–how they “saved countless lives through early hurricane warnings”–and now even saying they’d save your vacation.
What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.
From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”

Extreme Weather Alerts on the Radio, July 1979
Those weather alerts that come on your television during a storm–or at least one radio version of those–were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But at this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.
From the article “Weather-Alert Radios–They Could Save Your Life.”

Stopping “Bolts From the Blue,” May 1990
Here Popular Science let loose a whopper for anyone with a fear of extreme weather: lightning kills a lot more people every year than you think, and sometimes a lightning bolt will come and hit you even when there’s not a storm. So-called “bolts from the blue” were a part of the story on better predicting lightning, a phenomenon more manic than most types of weather. Improved sensors played a major part in better preparing people before a storm.
From the article “Predicting Deadly Lightning.”

Infrared Views of Weather, August 1983
Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”
From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”

Modernizing the National Weather Service, August 1997
A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.
From the article “Weather’s New Outlook.”

Modeling Weather With Computers, September 2001
Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.
From the article “Better Weather Through Computers.”
Researchers Produce First Complete Computer Model of an Organism (Science Daily)
ScienceDaily (July 21, 2012) — In a breakthrough effort for computational biology, the world’s first complete computer model of an organism has been completed, Stanford researchers reported last week in the journal Cell.
A team led by Markus Covert, assistant professor of bioengineering, used data from more than 900 scientific papers to account for every molecular interaction that takes place in the life cycle of Mycoplasma genitalium, the world’s smallest free-living bacterium.
By encompassing the entirety of an organism in silico, the paper fulfills a longstanding goal for the field. Not only does the model allow researchers to address questions that aren’t practical to examine otherwise, it represents a stepping-stone toward the use of computer-aided design in bioengineering and medicine.
“This achievement demonstrates a transforming approach to answering questions about fundamental biological processes,” said James M. Anderson, director of the National Institutes of Health Division of Program Coordination, Planning and Strategic Initiatives. “Comprehensive computer models of entire cells have the potential to advance our understanding of cellular function and, ultimately, to inform new approaches for the diagnosis and treatment of disease.”
The research was partially funded by an NIH Director’s Pioneer Award from the National Institutes of Health Common Fund.
From information to understanding
Biology over the past two decades has been marked by the rise of high-throughput studies producing enormous troves of cellular information. A lack of experimental data is no longer the primary limiting factor for researchers. Instead, it’s how to make sense of what they already know.
Most biological experiments, however, still take a reductionist approach to this vast array of data: knocking out a single gene and seeing what happens.
“Many of the issues we’re interested in aren’t single-gene problems,” said Covert. “They’re the complex result of hundreds or thousands of genes interacting.”
This situation has resulted in a yawning gap between information and understanding that can only be addressed by “bringing all of that data into one place and seeing how it fits together,” according to Stanford bioengineering graduate student and co-first author Jayodita Sanghvi.
Integrative computational models clarify data sets whose sheer size would otherwise place them outside human ken.
“You don’t really understand how something works until you can reproduce it yourself,” Sanghvi said.
Small is beautiful
Mycoplasma genitalium is a humble parasitic bacterium known mainly for showing up uninvited in human urogenital and respiratory tracts. But the pathogen also has the distinction of containing the smallest genome of any free-living organism — only 525 genes, as opposed to the 4,288 of E. coli, a more traditional laboratory bacterium.
Despite the difficulty of working with this sexually transmitted parasite, the minimalism of its genome has made it the focus of several recent bioengineering efforts. Notably, these include the J. Craig Venter Institute’s 2008 synthesis of the first artificial chromosome.
“The goal hasn’t only been to understand M. genitalium better,” said co-first author and Stanford biophysics graduate student Jonathan Karr. “It’s to understand biology generally.”
Even at this small scale, the quantity of data that the Stanford researchers incorporated into the virtual cell’s code was enormous. The final model made use of more than 1,900 experimentally determined parameters.
To integrate these disparate data points into a unified machine, the researchers modeled individual biological processes as 28 separate “modules,” each governed by its own algorithm. These modules then communicated to each other after every time step, making for a unified whole that closely matched M. genitalium’s real-world behavior.
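The modular architecture described in the previous paragraph, in which separate process models advance in turn and exchange a shared cell state at every time step, can be sketched roughly as follows. This is a schematic of the design idea only; the module names, state variables and rates are invented and bear no relation to the published model's code.

    # Schematic sketch of a module-based whole-cell simulation loop.
    # Module names, state variables and rates are invented for illustration.

    class Module:
        """One biological process, advanced independently at each time step."""
        def step(self, state, dt):
            raise NotImplementedError

    class Metabolism(Module):
        def step(self, state, dt):
            state["atp"] += 5.0 * dt                 # produce energy

    class Transcription(Module):
        def step(self, state, dt):
            made = min(state["atp"], 1.0 * dt)       # limited by available energy
            state["atp"] -= made
            state["rna"] += made

    class Replication(Module):
        def step(self, state, dt):
            if state["atp"] > 0:
                state["dna"] += 0.1 * dt

    def simulate(modules, state, dt=1.0, steps=100):
        """Advance every module once per time step; the shared state dictionary
        is how the modules 'communicate' with one another."""
        for _ in range(steps):
            for module in modules:
                module.step(state, dt)
        return state

    print(simulate([Metabolism(), Transcription(), Replication()],
                   {"atp": 0.0, "rna": 0.0, "dna": 0.0}))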
Probing the silicon cell
The purely computational cell opens up procedures that would be difficult to perform in an actual organism, as well as opportunities to reexamine experimental data.
In the paper, the model is used to demonstrate a number of these approaches, including detailed investigations of DNA-binding protein dynamics and the identification of new gene functions.
The program also allowed the researchers to address aspects of cell behavior that emerge from vast numbers of interacting factors.
The researchers had noticed, for instance, that the length of individual stages in the cell cycle varied from cell to cell, while the length of the overall cycle was much more consistent. Consulting the model, the researchers hypothesized that the overall cell cycle’s lack of variation was the result of a built-in negative feedback mechanism.
Cells that took longer to begin DNA replication had time to amass a large pool of free nucleotides. The actual replication step, which uses these nucleotides to form new DNA strands, then passed relatively quickly. Cells that went through the initial step quicker, on the other hand, had no nucleotide surplus. Replication ended up slowing to the rate of nucleotide production.
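A toy calculation makes the hypothesized feedback concrete: cells that start replication late have accumulated a large nucleotide pool and replicate quickly, while cells that start early are throttled by ongoing nucleotide production, so total cycle lengths vary much less than initiation times do. The rates and quantities below are invented for illustration and are not taken from the published model.

    # Toy illustration of the negative-feedback hypothesis described above.
    # All rates and quantities are invented; this is not the published model.
    import random
    import statistics

    PRODUCTION_RATE = 10.0   # nucleotides produced per unit time
    MAX_REP_RATE = 50.0      # max nucleotides the replication machinery uses per unit time
    NUC_NEEDED = 1000.0      # nucleotides required to copy the genome

    def cycle_length(initiation_time):
        pool = PRODUCTION_RATE * initiation_time
        # Replication is limited either by the machinery itself (large pool)
        # or by ongoing nucleotide production (small pool).
        replication_time = max(NUC_NEEDED / MAX_REP_RATE,
                               (NUC_NEEDED - pool) / PRODUCTION_RATE)
        return initiation_time + replication_time

    initiations = [random.uniform(20, 120) for _ in range(10_000)]
    totals = [cycle_length(t) for t in initiations]

    # Initiation times vary widely; total cycle lengths vary far less,
    # which is the signature of the built-in negative feedback.
    print(statistics.stdev(initiations), statistics.stdev(totals))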
These kinds of findings remain hypotheses until they’re confirmed by real-world experiments, but they promise to accelerate the process of scientific inquiry.
“If you use a model to guide your experiments, you’re going to discover things faster. We’ve shown that time and time again,” said Covert.
Bio-CAD
Much of the model’s future promise lies in more applied fields.
CAD — computer-aided design — has revolutionized fields from aeronautics to civil engineering by drastically reducing the trial-and-error involved in design. But our incomplete understanding of even the simplest biological systems has meant that CAD hasn’t yet found a place in bioengineering.
Computational models like that of M. genitalium could bring rational design to biology — allowing not only for computer-guided experimental regimes, but also for the wholesale creation of new microorganisms.
Once similar models have been devised for more experimentally tractable organisms, Karr envisions bacteria or yeast specifically designed to mass-produce pharmaceuticals.
Bio-CAD could also lead to enticing medical advances — especially in the field of personalized medicine. But these applications are a long way off, the researchers said.
“This is potentially the new Human Genome Project,” Karr said. “It’s going to take a really large community effort to get close to a human model.”
Stanford’s Department of Bioengineering is jointly operated by the School of Engineering and the School of Medicine.
Anarchists attack science (Nature)
Armed extremists are targeting nuclear and nanotechnology workers.
Leigh Phillips
28 May 2012
Investigations of the shooting of nuclear-engineering head Roberto Adinolfi have confirmed the involvement of an eco-anarchist group. P. RATTINI/AFP/GETTY
A loose coalition of eco-anarchist groups is increasingly launching violent attacks on scientists.
A group calling itself the Olga Cell of the Informal Anarchist Federation International Revolutionary Front has claimed responsibility for the non-fatal shooting of a nuclear-engineering executive on 7 May in Genoa, Italy. The same group sent a letter bomb to a Swiss pro-nuclear lobby group in 2011; attempted to bomb IBM’s nanotechnology laboratory in Switzerland in 2010; and has ties with a group responsible for at least four bomb attacks on nanotechnology facilities in Mexico. Security authorities say that such eco-anarchist groups are forging stronger links.
On 11 May, the cell sent a four-page letter to the Italian newspaper Corriere della Sera claiming responsibility for the shooting of Roberto Adinolfi, the chief executive of Ansaldo Nucleare, the nuclear-engineering subsidiary of aerospace and defence giant Finmeccanica. Believed by authorities to be genuine, the letter is riddled with anti-science rhetoric. The group targeted Adinolfi because he is a “sorcerer of the atom”, it wrote. “Adinolfi knows well that it is only a matter of time before a European Fukushima kills on our continent.”
“Science in centuries past promised us a golden age, but it is pushing us towards self-destruction and total slavery,” the letter continues. “With this action of ours, we return to you a tiny part of the suffering that you, man of science, are pouring into this world.” The group also threatened to carry out further attacks.
The Italian Ministry of the Interior has subsequently beefed up security at thousands of potential political, industrial and scientific targets. The measures include assigning bodyguards to 550 individuals.
The Olga Cell, named after an imprisoned Greek anarchist, is part of the Informal Anarchist Federation, which, in April 2011, claimed responsibility for sending a parcel bomb that exploded at the offices of the Swiss nuclear lobby group, Swissnuclear, in Olten. A letter found in the remains of the bomb demanded the release of three individuals who had been detained for plotting an attack on IBM’s flagship nanotechnology facility in Zurich earlier that year. In a situation report published this month, the Swiss Federal Intelligence Service explicitly linked the federation to the IBM attack.
The Informal Anarchist Federation argues that technology, and indeed civilization, is responsible for the world’s ills, and that scientists are the handmaidens of capitalism. “Finmeccanica means bio- and nanotechnology. Finmeccanica means death and suffering, new frontiers of Italian capitalism,” the letter reads.
Gathering momentum
The cell says that it is uniting with eco-anarchist groups in other countries, including Mexico, Chile, Greece and the United Kingdom. Mexico has already seen similar attacks: in August 2011, a group called Individuals Tending Towards Savagery sent a parcel bomb that wounded two nanotechnology researchers at the Monterrey Institute of Technology. One received burns to his legs and a perforated eardrum and the other had his lung pierced by shrapnel (G. Herrera Corral Nature 476, 373; 2011). The package contained enough explosive to collapse part of the building, according to police, but failed to detonate properly.
Earlier that year, the same group sent two bombs to the nanotechnology facility at the Polytechnic University of the Valley of Mexico. One was intercepted before anyone could be harmed, but the second detonated, injuring a security guard. It is not clear how closely the group is tied to the Informal Anarchist Federation, but in online forums the two bodies offer “direct support” for each other’s activities and talk of a “blossoming” of a more organized eco-anarchist movement.
In the wake of the Mexican bombings, the Monterrey Institute installed metal detectors, began to use police sniffer dogs and started random inspections of vehicles and packages. After a letter bomb addressed to a nanotechnology researcher at the Polytechnic University of Pachuca in Hidalgo exploded in December last year, the institute installed a perimeter fence and scanners, and campuses across the state heightened security measures.
Italian police investigating the shooting say that they are concerned about the rise in violent action by anarchist groups amid Europe’s economic crisis. On 23 May, for example, members of the Informal Anarchist Federation attacked railway signals in Bristol, UK, causing severe transport delays. An online message from the group said that the targets had been chosen to disrupt employees of the Ministry of Defence and defence-technology businesses in the area, including Raytheon and QinetiQ.
The Swiss report also noted signs of “an increasing degree of international networking between perpetrators”. The level of risk to scientists depends on their field of work, says Simon Johner, a spokesman for the Swiss Federal Intelligence Service. “We are not able to tell them what to do. We can only make them aware of the dangers. It’s up to institutions to take preventative actions.” The agency is working with police forces, businesses and research communities to assess and tackle the threat.
“These people do not represent mainstream opinion. But I am still pretty frightened by this violence,” says Michael Hagmann, a biochemist and head of corporate communications for the Swiss Federal Laboratories for Materials Science and Technology near Zurich, a public-sector partner of the IBM facility that also does nanotechnology research.
“Just a few weeks after the attempted bombing, we were due to have a large conference on nanotechnology and we were really quite nervous” about going ahead with it, Hagmann says. “But we concluded that the public discussion was more important and didn’t want to scare people by having 20 police guarding us. It would have sent the wrong message.”
Nature 485, 561 (31 May 2012) doi:10.1038/485561a
* * *
Published online 22 August 2011 | Nature 476, 373 (2011) | doi:10.1038/476373a
Column: World View
Stand up against the anti-technology terrorists

Home-made bombs are being sent to physicists in Mexico. Colleagues around the world should ensure their own security, urges Gerardo Herrera Corral.
Gerardo Herrera Corral
My elder brother, Armando Herrera Corral, was this month sent a tube of dynamite by terrorists who oppose his scientific research. The home-made bomb, which was in a shoe-box-sized package labelled as an award for his personal attention, exploded when he pulled at the adhesive tape wrapped around it. My brother, director of the technology park at the Monterrey Institute of Technology in Mexico, was standing at the time, and suffered burns to his legs and a perforated eardrum. More severely injured by the blast was his friend and colleague Alejandro Aceves López, whom my brother had gone to see in his office to share a cup of coffee and open the award. Aceves López was sitting down when my brother opened the package; he took the brunt of the explosion in his chest, and shrapnel pierced one of his lungs.
Both scientists are now recovering from their injuries, but they were extremely fortunate to survive. The bomb failed to go off properly, and only a fraction of the 20-centimetre-long cylinder of dynamite ignited. The police estimate that the package contained enough explosive to take down part of the building, had it worked as intended.
The next day, I, too, was sent a suspicious package. I have been advised by the police not to offer details of why the package was judged of concern, but it arrived by an unusual procedure, and on a Sunday. It tested positive for explosives, and was taken away by the bomb squad, which declared a false alarm after finding that the parcel contained only books. My first reaction was to leave the country. Now, I am confused as to how I should respond.
As an academic scientist, why was my brother singled out in this way? He does not work in a field that is usually considered high-risk for terrorist activity, such as medical research on animals. He works on computer science, and Aceves López is an expert in robotics. I am a high-energy physicist and coordinate the Mexican contribution to research using the Large Hadron Collider at CERN, Europe’s particle-physics laboratory; I have worked in the field for 15 years.
An extremist anarchist group known as Individuals Tending to Savagery (ITS) has claimed responsibility for the attack on my brother. This is confirmed by a partially burned note found by the authorities at the bomb site, signed by the ITS and with a message along the lines of: “If this does not get to the newspapers we will produce more explosions. Wounding or killing teachers and students does not matter to us.”
In statements posted on the Internet, the ITS expresses particular hostility towards nanotechnology and computer scientists. It claims that nanotechnology will lead to the downfall of mankind, and predicts that the world will become dominated by self-aware artificial-intelligence technology. Scientists who work to advance such technology, it says, are seeking to advance control over people by ‘the system’. The group praises Theodore Kaczynski, the Unabomber, whose anti-technology crusade in the United States in 1978–95 killed three people and injured many others.
The group’s rhetoric is absurd, but I urge colleagues around the world to take the threat that it poses to researchers seriously. Information gathered by Mexican federal authorities and Interpol link it to actions in countries including Spain, France and Chile. In April this year, the ITS sent a bomb — similar to the one posted to my brother — to the head of the Nanotechnology Engineering Division at the Polytechnic University of Mexico Valley in Tultitlan, although that device did not explode. In May, the university received a second parcel bomb, with a message reading: “This is not a joke: last month we targeted Oscar Camacho, today the institution, tomorrow who knows? Open fire on nanotechnology and those who support it!”
“I believe that terror should not succeed in establishing fear and imposing conduct.”
The scientific community must be made aware of such organizations, and of their capacity for destruction. Nanotechnology-research institutes and departments, companies and professional associations must beef up their security procedures, particularly on how they receive and accept parcels and letters.
I would like to stand up and speak in this way because I believe that terror should not succeed in establishing fear and imposing conduct that takes us far from the freedom we enjoy. I would like the police to take these events seriously; they are becoming a real threat to society. I would also like to express my solidarity with the Monterrey Institute of Technology — the institution that gave me both financial support to pursue my undergraduate studies and high-level academic training.
To oppose technology is not an unacceptable way to think. We may well debate the desirability of further technical development in our society. Yet radical groups such as the ITS overlook a crucial detail: it is not technology that is the problem, but how we use it. After Alfred Nobel invented dynamite he became a rich man, because it found use in mining, quarrying, construction and demolition. But people can also decide to put dynamite into a parcel and address it to somebody with the intention of killing them.
Gerardo Herrera Corral is a physicist at the Research and Advanced Studies Centre of the National Polytechnic Institute of Mexico in Mexico City.
Climate Change Strikes Especially Hard Blow to Native Americans (PBS)
CLIMATE CHANGE — July 19, 2012 at 3:42 PM EDT
BY: SASKIA DE MELKER AND REBECCA JACOBSON
On Thursday’s NewsHour, NewsHour correspondent Hari Sreenivasan moderated a panel discussion on how Native American tribes are coping with climate change.
The panel included four native leaders representing their communities at the First Stewards symposium:
- Jeff Mears – Oneida tribe, Wisconsin, Environmental Area Manager
- Micah McCarty – Makah tribe, Washington, Chairman
- Mike Williams – Akiak tribe, Alaska, Vice Chairman
- Kitty Simonds – Western Pacific Regional Fishery Management Council and native Hawaiian
When we began our NewsHour coverage on communities across the United States coping with climate change, we didn’t plan to focus on Native American tribes. But we soon realized that indigenous communities are on the frontlines of America’s climate-related dangers.
Native Americans make up about one percent of the United States population, but they manage more than 95 million acres of land. Their reservations lie in some of the most diverse ecosystems in the country, ranging from Alaska to the coasts of Florida. That diversity – both geographic and cultural – makes them a sort of demographic microcosm of the United States. That means the climate shifts they are feeling now could give clues to what other Americans might see in the near future.
Recent studies, including those from the National Wildlife Federation, the EPA, and the USDA, highlight the disproportionate vulnerability of tribes to climate-related hazards such as coastal erosion, rising temperatures and extreme weather. Tribes depend on the land and natural resources for their culture and livelihood. What’s more, reservations often have high rates of poverty, unemployment and a lack of resources that would allow them to adapt to long-term climate changes.
We’ve reported on how rising seas threaten tribal land along the Louisiana coast. We’ve looked at the impact of a depleted salmon population on Northwest tribes. And we recently visited Washington state’s Quileute tribe, which has fought to reclaim land threatened by floods and sea level rise.
Relocating to adapt to environmental threats or disasters is not always a viable option for tribes, both because of their connection to their origins and because they may lack the resources needed to move, said Larry Wasserman, environmental policy manager for the Swinomish tribe in the Pacific Northwest.
“Rather than being a mobile society that can move away from climatic changes, they need to think about how do they stay on this piece of ground and continue to live the lifestyle that they’ve been able to live, and how can their great-great-great-grandchildren do that,” Wasserman said.
Tony Foster, chairman of the Quileute Nation, said that native people are in tune with the climate of their homelands and know early on when the balance of the ecosystem has been disrupted. “The Quileute has been here for over 10,000 years,” he said. “We know the layout of the land, and we know the conditions of our environment.”
“Traditional values teach us to be good ancestors,” added Micah McCarty, chairman of the Makah Tribe in Neah Bay, Washington. “Future generations are going to look back at us and say, ‘What did you do about this?'”
That forward thinking is necessary for planning for climate change, which is defined over at least a 30-year range and is often modeled on time scales extending hundreds of years into the future.
And Jeff Mears, member and environmental area manager for the Oneida tribe in Wisconsin, said it’s important that the tribes are defined by more than their past.
Because many tribes have a unique status as sovereign nations, they can also implement their own initiatives and models for managing their environment. The Swinomish tribe, for example, has developed its own climate adaptation plan.
Tribal governments also want more say at the federal level when it comes to addressing climate change.
There needs to be more “recognition from western science of the value of traditional ecological knowledge,” McCarty said. “So we need to look at how we can better inform the government of what tribal leaders bring to the table in regard to responding to climate change.”
And that’s the aim of a gathering to be held at the Smithsonian’s National Museum of the American Indian in Washington D.C. this week. The First Stewards symposium will bring together hundreds of indigenous tribal elders, leaders, and scientists from across America to discuss how best to confront past, present, and future adaptation to climate change.
See all of our coverage of how Native American communities are coping with climate change:
Native Lands Wash Away as Sea Levels Rise
Native Americans’ tribal lands along the Louisiana coast are washing away as sea levels rise and marshes sink. We report from Isle de Jean Charles, a community that is slowly disappearing into the sea.
The Northwest’s Salmon People Face a Salmon-less Future
For Northwest tribes, fishing for salmon is more than a food source, it’s a way of life. Now the climate may push the fish towards extinction. Together with KCTS 9 and EarthFix, NewsHour recently visited the Swinomish Indian reservation to see how they are coping.
Climate Change Threatens the ‘Twilight’ Tribe
Washington’s Quileute tribe, thrust into the spotlight by the “Twilight” series, has been caught in a struggle to reclaim land threatened by floods and sea level rise. Together with KCTS9 and EarthFix, NewsHour visited the tribe to hear their story.
IMF’s Peter Doyle scorns its ‘tainted’ leadership (BBC)
20 July 2012 Last updated at 11:50 GMT
Peter Doyle claims there was a “fundamental illegitimacy” in Christine Lagarde’s appointment
A top economist at the International Monetary Fund has poured scorn on its “tainted” leadership and said he is “ashamed” to have worked there.
Peter Doyle said in a letter to the IMF executive board that he wanted to explain his resignation after 20 years.
He writes of “incompetence”, “failings” and “disastrous” appointments for the IMF’s managing director, stretching back 10 years.
No one from the Washington-based IMF was immediately available for comment.
Mr Doyle, former adviser to the IMF’s European Department, which is running the bailout programs for Greece, Portugal and Ireland, said the Fund’s delay in warning about the urgency of the global financial crisis was a failure of the “first order”.
In the letter, dated 18 June and obtained by the US broadcaster CNN, Mr Doyle said the failings of IMF surveillance of the financial crisis “are, if anything, becoming more deeply entrenched”.
He writes: “This fact is most clear in regard to appointments for managing director which, over the past decade, have all-too-evidently been disastrous.
“Even the current incumbent [Christine Lagarde] is tainted, as neither her gender, integrity, or elan can make up for the fundamental illegitimacy of the selection process.”
Mr Doyle is thought to be echoing widespread criticism that the head of the IMF is always a European, while the World Bank chief is always a US appointee.
Mr Doyle concludes his letter: “There are good salty people here. But this one is moving on. You might want to take care not to lose the others.”
The IMF could not be reached immediately by the BBC. However, CNN reported that a Fund spokesman told it that there was nothing to substantiate Mr Doyle’s claims and that the IMF had held its own investigations into surveillance of the financial crisis.
Analysis
Andrew Walker, BBC World Service economics correspondent
Peter Doyle’s letter is short but the criticism is excoriating. Perhaps the bigger of the two main charges is that the IMF failed to warn enough about the problems that led to the global financial crisis.
The IMF has had investigations which have, up to a point, made similar criticisms, but not in such inflammatory terms. The IMF did issue some warnings, but the allegation that they were not sustained or timely enough and were actively suppressed raises some very big questions about the IMF’s role.
Then there is the description of the managing director as tainted. It’s not personal. It’s a familiar attack on a process which always selects a European. It’s still striking, though, to hear it from someone so recently on the inside.
Disorderly Conduct: Probing the Role of Disorder in Quantum Coherence (Science Daily)
ScienceDaily (July 19, 2012) — A new experiment conducted at the Joint Quantum Institute (JQI)* examines the relationship between quantum coherence, an important aspect of certain materials kept at low temperature, and the imperfections in those materials. These findings should be useful in forging a better understanding of disorder, and in turn in developing better quantum-based devices, such as superconducting magnets.
Most things in nature are imperfect at some level. Fortunately, imperfections — a departure, say, from an orderly array of atoms in a crystalline solid — are often advantageous. For example, copper wire, which carries so much of the world’s electricity, conducts much better if at least some impurity atoms are present.
In other words, a pinch of disorder is good. But there can be too much of this good thing. The issue of disorder is so important in condensed matter physics, and so difficult to understand directly, that some scientists have been trying for some years to simulate with thin vapors of cold atoms the behavior of electrons flowing through solids that are trillions of times more dense. With their ability to control the local forces over these atoms, physicists hope to shed light on the more complicated case of solids.
That’s where the JQI experiment comes in. Specifically, Steve Rolston and his colleagues have set up an optical lattice of rubidium atoms held at a temperature close to absolute zero. In such a lattice, the atoms are held in orderly proximity not by natural inter-atomic forces but by the forces exerted by an array of laser beams. These atoms, moreover, constitute a Bose-Einstein condensate (BEC), a special condition in which they all belong to a single quantum state.
This is appropriate since the atoms are meant to be a proxy for the electrons flowing through a solid superconductor. In some so-called high-temperature superconductors (HTSC), the electrons move in planes of copper and oxygen atoms. These HTSC materials work, however, only if a fillip of impurity atoms, such as barium or yttrium, is present. Theorists have not adequately explained why this bit of disorder in the underlying material should be necessary for attaining superconductivity.
The JQI experiment has tried to supply palpable data that can illuminate the issue of disorder. In solids, atoms are a fraction of a nanometer (billionth of a meter) apart. At JQI the atoms are about a micron (a millionth of a meter) apart. Actually, the JQI atom swarm consists of a 2-dimensional disk. “Disorder” in this disk consists not of impurity atoms but of “speckle.” When a laser beam strikes a rough surface, such as a cinderblock wall, it is scattered in a haphazard pattern. This visible speckle effect is what is used to slightly disorganize the otherwise perfect arrangement of Rb atoms in the JQI sample.
In superconductors, the slight disorder in the form of impurities ensures a very orderly “coherence” of the supercurrent. That is, the electrons moving through the solid flow as a single coordinated train of waves and retain their cohesiveness even in the midst of impurity atoms.
In the rubidium vapor, analogously, the slight disorder supplied by the speckle laser ensures that the Rb atoms retain their coordinated participation in the unified (BEC) quantum wave structure. But only up to a point. If too much disorder is added — if the speckle is too large — then the quantum coherence can go away. Probing this transition numerically was the object of the JQI experiment. The setup is illustrated in figure 1.
And how do you know when you’ve gone too far with the disorder? How do you know that quantum coherence has been lost? By making coherence visible.
The JQI scientists cleverly pry their disk-shaped gas of atoms into two parallel sheets, looking like two thin crepes, one on top of the other. Thereafter, if all the laser beams are turned off, the two planes collide like miniature galaxies. If the atoms are in a coherent condition, their collision results in a crisp interference pattern showing up on a video screen as a series of high-contrast dark and light stripes.
If, however, the imposed disorder is too high, resulting in a loss of coherence among the atoms, then the interference pattern is washed out. Figure 2 shows this effect at work. Frames b and c respectively show what happens when the degree of disorder is just right and when it is too much.
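A toy way to see how fringe contrast quantifies coherence (an illustrative simulation only, not the JQI analysis; it uses a global random phase averaged over repeated shots as a much-simplified stand-in for phase fluctuations within the cloud):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(-1, 1, 500)          # position across the interference pattern
k = 40.0                             # fringe wavenumber, arbitrary units

def fringe_visibility(phase_noise_std, n_shots=200):
    """Average a two-wave interference pattern over shots with random phase jitter."""
    pattern = np.zeros_like(x)
    for _ in range(n_shots):
        phi = rng.normal(0.0, phase_noise_std)
        pattern += 1.0 + np.cos(k * x + phi)    # intensity of two interfering waves
    pattern /= n_shots
    return (pattern.max() - pattern.min()) / (pattern.max() + pattern.min())

for sigma in (0.0, 0.5, 1.5, 3.0):
    print(f"phase noise sigma = {sigma:.1f} rad -> fringe visibility {fringe_visibility(sigma):.2f}")
```

With no phase noise the visibility is close to 1 (crisp stripes); as the noise grows, the averaged pattern washes out and the visibility drops towards 0, which is the qualitative signature the experiment looks for.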
“Disorder figures in about half of all condensed matter physics,” says Steve Rolston. “What we’re doing is mimicking the movement of electrons in 3-dimensional solids using cold atoms in a 2-dimensional gas. Since there don’t seem to be any theoretical predictions to help us understand what we’re seeing we’ve moved into new experimental territory.”
Where does the JQI work go next? Well, in figure 2a you can see that the interference pattern is still visible but somewhat garbled. That arises from the fact that for this amount of disorder several vortices — miniature whirlpools of atoms — have sprouted within the gas. Exactly such vortices among electrons emerge in superconductivity, limiting their ability to maintain a coherent state.
The new results are published in the New Journal of Physics: “Disorder-driven loss of phase coherence in a quasi-2D cold atom system,” by M C Beeler, M E W Reed, T Hong, and S L Rolston.
Another of the JQI scientists, Matthew Beeler, underscores the importance of understanding the transition from the coherent state to incoherent state owing to the fluctuations introduced by disorder: “This paper is the first direct observation of disorder causing these phase fluctuations. To the extent that our system of cold atoms is like a HTSC superconductor, this is a direct connection between disorder and a mechanism which drives the system from superconductor to insulator.”
Global CO2 Emissions Continued to Increase in 2011, With Per Capita Emissions in China Reaching European Levels (Science Daily)
ScienceDaily (July 19, 2012) — Global emissions of carbon dioxide (CO2) — the main cause of global warming — increased by 3% last year, reaching an all-time high of 34 billion tonnes in 2011. In China, the world’s most populous country, average emissions of CO2 increased by 9% to 7.2 tonnes per capita. China is now within the range of 6 to 19 tonnes per capita emissions of the major industrialised countries. In the European Union, CO2 emissions dropped by 3% to 7.5 tonnes per capita. The United States remains one of the largest emitters of CO2, with 17.3 tonnes per capita, despite a decline due to the recession in 2008-2009, high oil prices and an increased share of natural gas.
These are the main findings of the annual report ‘Trends in global CO2 emissions’, released July 19 by the European Commission’s Joint Research Centre (JRC) and the Netherlands Environmental Assessment Agency (PBL).
Based on recent results from the Emissions Database for Global Atmospheric Research (EDGAR) and latest statistics on energy use and relevant activities such as gas flaring and cement production, the report shows that global CO2 emissions continued to grow in 2011, despite reductions in OECD countries. Weak economic conditions, a mild winter, and energy savings stimulated by high oil prices led to a decrease of 3% in CO2 emissions in the European Union and of 2% in both the United States and Japan. Emissions from OECD countries now account for only one third of global CO2 emissions — the same share as that of China and India combined, where emissions increased by 9% and 6% respectively in 2011. Economic growth in China led to significant increases in fossil fuel consumption driven by construction and infrastructure expansion. The growth in cement and steel production caused China’s domestic coal consumption to increase by 9.7%.
The 3% increase in global CO2 emissions in 2011 is above the past decade’s average annual increase of 2.7%, with a decrease in 2008 and a surge of 5% in 2010. The top emitters contributing to the 34 billion tonnes of CO2 emitted globally in 2011 are: China (29%), the United States (16%), the European Union (11%), India (6%), the Russian Federation (5%) and Japan (4%).
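As a quick consistency check of the figures quoted above, the per-capita numbers can be recovered from the regional shares of the 34-billion-tonne total; the population values below are rough 2011 figures assumed purely for illustration, not taken from the report.

```python
# Rough cross-check of the quoted shares and per-capita figures (illustrative only).
total_gt = 34.0                                   # global CO2 in 2011, billion tonnes
shares = {"China": 0.29, "United States": 0.16, "European Union": 0.11,
          "India": 0.06, "Russian Federation": 0.05, "Japan": 0.04}
population_bn = {"China": 1.35, "United States": 0.31, "European Union": 0.50}  # assumed

for region, share in shares.items():
    emitted = total_gt * share
    line = f"{region:18s} {emitted:5.2f} Gt"
    if region in population_bn:
        line += f"  ~{emitted / population_bn[region]:.1f} t per capita"
    print(line)
```

The resulting per-capita values (roughly 7 t for China, 17-18 t for the United States and 7.5 t for the EU) are consistent with the figures cited above.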
Cumulative CO2 emissions call for action
An estimated cumulative global total of 420 billion tonnes of CO2 was emitted between 2000 and 2011 due to human activities, including deforestation. Scientific literature suggests that limiting the rise in average global temperature to 2°C above pre-industrial levels — the target internationally adopted in UN climate negotiations — is possible only if cumulative CO2 emissions in the period 2000-2050 do not exceed 1,000 to 1,500 billion tonnes. If the current global trend of increasing CO2 emissions continues, cumulative emissions will surpass this limit within the next two decades.
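A back-of-envelope reading of the “next two decades” statement, assuming (purely for illustration) that annual emissions keep growing at roughly the past decade's 3% per year from the 2011 level of 34 billion tonnes:

```python
# When would cumulative emissions since 2000 pass the 1,000-1,500 Gt range?
# (Illustrative only: constant 3%/yr growth from the 2011 level.)
emitted_2000_2011 = 420.0      # Gt CO2 already emitted, from the report
annual = 34.0                  # Gt CO2 emitted in 2011
growth = 0.03

for budget in (1000.0, 1500.0):
    cumulative, rate, year = emitted_2000_2011, annual, 2011
    while cumulative < budget:
        year += 1
        rate *= 1 + growth
        cumulative += rate
    print(f"{budget:.0f} Gt budget exceeded around {year}")
```

Under these assumptions the lower end of the range is passed around the mid-2020s and the upper end in the early-to-mid 2030s, in line with the report's warning.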
Fortunately, this trend is being mitigated by the expansion of renewable energy supplies, especially solar and wind energy and biofuels. The global share of these so-called modern renewables, which exclude hydropower, is growing at an accelerated speed and quadrupled from 1992 to 2011. This potentially represents about 0.8 billion tonnes of CO2 emissions avoided as a result of using renewable energy supplies in 2011, which is close to Germany’s total CO2 emissions in 2011.
“Trends in global CO2 emissions” report: http://edgar.jrc.ec.europa.eu/CO2REPORT2012.pdf
Society’s Response to Climate Change Is Critical (Science Daily)
ScienceDaily (July 18, 2012) — Lancaster University (UK) scientists have proposed a new way of considering society’s reactions to global warming by linking societal actions to temperature change.
Using this framework to analyse climate change policies aimed at avoiding dangerous climate change, they suggest that society will have to become fifty times more responsive to global temperature change than it has been since 1990.
The researchers, Dr Andy Jarvis, Dr David Leedal and Professor Nick Hewitt from the Lancaster Environment Centre, also show that if global energy use continues to grow as it has done historically, society would have to up its decarbonization efforts from its historic (160 year) value of 0.6% per year to 13% per year.
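One way to see why the required rate is so much larger than the historic one is the rough identity: emissions equal energy use times carbon intensity, so the emissions growth rate is approximately the energy growth rate minus the decarbonization rate. The sketch below uses an assumed energy growth rate of 2.4% per year purely for illustration; it is not the paper's model.

```python
# Illustrative only: emissions growth ~= energy growth - decarbonization rate.
energy_growth = 0.024            # assumed growth in global energy use, fraction per year
for decarb in (0.006, 0.13):     # historic 0.6%/yr vs. the 13%/yr quoted above
    emissions_growth = (1 + energy_growth) * (1 - decarb) - 1
    print(f"decarbonization {decarb:.1%}/yr -> emissions change {emissions_growth:+.1%}/yr")
```

With historic-style decarbonization, emissions keep rising by roughly 2% per year; only a decarbonization rate far above the energy growth rate turns that into a rapid decline.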
Dr Andy Jarvis said: “In order to avoid dangerous climate change, society will have to become much more responsive to the risks and damages that growth in global greenhouse gas emissions impose.”
The research, published in Nature Climate Change on 15 July has found that the global growth of new renewable sources of energy since 1990 constitutes a climate-society feedback of a quarter percent per year in the growth rate of CO2 emissions per degree temperature rise.
Professor Nick Hewitt said: “If left unmanaged, the climate damages that we experience will motivate society to act to a greater or lesser degree. This could either amplify the growth in greenhouse gas emissions as we repair these damages or dampen them through loss of economic performance. Both are unpredictable and potentially dangerous.”
Social Identification, Not Obedience, Might Motivate Unspeakable Acts (Science Daily)
ScienceDaily (July 18, 2012) — What makes soldiers abuse prisoners? How could Nazi officials condemn thousands of Jews to gas chamber deaths? What’s going on when underlings help cover up a financial swindle? For years, researchers have tried to identify the factors that drive people to commit cruel and brutal acts and perhaps no one has contributed more to this knowledge than psychological scientist Stanley Milgram.
Just over 50 years ago, Milgram embarked on what were to become some of the most famous studies in psychology. In these studies, which ostensibly examined the effects of punishment on learning, participants were assigned the role of “teacher” and were required to administer shocks to a “learner” that increased in intensity each time the learner gave an incorrect answer. As Milgram famously found, participants were willing to deliver supposedly lethal shocks to a stranger, just because they were asked to do so.
Researchers have offered many possible explanations for the participants’ behavior and the take-home conclusion that seems to have emerged is that people cannot help but obey the orders of those in authority, even when those orders go to the extremes.
This obedience explanation, however, fails to account for a very important aspect of the studies: why, and under what conditions, people did not obey the experimenter.
In a new article published in Perspectives on Psychological Science, a journal of the Association for Psychological Science, researchers Stephen Reicher of the University of St. Andrews and Alexander Haslam and Joanne Smith of the University of Exeter propose a new way of looking at Milgram’s findings.
The researchers hypothesized that, rather than obedience to authority, the participants’ behavior might be better explained by their patterns of social identification. They surmised that conditions that encouraged identification with the experimenter (and, by extension, the scientific community) led participants to follow the experimenters’ orders, while conditions that encouraged identification with the learner (and the general community) led participants to defy the experimenters’ orders.
As the researchers explain, this suggests that participants’ willingness to engage in destructive behavior is “a reflection not of simple obedience, but of active identification with the experimenter and his mission.”
Reicher, Haslam, and Smith wanted to examine whether participants’ willingness to administer shocks across variants of the Milgram paradigm could be predicted by the extent to which the variant emphasized identification with the experimenter and identification with the learner.
For their study, the researchers recruited two different groups of participants. The expert group included 32 academic social psychologists from two British universities and one Australian university. The nonexpert group included 96 first-year psychology students who had not yet learned about the Milgram studies.
All participants were read a short description of Milgram’s baseline study and they were then given details about 15 variants of the study. For each variant, they were asked to indicate the extent to which that variant would lead participants to identify with the experimenter and the scientific community and the extent to which it would lead them to identify with the learner and the general community.
The results of the study confirmed the researchers’ hypotheses. Identification with the experimenter was a very strong positive predictor of the level of obedience displayed in each variant. On the other hand, identification with the learner was a strong negative predictor of the level of obedience. The relative identification score (identification with experimenter minus identification with learner) was also a very strong predictor of the level of obedience.
According to the authors, these new findings suggest that we need to rethink obedience as the standard explanation for why people engage in cruel and brutal behavior. This new research “moves us away from a dominant viewpoint that has prevailed within and beyond the academic world for nearly half a century — a viewpoint suggesting that people engage in barbaric acts because they have little insight into what they are doing and conform slavishly to the will of authority,” they write.
These new findings suggest that social identification provides participants with a moral compass and motivates them to act as followers. This followership, as the authors point out, is not thoughtless — “it is the endeavor of committed subjects.”
Looking at the findings this way has several advantages, Reicher, Haslam, and Smith argue. First, it mirrors recent historical assessments suggesting that functionaries in brutalizing regimes — like the Nazi bureaucrat Adolf Eichmann — do much more than merely follow orders. And it simultaneously accounts for why participants are more likely to follow orders under certain conditions than others.
The researchers acknowledge that the methodology used in this research is somewhat unorthodox — the most direct way to examine the question of social identification would involve recreating the Milgram paradigm and varying different aspects of the paradigm to manipulate social identification with both experimenter and learner. But this kind of research involves considerable ethical challenges. The purpose of the article, the authors say, is to provide a strong theoretical case for such research, “so that work to address the critical question of why (and not just whether) people still prove willing to participate in brutalizing acts can move forward.”
* * *
Most People Will Administer Shocks When Prodded By ‘Authority Figure’
ScienceDaily (Dec. 22, 2008) — Nearly 50 years after one of the most controversial behavioral experiments in history, a social psychologist has found that people are still just as willing to administer what they believe are painful electric shocks to others when urged on by an authority figure.
Jerry M. Burger, PhD, replicated one of the famous obedience experiments of the late Stanley Milgram, PhD, and found that compliance rates in the replication were only slightly lower than those found by Milgram. And, like Milgram, he found no difference in the rates of obedience between men and women.
Burger’s findings are reported in the January issue of American Psychologist. The issue includes a special section reflecting on Milgram’s work 24 years after his death on Dec. 20, 1984, and analyzing Burger’s study.
“People learning about Milgram’s work often wonder whether results would be any different today,” said Burger, a professor at Santa Clara University. “Many point to the lessons of the Holocaust and argue that there is greater societal awareness of the dangers of blind obedience. But what I found is the same situational factors that affected obedience in Milgram’s experiments still operate today.”
Stanley Milgram was an assistant professor at Yale University in 1961 when he conducted the first in a series of experiments in which subjects – thinking they were testing the effect of punishment on learning – administered what they believed were increasingly powerful electric shocks to another person in a separate room. An authority figure conducting the experiment prodded the first person, who was assigned the role of “teacher” to continue shocking the other person, who was playing the role of “learner.” In reality, both the authority figure and the learner were in on the real intent of the experiment, and the imposing-looking shock generator machine was a fake.
Milgram found that, after hearing the learner’s first cries of pain at 150 volts, 82.5 percent of participants continued administering shocks; of those, 79 percent continued to the shock generator’s end, at 450 volts. In Burger’s replication, 70 percent of the participants had to be stopped as they continued past 150 volts – a difference that was not statistically significant.
“Nearly four out of five of Milgram’s participants who continued after 150 volts went all the way to the end of the shock generator,” Burger said. “Because of this pattern, knowing how participants react at the 150-volt juncture allows us to make a reasonable guess about what they would have done if we had continued with the complete procedure.”
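The “reasonable guess” Burger describes can be made concrete with simple arithmetic (a back-of-envelope projection, not a statistical result from either study):

```python
# Milgram: 82.5% continued past 150 V, and 79% of those went on to 450 V.
milgram_past_150 = 0.825
went_to_end_given_past_150 = 0.79
print(f"Milgram, implied share going all the way: "
      f"{milgram_past_150 * went_to_end_given_past_150:.0%}")          # ~65%

# Burger stopped everyone at 150 V; 70% were still complying at that point.
burger_past_150 = 0.70
print(f"Burger, projected share if the same 79% pattern held: "
      f"{burger_past_150 * went_to_end_given_past_150:.0%}")           # ~55%
```

This is exactly the kind of extrapolation Burger relies on when comparing his truncated procedure with Milgram's full one.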
Milgram’s techniques have been debated ever since his research was first published. As a result, there is now an ethics code for psychologists and other controls have been placed on experimental research that have effectively prevented any precise replications of Milgram’s work. “No study using procedures similar to Milgram’s has been published in more than three decades,” according to Burger.
Burger implemented a number of safeguards that enabled him to win approval for the work from his university’s institutional review board. First, he determined that while Milgram allowed his subjects to administer “shocks” of up to 450 volts in 15-volt increments, 150 volts appeared to be the critical point where nearly every participant paused and indicated reluctance to continue. Thus, 150 volts was the top range in Burger’s study.
In addition, Burger screened out any potential subjects who had taken more than two psychology courses in college or who indicated familiarity with Milgram’s research. A clinical psychologist also interviewed potential subjects and eliminated anyone who might have a negative reaction to the study procedure.
In Burger’s study, participants were told at least three times that they could withdraw from the study at any time and still receive the $50 payment. Also, these participants were given a lower-voltage sample shock to show the generator was real – 15 volts, as compared to 45 volts administered by Milgram.
Several of the psychologists writing in the same issue of American Psychologist questioned whether Burger’s study is truly comparable to Milgram’s, although they acknowledge its usefulness.
“…there are simply too many differences between this study and the earlier obedience research to permit conceptually precise and useful comparisons,” wrote Arthur G. Miller, PhD, of Miami University in Oxford, Ohio.
“Though direct comparisons of absolute levels of obedience cannot be made between the 150-volt maximum of Burger’s research design and Milgram’s 450-volt maximum, Burger’s ‘obedience lite’ procedures can be used to explore further some of the situational variables studied by Milgram, as well as look at additional variables,” wrote Alan C. Elms, PhD, of the University of California, Davis. Elms assisted Milgram in the summer of 1961.
Dummies guide to the latest “Hockey Stick” controversy (Real Climate)
http://www.realclimate.org
by Gavin Schmidt and Caspar Amman
Due to popular demand, we have put together a ‘dummies guide’ which tries to describe what the actual issues are in the latest controversy, in language even our parents might understand. A pdf version is also available. More technical descriptions of the issues can be seen here and here.
This guide is in two parts: the first deals with the background to the technical issues raised by McIntyre and McKitrick (2005) (MM05), while the second part discusses the application of this to the original Mann, Bradley and Hughes (1998) (MBH98) reconstruction. The wider climate science context is discussed here, and the relationship to other recent reconstructions (the ‘Hockey Team’) can be seen here.
NB. All the data that were used in MBH98 are freely available for download at ftp://holocene.evsc.virginia.edu/pub/sdr/temp/nature/MANNETAL98/ (and also as supplementary data at Nature) along with a thorough description of the algorithm.
Part I: Technical issues:
1) What is principal component analysis (PCA)?
This is a mathematical technique that is used (among other things) to summarize the data found in a large number of noisy records so that the essential aspects can be seen more easily. The most common patterns in the data are captured in a number of ‘principal components’ which describe some percentage of the variation in the original records. Usually only a limited number of components (‘PCs’) have any statistical significance, and these can be used instead of the larger data set to give basically the same description.
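As an illustration of the idea only (a minimal sketch with synthetic data; nothing here is the MBH98 code or data), many noisy records that share a single underlying pattern can be summarized by a handful of PCs, with most of the shared variation landing in the first one:

```python
import numpy as np

rng = np.random.default_rng(0)

# 70 synthetic "proxy" records, 200 time steps each, sharing one common signal
t = np.linspace(0, 1, 200)
common = np.sin(2 * np.pi * t)                      # the shared pattern
records = common + 0.8 * rng.standard_normal((70, 200))

# Center each record over its whole length, then do PCA via the SVD
X = records - records.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

explained = s**2 / np.sum(s**2)                     # fraction of variance per PC
print("variance explained by the first 3 PCs:", np.round(explained[:3], 3))
print("PC1 time series, first 5 values:", np.round(Vt[0, :5], 3))
```

Here the first PC closely tracks the common signal, while the remaining PCs mostly describe noise, which is why a few PCs can stand in for the full set of records.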
2) What do these individual components represent?
Often the first few components represent something recognisable and physically meaningful (at least in climate data applications). If a large part of the data set has a trend, then the mean trend may show up as one of the most important PCs. Similarly, if there is a seasonal cycle in the data, that will generally be represented by a PC. However, remember that PCs are just mathematical constructs. By themselves they say nothing about the physics of the situation. Thus, in many circumstances, physically meaningful timeseries are ‘distributed’ over a number of PCs, each of which individually does not appear to mean much. Different methodologies or conventions can make a big difference to which pattern comes out on top. If the aim of the PCA analysis is to determine the most important pattern, then it is important to know how robust that pattern is to the methodology. However, if the idea is simply to summarize the larger data set, the individual ordering of the PCs is less important, and it is more crucial to make sure that as many significant PCs as possible are included.
3) How do you know whether a PC has significant information?
This determination is usually based on a ‘Monte Carlo’ simulation (so-called because of the random nature of the calculations). For instance, if you take 1000 sets of random data (that have the same statistical properties as the data set in question), and you perform the PCA analysis 1000 times, there will be 1000 examples of the first PC. Each of these will explain a different amount of the variation (or variance) in the original data. When ranked in order of explained variance, the tenth one down then defines the 99% confidence level: i.e. if your real PC explains more of the variance than 99% of the random PCs, then you can say that this is significant at the 99% level. This can be done for each PC in turn. (This technique was introduced by Preisendorfer et al. (1981), and is called the Preisendorfer N-rule).
The figure to the right gives two examples of this. Here each PC is plotted against the amount of fractional variance it explains. The blue line is the result from the random data, while the blue dots are the PC results for the real data. It is clear that at least the first two are significantly separated from the random noise line. In the other case, there are 5 (maybe 6) red crosses that appear to be distinguishable from the red line random noise. Note also that the first (‘most important’) PC does not always explain the same amount of the original data.
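A sketch of the Monte Carlo procedure described above, using white noise of matching shape and variance as a simplified stand-in for surrogates with ‘the same statistical properties’ (a real application would typically match the autocorrelation as well):

```python
import numpy as np

rng = np.random.default_rng(1)

def pc_variance_fractions(X):
    """Fraction of variance explained by each PC of a (records x time) array."""
    Xc = X - X.mean(axis=1, keepdims=True)
    s = np.linalg.svd(Xc, compute_uv=False)
    return s**2 / np.sum(s**2)

# 'Real' data: 70 records with a weak common signal plus noise
t = np.linspace(0, 1, 200)
real = 0.5 * np.sin(2 * np.pi * t) + rng.standard_normal((70, 200))
real_frac = pc_variance_fractions(real)

# 1000 random surrogate data sets of the same shape and variance
n_trials = 1000
pc1_random = np.empty(n_trials)
for i in range(n_trials):
    surrogate = real.std() * rng.standard_normal((70, 200))
    pc1_random[i] = pc_variance_fractions(surrogate)[0]

threshold_99 = np.quantile(pc1_random, 0.99)   # 99% level for PC1 under pure noise
print("PC1 of 'real' data:", round(real_frac[0], 4),
      "| 99% noise threshold:", round(threshold_99, 4),
      "| significant:", bool(real_frac[0] > threshold_99))
```

The same comparison can be repeated for PC2, PC3 and so on to count how many PCs rise above the noise level.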
4) What do different conventions for PC analysis represent?
Some different conventions exist regarding how the original data should be normalized. For instance, the data can be normalized to have an average of zero over the whole record, or over a selected sub-interval. The variance of the data is associated with departures from whatever mean was selected. So the pattern of data that shows the biggest departure from the mean will dominate the calculated PCs. If there is an a priori reason to be interested in departures from a particular mean, then this is a way to make sure that those patterns move up in the PC ordering. Changing conventions means that the explained variance of each PC can be different, the ordering can be different, and the number of significant PCs can be different.
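A toy illustration of this effect (synthetic data only, not the actual tree-ring network): below, most records carry a cycle while a few also carry a late-period excursion, and the excursion pattern only rises to PC1 when the mean is taken over the late sub-interval.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(200)

# 60 records carry a cycle; 10 also carry an excursion confined to the last 50 steps
cycle = np.sin(2 * np.pi * t / 50)
late = np.where(t >= 150, (t - 150) / 10.0, 0.0)
records = np.vstack([cycle + 0.6 * rng.standard_normal((60, 200)),
                     late + 0.6 * rng.standard_normal((10, 200))])

def pca(X, mean_slice):
    """PCA after removing, from each record, its mean over mean_slice."""
    Xc = X - X[:, mean_slice].mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return s**2 / np.sum(s**2), Vt

for name, sl in [("full-record mean", slice(None)),
                 ("sub-interval mean (last 50 steps)", slice(150, 200))]:
    frac, Vt = pca(records, sl)
    r = abs(np.corrcoef(Vt[0], late)[0, 1])
    print(f"{name}: leading variance fractions {np.round(frac[:3], 3)}, "
          f"|corr(PC1, excursion)| = {r:.2f}")
```

With the full-record mean, PC1 is the cycle; with the sub-interval mean, the excursion's departure from that mean is much larger, so it moves up the ordering and the variance fractions change accordingly.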
5) How can you tell whether you have included enough PCs?
This is rather easy to tell. If your answer depends on the number of PCs included, then you haven’t included enough. Put another way, if the answer you get is the same as if you had used all the data without doing any PC analysis at all, then you are probably ok. However, the reason why the PC summaries are used in the first place in paleo-reconstructions is that using the full proxy set often runs into the danger of ‘overfitting’ during the calibration period (the time period when the proxy data are trained to match the instrumental record). This can lead to a decrease in predictive skill outside of that window, which is the actual target of the reconstruction. So in summary, PC selection is a trade off: on one hand, the goal is to capture as much variability of the data as represented by the different PCs as possible (particularly if the explained variance is small), while on the other hand, you don’t want to include PCs that are not really contributing any more significant information.
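The trade-off can be sketched with made-up numbers (this is an illustration of the general point, not the MBH98 procedure): regress a synthetic ‘target’ onto the first k PCs of a noisy proxy set over a calibration window, then score the fit on a separate validation window.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 200
target = 0.1 * np.cumsum(rng.standard_normal(T))        # made-up "temperature" series
proxies = target + 1.5 * rng.standard_normal((70, T))   # 70 noisy proxy records

calib, valid = slice(150, 200), slice(100, 150)         # calibration / validation windows

# PCs of the proxy network, centered on the calibration window
X = proxies - proxies[:, calib].mean(axis=1, keepdims=True)
_, _, pcs = np.linalg.svd(X, full_matrices=False)        # rows of pcs are PC time series

def validation_rmse(k):
    """Fit target ~ intercept + first k PCs over calibration, score on validation."""
    A = np.column_stack([np.ones(50), pcs[:k, calib].T])
    coef, *_ = np.linalg.lstsq(A, target[calib], rcond=None)
    pred = np.column_stack([np.ones(50), pcs[:k, valid].T]) @ coef
    return np.sqrt(np.mean((pred - target[valid]) ** 2))

for k in (1, 2, 5, 20, 70):
    print(f"{k:2d} PCs -> validation RMSE {validation_rmse(k):.3f}")
```

In this toy setup the validation error typically worsens as k approaches the number of calibration points, illustrating the overfitting danger described above.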
Part II: Application to the MBH98 ‘Hockey Stick’
1) Where is PCA used in the MBH methodology?
When incorporating many tree ring networks into the multi-proxy framework, it is easier to use a few leading PCs rather than 70 or so individual tree ring chronologies from a particular region. The trees are often very closely located and so it makes sense to summarize the general information they all contain in relation to the large-scale patterns of variability. The relevant signal for the climate reconstruction is the signal that the trees have in common, not each individual series. In MBH98, the North American tree ring series were treated like this. There are a number of other places in the overall methodology where some form of PCA was used, but they are not relevant to this particular controversy.
2) What is the point of contention in MM05?
MM05 contend that the particular PC convention used in MBH98 in dealing with the N. American tree rings selects for the ‘hockey stick’ shape and that the final reconstruction result is simply an artifact of this convention.
3) What convention was used in MBH98?
MBH98 were particularly interested in whether the tree ring data showed significant differences from the 20th century calibration period, and therefore normalized the data so that the mean over this period was zero. As discussed above, this will emphasize records that have the biggest differences from that period (either positive or negative). Since the underlying data have a ‘hockey stick’-like shape, it is therefore not surprising that the most important PC found using this convention resembles the ‘hockey stick’. There are actually two significant PCs found using this convention, and both were incorporated into the full reconstruction.
4) Does using a different convention change the answer?
As discussed above, a different convention (MM05 suggest one that has zero mean over the whole record) will change the ordering, significance and number of important PCs. In this case, the number of significant PCs increases to 5 (maybe 6) from 2 originally. This is the difference between the blue points (MBH98 convention) and the red crosses (MM05 convention) in the first figure. Also PC1 in the MBH98 convention moves down to PC4 in the MM05 convention. This is illustrated in the figure on the right, the red curve is the original PC1 and the blue curve is MM05 PC4 (adjusted to have same variance and mean). But as we stated above, the underlying data has a hockey stick structure, and so in either case the ‘hockey stick’-like PC explains a significant part of the variance. Therefore, using the MM05 convention, more PCs need to be included to capture the significant information contained in the tree ring network.
This figure shows the difference in the final result between using the original convention with 2 PCs (blue) and the MM05 convention with 5 PCs (red). The MM05-based reconstruction is slightly less skillful when judged over the 19th century validation period but is otherwise very similar. In fact any calibration convention will lead to approximately the same answer as long as the PC decomposition is done properly and one determines how many PCs are needed to retain the primary information in the original data.

5) What happens if you just use all the data and skip the whole PCA step?
This is a key point. If the PCs being used were inadequate in characterizing the underlying data, then the answer you get using all of the data will be significantly different. If, on the other hand, enough PCs were used, the answer should be essentially unchanged. This is shown in the figure below. The reconstruction using all the data is in yellow (the green line is the same thing but with the ‘St-Anne River’ tree ring chronology taken out). The blue line is the original reconstruction, and as you can see the correspondence between them is high. The validation is slightly worse, illustrating the trade-off mentioned above, i.e. when using all of the data, over-fitting during the calibration period (due to the increased number of degrees of freedom) leads to a slight loss of predictability in the validation step.
6) So how do MM05 conclude that this small detail changes the answer?
MM05 claim that the reconstruction using only the first 2 PCs with their convention is significantly different from MBH98. Since PCs 3, 4 and 5 (at least) are also significant, they are leaving out good data. It is mathematically wrong to retain the same number of PCs if the convention of standardization is changed. In this case it causes a loss of information, which is easily demonstrated: first, by showing that any such results do not resemble the results from using all the data, and second, by checking the validation of the reconstruction over the 19th century. The MM version of the reconstruction can be matched by simply removing the N. American tree ring data along with the ‘St Anne River’ Northern treeline series from the reconstruction (shown in yellow below). Compare this curve with the ones shown above.
As you might expect, throwing out data also worsens the validation statistics, as can be seen by eye when comparing the reconstructions over the 19th century validation interval. Compare the green line in the figure below to the instrumental data in red. To their credit, MM05 acknowledge that their alternate 15th century reconstruction has no skill.
7) Basically then the MM05 criticism is simply about whether selected N. American tree rings should have been included, not that there was a mathematical flaw?
Yes. Their argument since the beginning has essentially not been about methodological issues at all, but about ‘source data’ issues. Particular concerns with the “bristlecone pine” data were addressed in the followup paper MBH99 but the fact remains that including these data improves the statistical validation over the 19th Century period and they therefore should be included.
8) So does the choice of PC convention change the final MBH98 reconstruction?
No. If you use the MM05 convention and include all the significant PCs, you get the same answer. If you don’t use any PCA at all, you get the same answer. If you use a completely different methodology (i.e. Rutherford et al, 2005), you get basically the same answer. Only if you remove significant portions of the data do you get a different (and worse) answer.
9) Was MBH98 the final word on the climate of last millennium?
Not at all. There has been significant progress on many aspects of climate reconstructions since MBH98. Firstly, there are more and better quality proxy data available. There are new methodologies such as those described in Rutherford et al (2005) or Moberg et al (2005) that address recognised problems with incomplete data series and the challenge of incorporating lower resolution data into the mix. Progress is likely to continue on all these fronts. As of now, all of the ‘Hockey Team’ reconstructions (shown left) agree that the late 20th century is anomalous in the context of the last millennium, and possibly the last two millennia.
The climate of the climate change debate is changing (The Guardian)
Quantifying how greenhouse gases contribute to extreme weather is a crucial step in calculating the cost of human influence
Myles Allen
guardian.co.uk, Wednesday 11 July 2012 12.08 BST
The climate may have changed this week. Not the physical climate, but the climate of the climate change debate. Tuesday marked the publication of a series of papers examining the factors behind extreme weather events in 2011. Nothing remarkable about that, you might think, except, if all goes well, this will be the first in a regular, annual series of assessments quantifying how external drivers of climate contribute to damaging weather.
Some of these drivers, like volcanoes, are things we can do nothing about. But others, like rising levels of greenhouse gases, we can. And quantifying how greenhouse gases contribute to extreme weather is a crucial step in pinning down the real cost of human influence on climate. While most people think of climate change in terms of shrinking ice-sheets and slowly rising sea levels, it is weather events that actually do harm.
This week also saw a workshop in Oxford for climate change negotiators from developing countries. Again, nothing remarkable about that except, for the first time, the issue of “loss and damage” was top of the agenda. For years negotiations have been over emission reductions and sharing the costs of adaptation. Now the debate is turning to: who is going to pay for damage done?
It is a good time to ask, since the costs that can unambiguously be attributed to human-induced climate change are still relatively small. Although Munich Re estimates that weather events in 2011 cost more than $100bn and claimed many thousands of lives, only a few of these events were clearly made more likely by human influence. Others may have been made less likely, but occurred anyway – chance remains the single dominant factor in when and where a weather event occurs. For the vast majority of events, we simply don’t yet know either way.
Connecting climate change and specific weather events is only one link in the causal chain between greenhouse gas emissions and actual harm. But it is a crucial link. If, as planned, the assessment of 2011 becomes routine, we should be able to compare actual weather-related damage, in both good years and bad, with the damage that might have been in a world without human influence on climate. This puts us well on our way to a global inventory of climate change impacts. And as soon as that is available, the question of compensation will not be far behind.
The presumption in climate change negotiations is that “countries with historically high emissions” would be first in line to foot the bill for loss and damage. There may be some logic to this, but if you are an African (or Texan) farmer hit by greenhouse-exacerbated drought, is the European or American taxpayer necessarily the right place to look for compensation? As any good lawyer knows, there is no point in suing a man with empty pockets.
The only institution in the world that could deal with the cost of climate change without missing a beat is the fossil fuel industry: BP took a $30bn charge for Deepwater Horizon, very possibly more than the total cost of climate change damages last year, and was back in profit within months. Of the $5 trillion per year we currently spend on fossil energy, a small fraction would take care of all the loss and damage attributable to climate change for the foreseeable future several times over.
Such a pay-as-you-go liability regime would not address the impacts of today’s emissions on the 22nd century. Governments cannot wash their hands of this issue entirely. But we have been so preoccupied with the climate of the 22nd century that we have curiously neglected to look after the interests of those being affected by climate change today.
So rather than haggling over emission caps and carbon taxes, why not start with a simple statement of principle: standard product liability applies to anyone who sells or uses fossil fuels, including liability for any third-party side-effects. There is no need at present to say what these side-effects might be – indeed, the scientific community does not yet know. But we are getting there.
To prevent environmental catastrophes (FAPERJ)
Vilma Homero
05/07/2012
[Figure: New methods can predict where and when landslides will occur in the mountain region (photo: Nelson Fernandes / UFRJ)]
When several areas of Nova Friburgo, Petrópolis and Teresópolis suffered landslides in January 2011, burying more than a thousand people under tonnes of mud and debris, the question left hanging was whether the disaster could have been minimized. If the Institute of Geosciences of the Universidade Federal do Rio de Janeiro (UFRJ) has its way, the consequences of environmental cataclysms such as these will become ever smaller. To that end, its researchers are developing a series of multidisciplinary projects to make risk-analysis systems viable. One of them is Prever, which, supported by computational tools, combines advances achieved in remote sensing, geoprocessing, geomorphology and geotechnical methodologies with mathematical modelling for weather forecasting in the areas most susceptible to landslides, such as the mountain region. “Although the realities of the region's municipalities are quite different, what they have in common is a lack of methodologies aimed at forecasting this type of risk. The essential task now is to develop methods capable of predicting these processes in space and time; that is, knowing ‘where’ and ‘when’ these landslides may occur,” explains geologist Nelson Ferreira Fernandes, professor in UFRJ's Department of Geography and a FAPERJ “Cientista do Nosso Estado” grantee.

To build real-time risk-forecasting methods that account for mass movements triggered by rainfall, the researchers are producing maps from successive satellite images cross-referenced with geological and geotechnical maps. “Prever combines climate-simulation models and forecasts of extreme rainfall events, developed in meteorology, with mathematical prediction models and with the information produced by geomorphology and geotechnics, which tells us which areas are most susceptible to landslides. In this way we can draw up risk forecasts in real time, classifying the results according to the severity of that risk, which varies continuously in space and time,” Nelson explains.
To do this, the Departments of Geography, Geology and Meteorology of UFRJ's Institute of Geosciences have joined forces with the Faculty of Geology of the Universidade do Estado do Rio de Janeiro (Uerj) and the Department of Civil Engineering of the Pontifícia Universidade Católica (PUC-Rio). By overlaying the information, the resulting images can show the areas most prone to landslides. “By adding this academic knowledge to data from state agencies, such as the Disaster Analysis Unit (Nade) of the Departamento de Recursos Minerais (DRM-RJ), which is responsible for technical support to the Civil Defense, we will not only be constantly updating the maps used today by state government agencies and the Civil Defense, but also enabling more precise planning for decision-making.”
[Figure: A simulation shows the possibility of a mass landslide in the Jacarepaguá region (image: Divulgação / UFRJ)]
This new mapping also means better quality, greater precision and more detailed images. “Obviously, with better instruments in hand, meaning more detailed and precise maps, public managers will also be able to plan and act more accurately and in real time,” says Nelson. According to the researcher, these maps need constant updating to keep up with the way human occupation keeps interfering with the topography of the various regions. “This has been happening through the cutting of slopes, through the occupation of landfilled areas, or through changes resulting from the drainage of rivers. All of this alters the topography and, in the case of heavier and more prolonged rains, can make certain soils more prone to landslides or to flooding,” Nelson notes.

But disaster and environmental-risk analysis systems also include other lines of research. Prever works along two distinct lines of action. “One is climate, in which we detect the areas that will see a long-term increase in rainfall and provide information to decision-making and planning bodies. The other is very short-term forecasting, so-called nowcasting.” On the long-term side, Professor Ana Maria Bueno Nunes, of the same university's Department of Meteorology, has been working on the project “Implementação de um Sistema de Modelagem Regional: Estudos de Tempo e Clima” (Implementation of a Regional Modelling System: Weather and Climate Studies), which she coordinates and which proposes a reconstruction of South America's hydroclimate as an extension of that project.
“By combining satellite precipitation data with information from atmospheric stations, it is possible, through computational modelling, to draw up precipitation estimates. That way we can not only know when rains will be heavier or more prolonged, but also look at past maps to see which convergence of factors produced a disaster situation. Reconstruction is a way of studying the past to understand present scenarios that look similar. And with that we help improve forecasting models,” says Ana. This information, which will initially be for academic and scientific use, will yield increasingly detailed data on how heavy rains form, the kind capable of causing floods in certain areas. “This will allow us not only to better understand the conditions under which certain calamities occur, but also to predict when those conditions may recur. With the project we are also training even more specialized people in this area,” says the researcher, whose work is supported by an Auxílio à Pesquisa grant (APQ 1).
Also part of the project, Professor Gutemberg Borges França, of UFRJ, explains that there are three types of weather forecast: synoptic, which produces forecasts from about 6 hours up to seven days ahead, covering a few thousand kilometres, such as the South American continent; mesoscale, which forecasts from about 6 hours up to two days ahead, covering a few hundred kilometres, such as the state of Rio de Janeiro; and short-term, or nowcasting, which ranges from a few minutes up to 3 to 6 hours, over a specific area of a few kilometres, such as the Rio de Janeiro metropolitan region, for example.
If long-term forecasts are important, short-term forecasts, or nowcasting, are too. According to Gutemberg, current numerical prediction models are still inadequate for short-term forecasting, which ends up relying largely on the meteorologist's experience in interpreting information from the various data sources available, such as satellite images; surface and upper-air weather stations; radar and sodar (Sonic Detection and Ranging); and numerical models. “Even today, however, the meteorologist lacks objective tools that can help integrate these diverse sources of information into a more accurate short-term forecast,” Gutemberg argues.

Rio de Janeiro already has satellite receiving stations, an upper-air (radiosonde) station that generates atmospheric profiles, surface weather stations and radar. Since 2005, the Applied Meteorology Laboratory of UFRJ's Department of Meteorology has been developing short-term forecasting tools based on computational intelligence, aimed at improving forecasts of extreme weather events for Rio de Janeiro. “With computational intelligence, we get this information faster and more accurately,” he sums up.
© FAPERJ – All articles may be reproduced, provided the source is cited.
This summer is ‘what global warming looks like’ (AP) + related & reactions
Jul 3, 1:10 PM EDT
By SETH BORENSTEIN
AP Science Writer
WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.
Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.
These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.
Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.
And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.
But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.
So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.
“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”
Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s I told-you-so time, he said.
As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of "unprecedented extreme weather and climate events." Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, "It's really dramatic how many of the patterns that we've talked about as the expression of the extremes are hitting the U.S. right now."
“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disasters.”
Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.
Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect "non-tornadic wind events" like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.
Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.
Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.
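Taking the rounded figures in the paragraph above at face value, a quick arithmetic check reproduces the quoted ratio:

    # Quick check of the ratios quoted above, treating the rounded figures in the
    # text ("more than" 40,000 hot, "fewer than" 6,000 cold) as exact.
    hot_records, cold_records = 40_000, 6_000
    print(round(hot_records / cold_records, 1))   # ~6.7, consistent with "about 7 hot to 1 cold"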
“In the future you would expect larger, longer more intense heat waves and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.
The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.
While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”
But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”
—
Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/
U.S. weather records:
http://www.ncdc.noaa.gov/extremes/records/
—
Seth Borenstein can be followed at http://twitter.com/borenbears
© 2012 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
* * *
July 3, 2012
To Predict Environmental Doom, Ignore the Past
http://www.realclearscience.com
By Todd Myers
The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986
One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”
The paper predicts we are rapidly approaching a moment of "planetary-scale critical transition," due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will "require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world's energy budget that is supplied by sources other than fossil fuels," and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like the Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich's assumptions and cites his work more than once.
The Nature article, however, suffers from numerous simple statistical errors and rests on assumptions rather than evidence. Its authors do nothing to address the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that with improved data, this time their predictions of doom are correct.
Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.
“Assuming this does not change”
During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.
Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:
“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)
In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
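To make the objection concrete, the following sketch reproduces the kind of flat-line projection the quoted caption describes. Only the 0.92 ha per person figure comes from the caption; the population numbers are rough historical estimates supplied here for illustration, not values from the paper.

    # Sketch of the flat-line projection the quoted caption describes, as this
    # critique characterizes it: hold transformed land per person fixed and scale
    # it by population. Population figures are approximate and added here only
    # for illustration; just the 0.92 ha/person value comes from the caption.
    PER_CAPITA_TRANSFORMED_HA = 0.92

    population_billions = {       # approximate, not taken from the paper
        1800: 1.0, 1900: 1.6, 1950: 2.5, 2012: 7.0, 2025: 8.0, 2045: 9.0,
    }

    for year, pop_bn in population_billions.items():
        print(f"{year}: ~{PER_CAPITA_TRANSFORMED_HA * pop_bn:.1f} billion ha 'transformed'")
    # Any change in efficiency or technology over two centuries drops out of the
    # calculation; that is exactly the assumption the article objects to.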
Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.
A Paradigm of Catastrophe
What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth, and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm rests on a simple model: eventually we will run out of space and resources, and determining the date of ultimate doom is simply a matter of doing the math.
Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, they describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.
The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions "between overlapping complex systems, however, are proving difficult to characterize mathematically," they base their conclusions on the simplest linear mathematical estimate that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.
Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.
Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.
Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.
* * *
Washington Policy Center exposed: Todd Myers
The Washington Policy Center labels itself as a non-partisan think tank. That is a mischaracterization, to say the least, but it is their bread and butter. Based in Seattle, with a director in Spokane, the WPC's mission is to "promote free-market solutions through research and education." It makes sense, then, that they have an environmental director in the form of Todd Myers, who has a new book called "Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment." You know, since polar bears love to swim.

From the WPC’s newsletter:
Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”
As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.
Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.
By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).
This is what the newsletter doesn’t tell you about Todd Myers.
Myers has spoken at the Heartland Institute's International Conference on Climate Change. In case you didn't know, the Heartland Institute has received significant funding from ExxonMobil, Philip Morris and numerous other corporations and conservative foundations with a vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.
Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it "The Heritage Foundation of the Northwest."
* * *
Did climate change ’cause’ the Colorado wildfires?
29 Jun 2012 1:50 PM

Photo by USAF.
The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.
Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.
What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn't properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call "a fire." It was what philosophers call the proximate cause, the most immediate, the closest.
All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn't make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they're also called "ultimate" causes. It's meaning we usually want.
When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.
When we talk about not fires themselves but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it's not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.
So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.
What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.
The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.
One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.
With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”
When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.
For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.
That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.
There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.
That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.
And I'm afraid that, in coming years, it will become all too familiar.
* * *
Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004
http://pielkeclimatesci.wordpress.com
Photo from June 26, 2012, showing the start of the Flagstaff fire near Boulder, Colorado
I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled
Climate Dimensions of the Water Cycle by Judy Curry
First, there is an insightful statement by Judy where she writes in slide 5
CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.
We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.
With respect to the current hot and dry weather, the paper referenced by Judy in her Powerpoint talk
Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101
has the following abstract:
More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.
They also present a figure (not reproduced here) titled "Impact of AMO, PDO on 20-yr drought frequency (1900-1999)", with four panels: A, warm PDO and cool AMO; B, cool PDO and cool AMO; C, warm PDO and warm AMO; and D, cool PDO and warm AMO.
The current Drought Monitor analysis (also not reproduced here) shows remarkable agreement with panel D.
As Judy shows in her talk (slide 8), since 1995 we have been in a warm phase of the AMO and have entered a cool phase of the PDO. This corresponds to panel D above. Thus the current drought and heat are not an unprecedented event but part of the variations in atmosphere-ocean circulation features that we have seen in the past. This reinforces what Judy wrote, that
[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc.)
in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, to cite just a few excellent examples, by Joe D'Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts's weblog Watts Up With That.
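Read as a simple lookup, the four McCabe et al. panels pair each combination of PDO and AMO phase with a drought-frequency pattern. The toy sketch below only restates the panel labels described above; it is not the authors' analysis.

    # Toy lookup restating the four McCabe et al. panels described above: which
    # 20-yr drought-frequency pattern pairs with which PDO/AMO phase combination.
    PANELS = {
        ("warm PDO", "cool AMO"): "A",
        ("cool PDO", "cool AMO"): "B",
        ("warm PDO", "warm AMO"): "C",
        ("cool PDO", "warm AMO"): "D",
    }
    # The post's reading of the present state: warm AMO since ~1995, cool PDO now.
    print("Current phase combination maps to panel", PANELS[("cool PDO", "warm AMO")])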
* * *
Hotter summers could be a part of Washington's future
By Ashley Halsey III and Marissa Evans, Published: July 5
Are unbroken weeks of sweltering weather becoming the norm rather than the exception?
The answer, however, is a little more complicated.
Call it a qualified yes.
“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”
Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.
Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.
To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.
“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”
Another visitor, who sat nearby just across from the White House, shared a similar view.
“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.
Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.
If that worst prediction comes true, 98 degrees will become the new normal at this time of year in Washington 88 years from now.
Will every passing year till then break records?
“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”
If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.
The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.
* * *
Political Scientists Are Lousy Forecasters (N.Y.Times)
OPINION
By JACQUELINE STEVENS
Published: June 23, 2012
DESPERATE “Action Alerts” land in my in-box. They’re from the American Political Science Association and colleagues, many of whom fear grave “threats” to our discipline. As a defense, they’ve supplied “talking points” we can use to tell Congressional representatives that political science is a “critical part of our national science agenda.”
Political scientists are defensive these days because in May the House passed an amendment to a bill eliminating National Science Foundation grants for political scientists. Soon the Senate may vote on similar legislation. Colleagues, especially those who have received N.S.F. grants, will loathe me for saying this, but just this once I’m sympathetic with the anti-intellectual Republicans behind this amendment. Why? The bill incited a national conversation about a subject that has troubled me for decades: the government — disproportionately — supports research that is amenable to statistical analyses and models even though everyone knows the clean equations mask messy realities that contrived data sets and assumptions don’t, and can’t, capture.
It’s an open secret in my discipline: in terms of accurate political predictions (the field’s benchmark for what counts as science), my colleagues have failed spectacularly and wasted colossal amounts of time and money. The most obvious example may be political scientists’ insistence, during the cold war, that the Soviet Union would persist as a nuclear threat to the United States. In 1993, in the journal International Security, for example, the cold war historian John Lewis Gaddis wrote that the demise of the Soviet Union was “of such importance that no approach to the study of international relations claiming both foresight and competence should have failed to see it coming.” And yet, he noted, “None actually did so.” Careers were made, prizes awarded and millions of research dollars distributed to international relations experts, even though Nancy Reagan’s astrologer may have had superior forecasting skills.
Political prognosticators fare just as poorly on domestic politics. In a peer-reviewed journal, the political scientist Morris P. Fiorina wrote that “we seem to have settled into a persistent pattern of divided government” — of Republican presidents and Democratic Congresses. Professor Fiorina’s ideas, which synced nicely with the conventional wisdom at the time, appeared in an article in 1992 — just before the Democrat Bill Clinton’s presidential victory and the Republican 1994 takeover of the House.
Alas, little has changed. Did any prominent N.S.F.-financed researchers predict that an organization like Al Qaeda would change global and domestic politics for at least a generation? Nope. Or that the Arab Spring would overthrow leaders in Egypt, Libya and Tunisia? No, again. What about proposals for research into questions that might favor Democratic politics and that political scientists seeking N.S.F. financing do not ask — perhaps, one colleague suggests, because N.S.F. program officers discourage them? Why are my colleagues kowtowing to Congress for research money that comes with ideological strings attached?
The political scientist Ted Hopf wrote in a 1993 article that experts failed to anticipate the Soviet Union’s collapse largely because the military establishment played such a big role in setting the government’s financing priorities. “Directed by this logic of the cold war, research dollars flowed from private foundations, government agencies and military individual bureaucracies.” Now, nearly 20 years later, the A.P.S.A. Web site trumpets my colleagues’ collaboration with the government, “most notably in the area of defense,” as a reason to retain political science N.S.F. financing.
Many of today’s peer-reviewed studies offer trivial confirmations of the obvious and policy documents filled with egregious, dangerous errors. My colleagues now point to research by the political scientists and N.S.F. grant recipients James D. Fearon and David D. Laitin that claims that civil wars result from weak states, and are not caused by ethnic grievances. Numerous scholars have, however, convincingly criticized Professors Fearon and Laitin’s work. In 2011 Lars-Erik Cederman, Nils B. Weidmann and Kristian Skrede Gleditsch wrote in the American Political Science Review that “rejecting ‘messy’ factors, like grievances and inequalities,” which are hard to quantify, “may lead to more elegant models that can be more easily tested, but the fact remains that some of the most intractable and damaging conflict processes in the contemporary world, including Sudan and the former Yugoslavia, are largely about political and economic injustice,” an observation that policy makers could glean from a subscription to this newspaper and that nonetheless is more astute than the insights offered by Professors Fearon and Laitin.
How do we know that these examples aren’t atypical cherries picked by a political theorist munching sour grapes? Because in the 1980s, the political psychologist Philip E. Tetlock began systematically quizzing 284 political experts — most of whom were political science Ph.D.’s — on dozens of basic questions, like whether a country would go to war, leave NATO or change its boundaries or a political leader would remain in office. His book “Expert Political Judgment: How Good Is It? How Can We Know?” won the A.P.S.A.’s prize for the best book published on government, politics or international affairs.
Professor Tetlock’s main finding? Chimps randomly throwing darts at the possible outcomes would have done almost as well as the experts.
These results wouldn’t surprise the guru of the scientific method, Karl Popper, whose 1934 book “The Logic of Scientific Discovery” remains the cornerstone of the scientific method. Yet Mr. Popper himself scoffed at the pretensions of the social sciences: “Long-term prophecies can be derived from scientific conditional predictions only if they apply to systems which can be described as well-isolated, stationary, and recurrent. These systems are very rare in nature; and modern society is not one of them.”
Government can — and should — assist political scientists, especially those who use history and theory to explain shifting political contexts, challenge our intuitions and help us see beyond daily newspaper headlines. Research aimed at political prediction is doomed to fail. At least if the idea is to predict more accurately than a dart-throwing chimp.
To shield research from disciplinary biases of the moment, the government should finance scholars through a lottery: anyone with a political science Ph.D. and a defensible budget could apply for grants at different financing levels. And of course government needs to finance graduate student studies and thorough demographic, political and economic data collection. I look forward to seeing what happens to my discipline and politics more generally once we stop mistaking probability studies and statistical significance for knowledge.
Jacqueline Stevens is a professor of political science at Northwestern University and the author, most recently, of “States Without Nations: Citizenship for Mortals.”
A version of this op-ed appeared in print on June 24, 2012, on page SR6 of the New York edition with the headline: Political Scientists Are Lousy Forecasters.
* * *
Sustainable water management is delivering economic, social and environmental benefits, countries report (Unic.org)
The results of the UN survey, covering 130 countries, are presented in a detailed report on global efforts to improve water management.
Rio de Janeiro, June 19, 2012 – More than 80% of countries have reformed their water-use legislation over the past twenty years in response to growing demand for water resources driven by population growth, urbanization and climate change.
In many cases, these reforms have had positive impacts on development, including improvements in access to drinking water, in human health and in agricultural water productivity.
Global progress has been slower, however, when it comes to irrigation, rainwater harvesting and investment in freshwater ecosystem services.
These are some of the conclusions of a United Nations study covering more than 130 national governments and their efforts to improve the sustainable management of water resources. The survey was produced specifically to inform decision-making at Rio+20.
It focuses on progress toward implementing the internationally agreed approach to water management and use known as Integrated Water Resources Management (IWRM).
Endorsed by UN member states at the Rio-92 Earth Summit as part of a global action plan on sustainable development (Agenda 21), IWRM is a pathway to the efficient, equitable and sustainable development and management of the world's limited water resources.
Amid growing and competing demands on the world's water supply, IWRM integrates domestic, agricultural, industrial and environmental needs into water planning, rather than considering each demand in isolation.
"An integrated and adaptive approach is essential to ensure that the needs of different, and sometimes competing, user groups are met equitably, so that water resources development and management benefit everyone," said UN-Water Chair Michel Jarraud.
"Its success depends on an institutional and governance framework that facilitates dialogue and decisions on water resources that are ecologically, economically and socially sustainable," he concluded.
Twenty years after the Earth Summit, the world's governments have gathered once again in Rio, where the fundamental role of freshwater management in the transition to an inclusive, low-carbon, resource-efficient green economy is one of many important issues under discussion.
The survey, coordinated by the United Nations Environment Programme (UNEP) on behalf of UN-Water (the coordination mechanism among UN agencies for freshwater issues), asked governments for input on infrastructure, financing, governance and other areas related to water management, in order to measure countries' success in the transition to IWRM.
Overall, 90% of the countries surveyed reported a range of positive impacts from integrated approaches to water management following national reforms.
Other key findings include:
- Most countries perceive that water-related risks and competition for water resources have increased over the past 20 years;
- Domestic water supply is ranked by most countries as the top priority for water resources management;
- Most countries reported an upward trend in financing for water resources development, although obstacles to implementing reforms remain;
- Progress on water efficiency is losing priority relative to other water management reforms, with fewer than 50% of national reforms addressing the issue.
"The sustainable management and use of water, given its vital role in food security, in energy and in supporting valuable ecosystem services, underpins the transition to a low-carbon, resource-efficient green economy," said Achim Steiner, UN Under-Secretary-General and Executive Director of UNEP.
"Beyond highlighting the challenges, this new survey also shows important successes in integrated water resources management, where a more sustainable approach to water has delivered concrete benefits for human health, the environment and poverty reduction. At Rio+20, governments have the opportunity to capitalize on these innovations and chart the way forward for sustainable development, so that the water needs of a global population expected to rise to 9 billion by 2050 can be met equitably," Steiner added.
The UN survey charts the major environmental changes that took place between 1992, when IWRM first won broad government support, and today, and shows how water resources are being managed in the face of these challenges.
The world's population, for example, has grown from 5.3 billion in 1992 to just over 7 billion today, with the impacts felt most strongly in developing countries. This has been accompanied by increasing migration from rural to urban areas, as well as large refugee movements driven by climate-related and socio-political disasters.
Successes and challenges
The survey shows that the introduction of IWRM at the national level varies widely around the world, from early planning stages to the concrete implementation of new laws and policies.
In responding to the survey, some governments reported significant development impacts resulting from the adoption of IWRM strategies since 1992, such as:
Estonia: The introduction of water charges and pollution taxes has contributed to greater water efficiency and a reduced pollution load in the Baltic Sea.
Costa Rica: 50% of the revenue from water charges is now reinvested in water resources management.
Guatemala: Hydroelectric generation capacity nearly doubled between 1982 and 2011.
Ghana: 40% of irrigation systems have been rehabilitated for greater productivity and more effective water use.
Chad: Access to water supply increased from 15% in 1990 to 50% in 2011.
Tunisia: 110 wastewater treatment plants have been built.
Even so, many countries, especially in developing regions, signaled the need for greater capacity-building, investment and infrastructure development in order to fully implement integrated water resources management.
How countries perceive the key issues
The water-related issues most frequently cited as having 'high' or 'highest' priority by governments are infrastructure development and financing (79% of all countries) and financing for water resources management (78%).
Climate change is cited as a high priority by most countries (70% of the total), and 76% of countries consider that the threat climate change poses to water resources has increased since 1992.
But the survey also highlights important differences between developed and developing countries in terms of water-related priorities. Using the Human Development Index, the survey grouped countries into four HDI categories: low, medium, high and very high.
Ensuring an adequate water supply for agriculture is a high priority for many low-HDI countries, while preserving freshwater ecosystems ('water for the environment') is a priority mainly for countries with very high HDI.
Survey recommendations
The survey includes a series of suggested targets and recommendations intended to inform decision-makers at Rio+20. They are based on an assessment of the survey's findings and include:
- By 2015, every country should develop specific objectives and timetables for preparing and implementing an IWRM action program and financing strategy.
- By 2015, a global reporting mechanism on national water resources management should be established, to ensure more rigorous reporting on IWRM progress and to improve the availability of information.
- Greater effort is needed to raise financing levels and improve the institutional framework for water resources management, with particular attention to low-HDI countries.
Sustainable Development: Solid, Liquid or Gaseous State? (Plurale)
Envolverde Rio+20
June 14, 2012, 10:48 am
by Patricia Almeida Ashley*
Here I offer a reflection on the ongoing discussions about Sustainable Development Goals as a possible course of action to be agreed among the countries represented at Rio+20.
I sense an anxiety for a 'solid state' of sustainable development: palpable, measurable, insurable, assurable, verifiable, provable, comparable, plannable, predictable, reproducible, whenever Sustainable Development Goals are spoken of as outcomes and impacts to be broken down into indicators and targets for use by every country and region in the world. Bring on the numbers, statistics and equations! To me, this reflects a mechanistic, positivist rationality, typical of the hypothetico-deductive, Cartesian scientific approach, which presupposes linear equations, statistical methods and parameterizable metrics for comparisons, rankings and predictions. It is typical of training in engineering and other sciences that require measurable structures in order to contribute to human knowledge.
When we come to conceive of Sustainable Development Goals as something pertaining instead to processes and capacities through which societies renew their legal and normative systems and their anachronistic logics of education, reproduction, production and consumption, we move toward a 'liquid state' conception of sustainable development: something like water, which is not lost when it meets the rocks but flows around them, suffers contamination yet evaporates and dissolves into a new cycle of life. In other words, Sustainable Development Goals are no longer comparable across countries by metrics, because we are different waters and different terrains, but we remain open to exchanges, learning, intentions and actions so that our waters are always renewed and generate life.
And if we move toward Sustainable Development Goals as something pertaining instead to principles, values, sensations, feelings and affections, then we move toward the spiritual quality of humanity in harmony with the Earth and the Cosmos, coming to see how large the family we belong to is, what we have come here and now to do, and with whom we perceive ourselves to be living and dying. I call this sustainable development as a 'gaseous state,' one that works through spiritual intelligence, through the evolution of consciousness, through detachment from the limitation of a solid expression perceptible only by the five senses. How can we measure, in indicators and targets, what we reach and enact when we can embrace one another deeply and without fear? Have you ever experienced something like that? Do you notice what it changes in your hierarchy of values? Do you understand why we can be fully realized without having to have and have and have, but by being, so as to be a being among beings?
The article 'Why we need sustainable development goals?', originally published as Why we need Sustainable Development Goals – SciDev.Net, is what prompted the reflections I wrote above. As a counterpoint, I point to the excellent article by Benedito Silva Neto and David Basso, published in the journal Ambiente e Sociedade in 2010 under the title A ciência e o desenvolvimento sustentável: para além do positivismo e da pós-modernidade (Science and sustainable development: beyond positivism and postmodernity), which helps us leave the solid state and pass through the necessary liquid state in order to reach and recover the gaseous state of sustainable development.
* Patricia Almeida Ashley is a columnist for Plurale and a professor and coordinator of the EConsCiência and Ecocidades Network at the Universidade Federal Fluminense.
* Originally published on the Plurale website.
* * *
Coastal N.C. counties fighting sea-level rise prediction (News Observer)
MON, MAY 28, 2012 10:50 PM
State lawmakers are considering a measure that would limit how North Carolina prepares for sea-level rise, which many scientists consider one of the surest results of climate change.
Federal authorities say the North Carolina coast is vulnerable because of its low, flat land and thin fringe of barrier islands. A state-appointed science panel has reported that a 1-meter rise in sea level is likely by 2100.
The calculation, prepared for the N.C. Coastal Resources Commission, was intended to help the state plan for rising water that could threaten 2,000 square miles. Critics say it could thwart economic development on just as large a scale.
But NC-20, named for the 20 coastal counties, appears to be winning its campaign to undermine those predictions.
The Coastal Resources Commission agreed to delete references to planning benchmarks – such as the 1-meter prediction – and new development standards for areas likely to be inundated.
The N.C. Division of Emergency Management, which is using a $5 million federal grant to analyze the impact of rising water, lowered its worst-case scenario prediction from 1 meter (about 39 inches) to 15 inches by 2100.
Politics and economics in play
Several local governments on the coast have passed resolutions against sea-level rise policies.
When the General Assembly convened this month, Republican legislators went further.
They circulated a bill that authorizes only the coastal commission to calculate how fast the sea is rising. It said the calculations must be based only on historic trends – leaving out the accelerated rise that climate scientists widely expect this century if warming increases and glaciers melt.
The bill, a substitute for an unrelated measure the N.C. House passed last year, has not been introduced. State legislative officials say they can’t predict how it might be changed, or when or whether it will emerge.
Longtime East Carolina University geologist Stan Riggs, a science panel member who studies the evolution of the coast, said the 1-meter estimate is squarely within the mainstream of research.
“We’re throwing this science out completely, and what’s proposed is just crazy for a state that used to be a leader in marine science,” he said of the proposed legislation. “You can’t legislate the ocean, and you can’t legislate storms.”
NC-20 Chairman Tom Thompson, economic development director in Beaufort County, said his members – many of them county managers and other economic development officials – are convinced that climate changes and sea-level rises are part of natural cycles. Climate scientists who say otherwise, he believes, are wrong.
The group’s critiques quote scientists who believe the rate of sea-level rise is actually slowing. NC-20 says the state should rely on historical trends until acceleration is detected. The computer models that predict a quickening rate could be inaccurate, it says.
“If you’re wrong and you start planning today at 39 inches, you could lose millions of dollars in development and 2,000 square miles would be condemned as a flood zone,” Thompson said. “Is it really a risk to wait five years and see?”
State planners concerned
State officials say the land below the 1-meter elevation would not be zoned as a flood zone and off-limits to development. Planners say it’s crucial to allow for rising water when designing bridges, roads, and sewer lines that will be in use for decades.
“We’re concerned about it,” said Philip Prete, an environmental planner in Wilmington, which will soon analyze the potential effects of rising water on infrastructure. “For the state to tie our hands and not let us use the information that the state science panel has come up with makes it overly restrictive.”
Other states, he said, are “certainly embracing planning.”
Maine is preparing for a rise of up to 2 meters by 2100, Delaware 1.5 meters, Louisiana 1 meter and California 1.4 meters. Southeastern Florida projects up to a 2-foot rise by 2060.
Dueling studies
NC-20 says the state should plan for 8 inches of rise by 2100, based on the historical trend in Wilmington.
The science panel based its projections on records at the northern coast town of Duck, where the rate is twice as fast, and factored in the accelerated rise expected to come later. Duck was chosen, the panel said, because of the quality of its record and site on the open ocean.
The panel cites seven studies that project global sea level will rise as much as 1 meter, or more, by 2100. The Intergovernmental Panel on Climate Change estimated in 2007 a rise of no more than 23 inches, but did not factor in the melting land ice that many scientists now expect.
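The gap between the two approaches can be illustrated with placeholder numbers: extrapolating a steady historical rate alone versus adding an acceleration term. The rates below are assumptions chosen only to show the shape of the disagreement, not the figures used by the science panel or by NC-20.

    # Hypothetical comparison of the two approaches at issue: extrapolating only a
    # steady historical rate versus allowing the rate to accelerate. The numbers
    # are placeholders for illustration, not either side's actual figures.
    MM_PER_INCH = 25.4
    YEARS = 2100 - 2012

    historical_rate_mm_per_yr = 2.0        # assumed steady tide-gauge trend
    acceleration_mm_per_yr2 = 0.1          # assumed acceleration, for illustration

    linear_in = historical_rate_mm_per_yr * YEARS / MM_PER_INCH
    accel_in = (historical_rate_mm_per_yr * YEARS
                + 0.5 * acceleration_mm_per_yr2 * YEARS ** 2) / MM_PER_INCH

    print(f"Historical trend only:     ~{linear_in:.0f} inches by 2100")
    print(f"With an acceleration term: ~{accel_in:.0f} inches by 2100")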
NC-20’s science adviser, Morehead City physicist John Droz, says he consulted with 30 sea-level experts, most of them not named in his latest critique of the panel’s work. He says the 13-member panel failed to do a balanced review of scientific literature, didn’t use the best available science and made unsupported assumptions.
“I’m not saying these people are liars,” Thompson said. “I’m saying they have a passion for sea-level rise and they can’t give it up.”
John Dorman of the N.C. Division of Emergency Management, which is preparing a study of sea-level impact, said an “intense push” by the group and state legislators led to key alterations.
Instead of assuming a 1-meter, worst-case rise, he said, the study will report the impact of seas that rise only 3.9, 7.8, 11.7 and 15.6 inches by 2100. The 1-meter analysis will be available to local governments that request it.
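For reference, the four revised scenario values work out to even tenths of a meter when converted from inches; the conversion below is plain arithmetic and is not stated in the article.

    # The four revised scenario values correspond to tenths of a meter:
    MM_PER_INCH = 25.4
    for inches in (3.9, 7.8, 11.7, 15.6):
        print(f"{inches} in ~= {inches * MM_PER_INCH / 1000:.1f} m")
    # i.e. 0.1, 0.2, 0.3 and 0.4 m, versus the dropped 1 m (about 39 in) worst case.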
“It’s not the product we had put the grant out for,” Dorman said, referring to the $5 million from the Federal Emergency Management Agency that’s paying for the study. Coastal communities will still find the work useful, he predicts.
The backlash on the coast centers on the question of whether sea-level rise will accelerate, said Bob Emory, chairman of the Coastal Resources Commission.
Emory, who lives in New Bern, said the commission deleted wording from its proposed sea-level rise policy that hinted at new regulations in order to find common ground. “Any remaining unnecessarily inflammatory language that’s still in there, we want to get out,” he said.
New information will be incorporated as it comes out, he said.
“There are people who disagree on the science. There are people who worry about what impact even talking about sea-level rise will have on development,” Emory said. “It’s my objective to have a policy that makes so much sense that people would have trouble picking at it.”
In written comments, the N.C. Department of Environment and Natural Resources said the legislation that circulated earlier this month appeared consistent with the coastal commission’s policy changes.
But the department warned of the “unintended impacts” of not allowing agencies other than the coastal commission to develop sea-level rise policies. The restriction could undermine the Division of Emergency Management’s study, it said, and the ability of transportation and emergency-management planners to address rising waters.
The N.C. Coastal Federation, the region’s largest environmental group, said the bill could hurt local governments in winning federal planning grants. Insurance rates could go up, it says.
Relying solely on historical trends, the group said, is like "being told to make investment decisions strictly on past performance and not being able to consider market trends and research."