Tag archive: Previsão (Forecast)

Visualizing The Way Americans Value Water (fastcoexist.com)

By Ariel Schwartz (accessed December 17, 2012)

It’s a pretty precious resource, considering that we need it to live. But do we actually care enough to change our behavior to make sure we have it in the future?

The aging water infrastructure in the U.S. is fragile, to say the least; every year, over 1.7 trillion gallons of water are lost due to leaks and breaks in the system. It’s never good to waste water, but that’s a staggeringly unacceptable figure at a time when the country is facing unprecedented droughts. But on a grassroots level, things may be starting to change. Water technology company Xylem’s new Value of Water Index, which examines American attitudes toward water, indicates that the public is finally realizing the magnitude of our water problem–and that everyone might need to pitch in to fix it.

According to the report–culled from a survey of 1,008 voters in the U.S.–79% of Americans realize we have a water scarcity problem. That may seem high, but 86% of respondents also say they have dealt with water shortages or contamination firsthand, suggesting it takes a lot (or is simply impossible) to convince some people. A whopping 88% of respondents think the country’s water infrastructure needs reform.

Americans also think they have some personal responsibility for the crisis–specifically, 31% of respondents think they should have to pay a bit more on water bills for infrastructure improvements. If Americans upped their monthly water bill by just $7.70, we would see an extra $6.4 billion for water infrastructure investments.
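
The article gives the $7.70-per-month and $6.4 billion figures without stating the number of bills involved; a quick sanity check shows what they jointly imply (the roughly 69 million billed accounts is our inference, not a figure from the source):

```python
# Back-of-the-envelope check of the article's numbers. The implied count of
# billed water accounts is an inference for illustration, not from the source.
extra_per_month = 7.70    # dollars added to each monthly water bill
annual_target = 6.4e9     # dollars of extra infrastructure investment per year

implied_accounts = annual_target / (extra_per_month * 12)
print(f"Implied billed accounts: {implied_accounts / 1e6:.1f} million")  # 69.3
```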

In spite of everything, 69% of those polled say they take clean water for granted, and just 29% think problems with our water infrastructure will seriously affect them (remember: the vast majority of respondents have dealt with water shortages and contamination already). Water awareness still has a long way to go–but it will most likely be sped up as water shortages become more common.

Here’s the whole infographic

Eight examples of where the IPCC has missed the mark on its predictions and projections (The Daily Climate)

A “king tide” leaves parts of Sausalito, Calif., flooded in 2010. Disagreement over the impact of ice-sheet melting on sea-level rise has led the Intergovernmental Panel on Climate Change to omit their influence – and thus underestimate sea-level rise – in recent reports, a pattern the panel repeats with other key findings. Photo by Yanna B./flickr.

Dec. 6, 2012

Correction appended

By Glenn Scherer
The Daily Climate

Scientists will tell you: There are no perfect computer models. All are incomplete representations of nature, with uncertainty built into them. But one thing is certain: Several fundamental projections found in Intergovernmental Panel on Climate Change reports have consistently underestimated real-world observations, potentially leaving world governments at doubt as to how to guide climate policy.

Emissions

At the heart of all IPCC projections are “emission scenarios”: low-, mid-, and high-range estimates for future carbon emissions. From these “what if” estimates flow projections for temperature, sea-level rise, and more.

Projection: In 2001, the IPCC offered a range of fossil fuel and industrial emissions trends, from a best-case scenario of 7.7 billion tons of carbon released each year by 2010 to a worst-case scenario of 9.7 billion tons.

Reality: In 2010, global emissions from fossil fuels alone totaled 9.1 billion tons of carbon, according to the federal government’s Earth Systems Research Laboratory.

Why the miss? While technically within the range, scientists never expected emissions to rise so high so quickly, said IPCC scientist Christopher Field. The IPCC, for instance, failed to anticipate China’s economic growth, or resistance by the United States and other nations to curbing greenhouse gases.

“We really haven’t explored a world in which the emissions growth rate is as rapid as we have actually seen happen,” Field said.

Temperature

IPCC models use the emission scenarios discussed above to estimate average global temperature increases by the year 2100.

Projection: The IPCC 2007 assessment projected a worst-case temperature rise of 4.3° to 11.5° Fahrenheit, with a high probability of 7.2°F.

Reality: We are currently on track for a rise of between 6.3° and 13.3°F, with a high probability of an increase of 9.4°F by 2100, according to the Massachusetts Institute of Technology. Other modelers are getting similar results, including a study published earlier this month by the Global Carbon Project consortium confirming the likelihood of a 9°F rise.

Why the miss? IPCC emission scenarios underestimated global CO2 emission rates, which means temperature projections were underestimated too. And it could get worse: IPCC projections haven’t included likely feedbacks such as large-scale melting of Arctic permafrost and the subsequent release of large quantities of CO2 and methane, a greenhouse gas 20 times more potent, albeit shorter-lived, in the atmosphere than carbon dioxide.

Arctic Meltdown

Five years ago, the summer retreat of Arctic ice wildly outdistanced all 18 IPCC computer models, astonishing IPCC scientists. It did so again in 2012.

Projection: The IPCC has always confidently projected that the Arctic ice pack was safe at least until 2050 or well beyond 2100.

Reality: Summer ice is thinning faster than every climate projection, and today scientists predict an ice-free Arctic in years, not decades. Last summer, Arctic sea ice extent plummeted to 1.32 million square miles, the lowest level ever recorded – 50 percent below the long-term 1979 to 2000 average.

Why the miss? For scientists, it is increasingly clear that the models are under-predicting the rate of sea ice retreat because they are missing key real-world interactions.

“Sea ice modelers have speculated that the 2007 minimum was an aberration… a matter of random variability, noise in the system, that sea ice would recover.… That no longer looks tenable,” says IPCC scientist Michael Mann. “It is a stunning reminder that uncertainty doesn’t always act in our favor.”

Ice Sheets

Greenland and Antarctica are melting, even though the IPCC said in 1995 that they wouldn’t be.

Projection: In 1995, IPCC projected “little change in the extent of the Greenland and Antarctic ice sheets… over the next 50-100 years.” In 2007 IPCC embraced a drastic revision: “New data… show[s] that losses from the ice sheets of Greenland and Antarctica have very likely contributed to sea level rise over 1993 to 2003.”

Reality: Today, ice loss in Greenland and Antarctica is trending at least 100 years ahead of projections compared to IPCC’s first three reports.

Why the miss? “After 2001, we began to realize there were complex dynamics at work – ice cracks, lubrication and sliding of ice sheets,” that were melting ice sheets quicker, said IPCC scientist Kevin Trenberth. New feedbacks unknown to past IPCC authors have also been found. A 2012 study, for example, showed that the reflectivity of Greenland’s ice sheet is decreasing, causing ice to absorb more heat, likely escalating melting.

Sea-Level Rise

The fate of the world’s coastlines has become a classic example of how the IPCC, when confronted with conflicting science, tends to go silent.

Projection: In the 2001 report, the IPCC projected sea-level rise of 2 millimeters per year. The worst-case scenario in the 2007 report, which looked mostly at thermal expansion of the oceans as temperatures warmed, called for up to 1.9 feet of sea-level rise by century’s end.

Reality: Observed sea-level rise has averaged 3.3 millimeters per year since 1990. By 2009, various studies that included ice melt offered drastically higher projections of between 2.4 and 6.2 feet of sea-level rise by 2100.

Why the miss? IPCC scientists couldn’t agree on a value for the contribution melting Greenland and Antarctic ice sheets would add to sea-level rise. So they simply left out the data to reach consensus. Science historian Naomi Oreskes calls this – one of IPCC’s biggest underestimates – “consensus by omission.”

Ocean Acidification

To its credit, the IPCC admits to vast climate change unknowns. Ocean acidification is one such impact.

Projection: Unmentioned as a threat in the 1990, 1995 and 2001 IPCC reports. First recognized in 2007, when IPCC projected acidification of between 0.14 and 0.35 pH units by 2100. “While the effects of observed ocean acidification on the marine biosphere are as yet undocumented,” said the report, “the progressive acidification of oceans is expected to have negative impacts on marine shell-forming organisms (e.g. corals) and their dependent species.”

Reality: The world’s oceans absorb about a quarter of the carbon dioxide humans release annually into the atmosphere. Since the start of the Industrial Revolution, the pH of surface ocean waters has fallen by 0.1 pH units. Since the pH scale is logarithmic, this change represents a stunning 30 percent increase in acidity.
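
Because pH is a base-10 logarithm of hydrogen-ion concentration, the arithmetic behind that percentage is easy to check. (A drop of exactly 0.1 units works out to about a 26% increase; the widely quoted ~30% figure corresponds to a slightly larger drop, e.g. from pH 8.21 to 8.10.)

```python
# pH is -log10 of hydrogen-ion concentration, so a drop of delta_pH units
# multiplies the concentration by 10 ** delta_pH.
delta_pH = 0.1
ratio = 10 ** delta_pH            # hydrogen-ion concentration multiplier
print(f"0.10 pH units -> +{(ratio - 1) * 100:.0f}% acidity")      # +26%

# The often-cited ~30% figure corresponds to a drop of about 0.11 units.
ratio_30 = 10 ** 0.11
print(f"0.11 pH units -> +{(ratio_30 - 1) * 100:.0f}% acidity")   # +29%
```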

Why the miss? Scientists didn’t have the data. They began studying acidification by the late 1990s, but there weren’t many papers on the topic until the mid-2000s, missing the submission deadline for IPCC’s 2001 report. Especially alarming are new findings that ocean temperatures and currents are causing parts of the seas to become acidic far faster than expected, threatening oysters and other shellfish.

National Oceanic and Atmospheric Administration chief Jane Lubchenco has called acidification the “equally evil twin” to global warming.

Thawing Tundra

Some carbon-cycle feedbacks that could vastly amplify climate change – especially a massive release of carbon and methane from thawing permafrost – are extremely hard to model.

Projection: In 2007, IPCC reported with “high confidence” that “methane emissions from tundra… and permafrost have accelerated in the past two decades, and are likely to accelerate further.” However, the IPCC offered no projections regarding permafrost melt.

Reality: Scientists estimate that the world’s permafrost holds 1.5 trillion tons of frozen carbon. That worries scientists: The Arctic is warming faster than anywhere else on Earth, and researchers are seeing soil temperatures climb rapidly, too. Some permafrost degradation is already occurring.

Large-scale tundra wildfires in 2012 added to the concern.

Why the miss? This is controversial science, with some researchers saying the Arctic tundra is stable, others saying it will defrost only over long periods of time, and still more convinced we are on the verge of a tipping point, where the tundra thaws rapidly and catastrophically. A major 2005 study, for instance, warned that the entire top 11 feet of global permafrost could disappear by century’s end, with potentially cataclysmic climate impacts.

The U.N. Environment Programme revealed this week that IPCC’s fifth assessment, due for release starting in September 2013, will again “not include the potential effects of the permafrost carbon feedback on global climate.”

Tipping points

The IPCC has been silent on tipping points – non-linear “light switch” moments when the climate system abruptly shifts from one paradigm to another.

Projection: IPCC has made no projections regarding tipping-point thresholds.

Reality: The scientific jury is still out as to whether we have reached any climate thresholds – a point of no return for, say, an ice-free Arctic, a Greenland meltdown, the slowing of the North Atlantic Ocean circulation, or permanent changes in large-scale weather patterns like the jet stream, El Niño or monsoons. The trouble with tipping points is they’re hard to spot until you’ve passed one.

Why the miss? Blame the computers: These non-linear events are notoriously hard to model. But with scientists recognizing the sizeable threat tipping points represent, they will be including some projections in the 2013-14 assessment.

Correction (Dec. 6, 2012): Earlier editions incorrectly compared global carbon dioxide emissions against carbon emissions scenarios. Carbon dioxide is heavier, incorrectly skewing the comparison. Global use of fossil fuels in 2010 produced about 30 billion tons of carbon dioxide but only 9.1 billion tons of carbon, putting emissions within the extreme end of IPCC scenarios. The story has been changed to reflect that.
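
The carbon-versus-carbon-dioxide distinction behind the correction comes down to molar mass. Assuming the standard 44/12 conversion factor (CO2 weighs roughly 44 g/mol against 12 g/mol for carbon alone):

```python
# Convert tons of carbon (C) to tons of carbon dioxide (CO2) using
# the ratio of molar masses: 44 g/mol for CO2 vs 12 g/mol for C.
C_TO_CO2 = 44.0 / 12.0    # ≈ 3.67

carbon_gt = 9.1           # billion tons of carbon emitted in 2010
co2_gt = carbon_gt * C_TO_CO2
print(f"{carbon_gt} Gt C = {co2_gt:.1f} Gt CO2")   # 33.4, i.e. "about 30 billion tons"
```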

© Glenn Scherer, 2012. All rights reserved.

Graphic of emissions scenario courtesy U.S. Global Change Research Program. Photo of activist warning of 6ºC warming © Adela Nistora. Graphic showing Arctic summer ice projections vs. observations by the Vancouver Observer.

Glenn Scherer is senior editor of Blue Ridge Press, a news service that has been providing environmental commentary and news to U.S. newspapers since 2007.

DailyClimate.org is a foundation-funded news service covering climate change. Contact editor Douglas Fischer at dfischer [at] dailyclimate.org

Scientists Pioneer Method to Predict Environmental Collapse (Science Daily)

Researcher Enlou Zhang takes a core sample from the bed of Lake Erhai in China. (Credit: University of Southampton)

Nov. 19, 2012 — Scientists at the University of Southampton are pioneering a technique to predict when an ecosystem is likely to collapse, which may also have potential for foretelling crises in agriculture, fisheries or even social systems.

The researchers have applied a mathematical model to a real world situation, the environmental collapse of a lake in China, to help prove a theory which suggests an ecosystem ‘flickers’, or fluctuates dramatically between healthy and unhealthy states, shortly before its eventual collapse.

Head of Geography at Southampton, Professor John Dearing explains: “We wanted to prove that this ‘flickering’ occurs just ahead of a dramatic change in a system — be it a social, ecological or climatic one — and that this method could potentially be used to predict future critical changes in other impacted systems in the world around us.”

A team led by Dr Rong Wang extracted core samples from sediment at the bottom of Lake Erhai in Yunnan province, China, and charted the levels and variation of fossilised algae (diatoms) over a 125-year period. Analysis of the core sample data showed the algae communities remained relatively stable until about 30 years before the lake’s collapse into a turbid or polluted state. However, the core samples for these last three decades showed much fluctuation, indicating there had been numerous dramatic changes in the types and concentrations of algae present in the water — evidence of the ‘flickering’ before the lake’s final definitive change of state.

Rong Wang comments: “By using the algae as a measure of the lake’s health, we have shown that its eco-system ‘wobbled’ before making a critical transition — in this instance, to a turbid state.

“Dramatic swings can be seen in other data, suggesting large external impacts on the lake over a long time period — for example, pollution from fertilisers, sewage from fields and changes in water levels — caused the system to switch back and forth rapidly between alternate states. Eventually, the lake’s ecosystem could no longer cope or recover — losing resilience and reaching what is called a ‘tipping point’ and collapsing altogether.”
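
The ‘flickering’ diagnostic can be sketched as a rolling-variance check, a standard early-warning-signal technique (not necessarily the exact analysis the Southampton team used). The data below are synthetic: a stable stretch followed by swings of growing amplitude.

```python
import random
random.seed(1)

def rolling_variance(series, window):
    """Population variance over a sliding window along the series."""
    out = []
    for i in range(window, len(series) + 1):
        chunk = series[i - window:i]
        mean = sum(chunk) / window
        out.append(sum((x - mean) ** 2 for x in chunk) / window)
    return out

# Synthetic 'lake health' record: a long stable stretch, then
# 'flickering' (noise of growing amplitude) just before collapse.
stable = [random.gauss(0, 0.5) for _ in range(100)]
flicker = [random.gauss(0, 0.5) * (1 + 0.2 * t) for t in range(30)]
record = stable + flicker

variances = rolling_variance(record, window=20)
print(f"variance early: {variances[0]:.2f}, late: {variances[-1]:.2f}")
```

Rising variance in the most recent window (along with rising autocorrelation, in the early-warning literature) is the warning sign of an approaching transition.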

The researchers hope the method they have trialled in China could be applied to other regions and landscapes.

Co-author Dr Pete Langdon comments: “In this case, we used algae as a marker of how the lake’s ecosystem was holding up against external impacts — but who’s to say we couldn’t use this method in other ways? For example, perhaps we should look for ‘flickering’ signals in climate data to try and foretell impending crises?”

Journal Reference:

  1. Rong Wang, John A. Dearing, Peter G. Langdon, Enlou Zhang, Xiangdong Yang, Vasilis Dakos, Marten Scheffer. Flickering gives early warning signals of a critical transition to a eutrophic lake state. Nature, 2012; DOI: 10.1038/nature11655

Do We Live in a Computer Simulation Run by Our Descendants? Researchers Say Idea Can Be Tested (Science Daily)

The conical (red) surface shows the relationship between energy and momentum in special relativity, a fundamental theory concerning space and time developed by Albert Einstein, and is the expected result if our universe is not a simulation. The flat (blue) surface illustrates the relationship between energy and momentum that would be expected if the universe is a simulation with an underlying cubic lattice. (Credit: Martin Savage)

Dec. 10, 2012 — A decade ago, a British philosopher put forth the notion that the universe we live in might in fact be a computer simulation run by our descendants. While that seems far-fetched, perhaps even incomprehensible, a team of physicists at the University of Washington has come up with a potential test to see if the idea holds water.

The concept that current humanity could possibly be living in a computer simulation comes from a 2003 paper published in Philosophical Quarterly by Nick Bostrom, a philosophy professor at the University of Oxford. In the paper, he argued that at least one of three possibilities is true:

  • The human species is likely to go extinct before reaching a “posthuman” stage.
  • Any posthuman civilization is very unlikely to run a significant number of simulations of its evolutionary history.
  • We are almost certainly living in a computer simulation.

He also held that “the belief that there is a significant chance that we will one day become posthumans who run ancestor simulations is false, unless we are currently living in a simulation.”

With current limitations and trends in computing, it will be decades before researchers will be able to run even primitive simulations of the universe. But the UW team has suggested tests that can be performed now, or in the near future, that are sensitive to constraints imposed on future simulations by limited resources.

Currently, supercomputers using a technique called lattice quantum chromodynamics and starting from the fundamental physical laws that govern the universe can simulate only a very small portion of the universe, on the scale of one 100-trillionth of a meter, a little larger than the nucleus of an atom, said Martin Savage, a UW physics professor.

Eventually, more powerful simulations will be able to model on the scale of a molecule, then a cell and even a human being. But it will take many generations of growth in computing power to be able to simulate a large enough chunk of the universe to understand the constraints on physical processes that would indicate we are living in a computer model.

However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.

The supercomputers performing lattice quantum chromodynamics calculations essentially divide space-time into a four-dimensional grid. That allows researchers to examine what is called the strong force, one of the four fundamental forces of nature and the one that binds subatomic particles called quarks and gluons together into neutrons and protons at the core of atoms.

“If you make the simulations big enough, something like our universe should emerge,” Savage said. Then it would be a matter of looking for a “signature” in our universe that has an analog in the current small-scale simulations.

Savage and colleagues Silas Beane of the University of New Hampshire, who collaborated while at the UW’s Institute for Nuclear Theory, and Zohreh Davoudi, a UW physics graduate student, suggest that the signature could show up as a limitation in the energy of cosmic rays.

In a paper they have posted on arXiv, an online archive for preprints of scientific papers in a number of fields, including physics, they say that the highest-energy cosmic rays would not travel along the edges of the lattice in the model but would travel diagonally, and they would not interact equally in all directions as they otherwise would be expected to do.
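
The lattice “signature” can be illustrated with a toy dispersion relation. This is only a sketch of how a cubic grid breaks rotational symmetry, not the authors’ actual lattice-QCD calculation; the lattice spacing and momentum values are arbitrary illustrative choices.

```python
import math

a = 1.0  # lattice spacing (arbitrary units, chosen for illustration)

def continuum_E(px, py, pz, m=0.0):
    """Relativistic energy-momentum relation: E^2 = m^2 + p^2."""
    return math.sqrt(m**2 + px**2 + py**2 + pz**2)

def lattice_E(px, py, pz, m=0.0):
    """Simplest lattice dispersion: each momentum component p is replaced
    by (2/a)*sin(p*a/2), which depends on direction relative to the grid."""
    s = sum(((2 / a) * math.sin(p * a / 2)) ** 2 for p in (px, py, pz))
    return math.sqrt(m**2 + s)

p = 1.5  # a sizeable fraction of the lattice cutoff pi/a
axis = lattice_E(p, 0, 0)  # momentum along a grid axis
diag = lattice_E(p / math.sqrt(3), p / math.sqrt(3), p / math.sqrt(3))
print(f"continuum: {continuum_E(p, 0, 0):.3f}, axis: {axis:.3f}, diag: {diag:.3f}")
```

At high momentum the lattice energy falls below the continuum value, and by less along the diagonal than along an axis, which is the kind of direction-dependent effect the cosmic-ray test looks for.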

“This is the first testable signature of such an idea,” Savage said.

If such a concept turned out to be reality, it would raise other possibilities as well. For example, Davoudi suggests that if our universe is a simulation, then those running it could be running other simulations as well, essentially creating other universes parallel to our own.

“Then the question is, ‘Can you communicate with those other universes if they are running on the same platform?'” she said.

Journal References:

  1. Silas R. Beane, Zohreh Davoudi, Martin J. Savage. Constraints on the Universe as a Numerical Simulation. arXiv, 2012 [link]
  2. Nick Bostrom. Are You Living in a Computer Simulation? Philosophical Quarterly, (2003) Vol. 53, No. 211, pp. 243-255 [link]

‘Missing’ Polar Weather Systems Could Impact Climate Predictions (Science Daily)

Intense but small-scale polar storms could make a big difference to climate predictions according to new research. (Credit: NEODAAS / University of Dundee)

Dec. 16, 2012 — Intense but small-scale polar storms could make a big difference to climate predictions, according to new research from the University of East Anglia and the University of Massachusetts.

Difficult-to-forecast polar mesoscale storms occur frequently over the polar seas; however, they are missing in most climate models.

Research published Dec. 16 in Nature Geoscience shows that their inclusion could paint a different picture of climate change in years to come.

Polar mesoscale storms are capable of producing hurricane-strength winds which cool the ocean and lead to changes in its circulation.

Prof Ian Renfrew, from UEA’s School of Environmental Sciences, said: “These polar lows are typically under 500 km in diameter and over within 24-36 hours. They’re difficult to predict, but we have shown they play an important role in driving large-scale ocean circulation.

“There are hundreds of them a year in the North Atlantic, and dozens of strong ones. They create a lot of stormy weather, strong winds and snowfall — particularly over Norway, Iceland, and Canada, and occasionally over Britain, such as in 2003 when a massive dump of snow brought the M11 to a standstill for 24 hours.

“We have shown that adding polar storms into computer-generated models of the ocean results in significant changes in ocean circulation — including an increase in heat travelling north in the Atlantic Ocean and more overturning in the Sub-polar seas.

“At present, climate models don’t have a high enough resolution to account for these small-scale polar lows.

“As Arctic Sea ice continues to retreat, polar lows are likely to migrate further north, which could have consequences for the ‘thermohaline’ or northward ocean circulation — potentially leading to it weakening.”

Alan Condron from the University of Massachusetts said: “By simulating polar lows, we find that the area of the ocean that becomes denser and sinks each year increases and causes the amount of heat being transported towards Europe to intensify.

“The fact that climate models are not simulating these storms is a real problem because these models will incorrectly predict how much heat is being moved northward towards the poles. This will make it very difficult to reliably predict how the climate of Europe and North America will change in the near-future.”

Prof Renfrew added: “Climate models are always improving, and there is a trade-off between the resolution of the model, the complexity of the model, and the number of simulations you can carry out. Our work suggests we should put some more effort into resolving such storms.”

‘The impact of polar mesoscale storms on Northeast Atlantic ocean circulation’ by Alan Condron from the University of Massachusetts (US) and Ian Renfrew from UEA (UK), is published in Nature Geoscience on December 16, 2012.

Journal Reference:

  1. Alan Condron, Ian A. Renfrew. The impact of polar mesoscale storms on northeast Atlantic Ocean circulation. Nature Geoscience, 2012; DOI: 10.1038/ngeo1661

Physicist Happens Upon Rain Data Breakthrough (Science Daily)

John Lane looks over data recorded from his laser system as he refines his process and formula to calibrate measurements of raindrops. (Credit: NASA/Jim Grossmann)

Dec. 3, 2012 — A physicist and researcher who set out to develop a formula to protect Apollo sites on the moon from rocket exhaust may have happened upon a way to improve weather forecasting on Earth.

Working in his backyard during rain showers and storms, John Lane, a physicist at NASA’s Kennedy Space Center in Florida, found that the laser and reflector he was developing to track lunar dust could also accurately determine the size of raindrops, something weather radar and other meteorological systems estimate, but don’t measure.

The special quantity measured by the laser system is called the “second moment of the size distribution,” which corresponds to the average cross-sectional area of raindrops passing through the laser beam.
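
In practice the second moment is just the average of diameter squared, which scales directly with mean cross-sectional area. A minimal sketch (the drop diameters here are invented for illustration, not Lane’s data):

```python
import math

# Hypothetical sample of raindrop diameters (mm) crossing the beam.
diameters_mm = [0.8, 1.2, 1.5, 2.1, 0.9, 1.7, 1.3]

# Second moment of the size distribution: the mean of d**2.
second_moment = sum(d**2 for d in diameters_mm) / len(diameters_mm)

# Mean cross-sectional area is proportional to it: (pi/4) * <d^2>.
mean_area_mm2 = math.pi / 4 * second_moment
print(f"<d^2> = {second_moment:.2f} mm^2, mean cross-section = {mean_area_mm2:.2f} mm^2")
```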

“It’s not often that you’re studying lunar dust and it ends up producing benefits in weather forecasting,” said Phil Metzger, a physicist who leads the Granular Mechanics and Regolith Operations Lab, part of the Surface Systems Office at Kennedy.

Lane said the additional piece of information would be useful in filling out the complex computer calculations used to determine the current conditions and forecast the weather.

“We may be able to refine (computer weather) models to make them more accurate,” Lane said. “Weather radar data analysis makes assumptions about raindrop size, so I think this could improve the overall drop size distribution estimates.”

The breakthrough came because Metzger and Lane were looking for a way to calibrate a laser sensor to pick up the fine particles of blowing lunar dust and soil. It turns out that rain is a good stand-in for flying lunar soil.

“I was pretty skeptical in the beginning that the numbers would come out anywhere close,” Lane said. “Anytime you do something new, it’s a risk that you’re just wasting your time.”

The genesis of the research was the need to find out how much damage would be done by robotic landers getting too close to the six places on the moon where Apollo astronauts landed, lived and worked.

NASA fears that dust and soil particles thrown up by the rocket exhaust of a lander will scour and perhaps puncture the metal skin of the lunar module descent stages and experiment hardware left behind by the astronauts from 1969 to 1972.

“It’s like sandblasting, if you have something coming down like a rocket engine, and it lifts up this dust, there’s not air, so it just keeps going fast,” Lane said. “Some of the stuff can actually reach escape velocity and go into orbit.”

Such impacts to those materials could ruin their scientific value to researchers on Earth who want to know what happens to human-made materials left on another world for more than 40 years.

“The Apollo sites have value scientifically and from an engineering perspective because they are a record of how these materials on the moon have interacted with the solar system over 40 years,” Metzger said. “They are witness plates to the environment.”

There also are numerous bags of waste from the astronauts lying up there that biologists want to examine simply to see if living organisms can survive on the moon for almost five decades where there is no air and there is a constant bombardment of cosmic radiation.

“If anybody goes back and sprays stuff on the bags or touches the bags, they ruin the experiment,” Metzger said. “It’s not just the scientific and engineering value. They believe the Apollo sites are the most important archaeological sites in the human sphere, more important than the pyramids because it’s the first place humans stepped off the planet. And from a national point of view, these are symbols of our country and we don’t want them to be damaged by wanton ransacking.”

Current thinking anticipates placing a laser sensor on the bottom of one of the landers taking part in the Google Lunar X Prize competition. The sensor should be able to pick up the blowing dust and soil and give researchers a clear set of results so they can formulate restrictions for other landers, such as how far away from the Apollo sites new landers can touch down.

As research continues into the laser sensor, Lane expects the work to continue on the weather forecasting side of the equation, too. Lane already presented some of his findings at a meteorological conference and is working on a research paper to detail the work. “This is one of those topics that span a lot of areas of science,” Lane said.

Water Resources Management and Policy in a Changing World: Where Do We Go from Here? (Science Daily)

Nov. 26, 2012 — Visualize a dusty place where stream beds are sand and lakes are flats of dried mud. Are we on Mars? In fact, we’re on arid parts of Earth, a planet where water covers some 70 percent of the surface.

How long will water be readily available to nourish life here?

Scientists funded by the National Science Foundation’s (NSF) Dynamics of Coupled Natural and Human Systems (CNH) program are finding new answers.

NSF-supported CNH researchers will address water resources management and policy in a changing world at the fall meeting of the American Geophysical Union (AGU), held in San Francisco from Dec. 3-7, 2012.

In the United States, more than 36 states face water shortages. Other parts of the world are faring no better.

What are the causes? Do the reasons lie in climate change, population growth or still other factors?

Among the topics to be covered at AGU are sociohydrology, patterns in coupled human-water resource systems and the resilience of coupled natural and human systems to global change.

Researchers will report, for example, that human population growth in the Andes outweighs climate change as the culprit in the region’s dwindling water supplies. Does the finding apply in other places, and perhaps around the globe?

Scientists presenting results are affiliated with CHANS-Net, an international network of researchers who study coupled natural and human systems.

NSF’s CNH program supports CHANS-Net, with coordination from the Center for Systems Integration and Sustainability at Michigan State University.

CHANS-Net facilitates communication and collaboration among scientists, engineers and educators striving to find sustainable solutions that benefit the environment while enabling people to thrive.

“For more than a decade, NSF’s CNH program has supported projects that explore the complex ways people and natural systems interact with each other,” says Tom Baerwald, NSF CNH program director.

“CHANS-Net and its investigators represent a broad range of projects. They’re developing a new, better understanding of how our planet works. CHANS-Net researchers are finding practical answers for how people can prosper while maintaining environmental quality.”

CNH and CHANS-Net are part of NSF’s Science, Engineering and Education for Sustainability (SEES) investment. NSF’s Directorates for Geosciences; Social, Behavioral and Economic Sciences; and Biological Sciences support the CNH program.

“CHANS-Net has grown to more than 1,000 members who span generations of natural and social scientists from around the world,” says Jianguo “Jack” Liu, principal investigator of CHANS-Net and Rachel Carson Chair in Sustainability at Michigan State University.

“CHANS-Net is very happy to support another 10 CHANS Fellows–outstanding young scientists–to attend AGU, give presentations there, and learn from leaders in CHANS research and build professional networks. We’re looking forward to these exciting annual CHANS-Net events.”

Speakers at AGU sessions organized by CHANS-Net will discuss such subjects as the importance of water conservation in the 21st century; the Gila River and whether its flows might reduce the risk of water shortages in the Colorado River Basin; and historical evolution of the hydrological functioning of the old Lake Xochimilco in the southern Mexico Basin.

Other topics to be addressed include water conflicts in a changing world; system modeling of the Great Salt Lake in Utah to improve the hydro-ecological performance of diked wetlands; and integrating economics into water resources systems analysis.

“Of all our natural resources, water has become the most precious,” wrote Rachel Carson in 1962 in Silent Spring. “By a strange paradox, most of the Earth’s abundant water is not usable for agriculture, industry, or human consumption because of its heavy load of sea salts, and so most of the world’s population is either experiencing or is threatened with critical shortages.”

Fifty years later, more than 100 scientists will present research reflecting Rachel Carson’s conviction that “seldom if ever does nature operate in closed and separate compartments, and she has not done so in distributing Earth’s water supply.”

Go With the Flow in Flood Prediction (Science Daily)

Dec. 3, 2012 — Floods have once again wreaked havoc across the country, and climate scientists and meteorologists suggest that the problem is only going to get worse, with wetter winters and rivers bursting their banks becoming the norm. A team based at Newcastle University, together with colleagues in China, has developed a computer model that can work out how a flood flow will develop and where flooding will be worst, based on an understanding of fluid dynamics and the underlying topography of a region.

Writing in the journal Progress in Computational Fluid Dynamics, Newcastle civil engineer Qiuhua Liang and his colleagues Chi Zhang of the Dalian University of Technology and Junxian Yin of the China Institute of Water Resources and Hydropower Research in Beijing explain how they have developed an adaptive computer model that could provide accurate and efficient predictions about the flow of water as a flood occurs. Such a model might provide environmental agencies and authorities with a more precise early-warning system for residents and businesses in a region at risk of flood. It could also be used by insurance companies to determine the relative risk of different areas within a given region and so make their underwriting of the risk economically viable.

The model is based on a numerical solution of the hydrodynamic equations of fluid flow. This allows the researchers to plot the likely movement of water during a dam break or flash flood over different kinds of terrain and around obstacles, even when flood waves are spreading rapidly. The researchers have successfully tested their model on real-world flood data.
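The paper itself is not reproduced here, but the general kind of model the article describes — a time-stepped numerical solution of the shallow-water (Saint-Venant) equations over a terrain profile — can be illustrated with a deliberately simple sketch. The code below is a generic first-order Lax-Friedrichs scheme applied to an idealised 1-D dam break; it is not the authors' adaptive method, and the grid size, time step, and initial depths are illustrative assumptions.

```python
# Minimal 1-D shallow-water (Saint-Venant) solver using a first-order
# Lax-Friedrichs scheme over flat terrain. A generic sketch of this
# class of hydrodynamic flood model, NOT the authors' adaptive method.
# State per cell: water depth h (m) and momentum hu (m^2/s).

g = 9.81  # gravitational acceleration (m/s^2)

def flux(h, hu):
    """Physical flux of the 1-D shallow-water equations."""
    u = hu / h if h > 0 else 0.0
    return (hu, hu * u + 0.5 * g * h * h)

def step(h, hu, dx, dt):
    """Advance one Lax-Friedrichs time step; boundary cells are held fixed."""
    n = len(h)
    new_h, new_hu = h[:], hu[:]
    for i in range(1, n - 1):
        fl = flux(h[i - 1], hu[i - 1])
        fr = flux(h[i + 1], hu[i + 1])
        new_h[i] = 0.5 * (h[i - 1] + h[i + 1]) - dt / (2 * dx) * (fr[0] - fl[0])
        new_hu[i] = 0.5 * (hu[i - 1] + hu[i + 1]) - dt / (2 * dx) * (fr[1] - fl[1])
    return new_h, new_hu

# Idealised dam break: 2 m of water on the left, 1 m on the right.
n, dx, dt = 200, 1.0, 0.02  # illustrative grid and time step (CFL-safe)
h = [2.0 if i < n // 2 else 1.0 for i in range(n)]
hu = [0.0] * n
for _ in range(100):
    h, hu = step(h, hu, dx, dt)
# The initial step smooths into an intermediate depth at the dam site,
# with a wave propagating into the shallow region.
```

Real flood models such as the one described add adaptive grids, two dimensions, real topography, and friction terms; the sketch only shows the core idea of marching conserved flow quantities forward in time from cell-interface fluxes.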

The team points out that flood disasters have become a major threat to human lives and assets. “Flood management is therefore an important task for different levels of governments and authorities in many countries”, the researchers explain. “The availability of accurate and efficient flood modelling tools is vital to assist engineers and managers charged with flood risk assessment, prevention and alleviation.”

Journal Reference:

  1. Chi Zhang, Qiuhua Liang, Junxian Yin. A first-order adaptive solution to rapidly spreading flood waves. Progress in Computational Fluid Dynamics, An International Journal, 2013; 13 (1): 1. DOI: 10.1504/PCFD.2013.050645

We Are Basically Honest – Except When We Are at Work, Study Suggests (Science Daily)

Dec. 14, 2012 — A new study suggests we are more honest than you might think. The research, by the University of Oxford and the University of Bonn, indicates that it pains us to tell lies, particularly when we are in our own homes. It appears that being honest is hugely important to our sense of who we are. However, while it might bother us to tell lies at home, we are less circumspect at work, where we are probably more likely to bend the truth, the study suggests.

The researchers conducted simple honesty tests by ringing people in their own homes in Germany and asking them to flip a coin. The study participants were asked over the phone to report on how it landed. The catch to this test was that each of the individuals taking part was given a strong financial incentive to lie without the fear of being found out. The study participants were told that if the coin landed tails up, they would receive 15 euros or a gift voucher; while if the coin landed heads up, they would receive nothing.

The researchers contacted people using randomly generated home phone numbers; 658 agreed to take part. Although the researchers could not directly observe the behaviour of the individuals in their own homes, the aggregated reports show a remarkably high level of honesty. Over half of the study participants (55.6 per cent) reported that the coin landed heads up, which meant they would receive nothing. Only 44.4 per cent reported tails up, collecting the financial reward as a result.
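As a back-of-the-envelope check on the arithmetic behind "remarkably high level of honesty", the article's rounded figures (44.4 per cent tails among 658 respondents) can be compared against what a fair coin and fully honest reporting would produce. The numbers below are reconstructed from the article's rounded percentages, not from the study's raw data.

```python
# Rough reconstruction from the article's rounded figures: about 44.4%
# of 658 participants reported tails (the rewarded outcome). Under
# universal honesty and a fair coin, tails reports follow
# Binomial(n=658, p=0.5); under universal lying they would approach 100%.
from math import comb

n = 658
tails_reported = round(0.444 * n)  # ~292 reports that earned the reward

# Exact probability of seeing this many or fewer tails reports
# if everyone reported a fair coin honestly.
p_at_most = sum(comb(n, k) for k in range(tails_reported + 1)) / 2 ** n

print(f"reported tails: {tails_reported}/{n} ({tails_reported / n:.1%})")
print(f"P(tails <= {tails_reported}) under honest reporting: {p_at_most:.4f}")
```

The interesting wrinkle this makes visible: 44.4 per cent is not merely consistent with honesty, it is somewhat *below* the roughly 50 per cent that a fair coin would produce, so if anything the respondents under-claimed the reward.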

A second, similar test was done with 94 participants over the phone. This time they were asked to report the results of four consecutive coin tosses, with the promise of five euros for every time the coin landed tails up. Despite a potential maximum pay-off of 20 euros, the reports the researchers received matched the expected distribution of a fair coin, which would land tails up around 50 per cent of the time.

All those taking part in the experiments answered questions about their own gender, age, views on honesty and their religious background. The study suggests, however, that personal attributes play no part here as the overall level of honesty demonstrated in both experiments was high.

This latest study can be compared with previous similar studies conducted with students in tightly controlled laboratory situations. In those studies around 75 per cent of participants reported tails up, which the researchers suggest implies that people are more honest when they are in their own homes.

Dr Johannes Abeler, from the Department of Economics at the University of Oxford, said: ‘The fact that the financial incentive to lie was outweighed by the perceived cost of lying shows just how honest most people are when they are in their own homes. One theory is that being honest is at the very core of how we want to perceive ourselves and is very important to our sense of self-identity. Why is it so important? It may be to do with the social norms we have been given about what is right and wrong from the moment we could walk and talk.

‘This study has implications for policy-makers. For instance, if they want to catch those involved in fraudulent behaviour, perhaps the forms and questionnaires could be designed to reveal more about our personal lives and sense of self-identity. Our experiments showed that if people plainly see that to lie in a given situation would be fraudulent, they shy away from it. However, if people are given “wriggle room,” they can convince themselves that their behaviour is not fraudulent and this does not attack their sense of who they are.’

The computer-assisted telephone interviews were carried out by the Institute for Applied Social Sciences (infas), a well-known private German research institute, between November 2010 and February 2011. Telephone numbers were selected using a random-digit-dialling technique, with numbers drawn at random from a data set of all potential landline telephone numbers in Germany. Part of the study consisted of questions relating to the participants’ social background, age and education, their economic and political preferences, their religious beliefs, their attitudes to crime, and their beliefs about other people’s behaviour in the experiment.

The Opportunistic Apocalypse (Savage Minds)

by Clare A. Sammells on December 14th, 2012

The third in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first two posts are here and here.

There are opportunities in the apocalypse.  The end of the world has been commodified.  A few are seriously investing in bunkers, boats, and survival supplies. Tourism is up, not only to Mayan archaeological sites, but also to places like Bugarach, France and Mt. Rtanj, Serbia.  But even those of us on a budget can afford at least a book, a T-shirt or a handbag.

There are opportunities here for academics, too. Many scholars have been quoted in the press lately saying that nothing will happen on Dec. 21, in addition to those who have written comprehensive books and articles discrediting the impending doom. Obviously publishing helps individual careers, and that does not detract from our collective responsibility to debunk ideas that might lead people to physical or financial harm. But neither can we divorce our work from its larger social implications.

It is telling that the main scholarly players in debunking the Mayan Apocalypse in the U.S. are NASA (which is facing budget cuts) and anthropologists. Both groups feel the need to prove they are relevant because our collective jobs depend on it. I don’t need to go into great detail with this crowd about academia’s current situation. An academic career has gone from being a well-respected, stable job to one where most classes are taught by underpaid, uninsured part-time adjuncts, and many Ph.D.s never find work in academia at all. Tuition fees for undergraduates have skyrocketed while full-time faculty salaries have stagnated.

Among the public (too often talked about as being in “the real world,” as if academics were somehow immune to taxes or swine flu), there seems to be a general distrust of intellectuals. That, combined with the current economic situation, has translated into a loss of research funding, such as cuts to the Fulbright program and NSF. Some public officials specifically state that science and engineering are worth funding, but anthropology is not.  To add insult to injury, the University of California wants to move away from that whole “reading” thing and rebrand itself as a web startup.

Articles, books with general readership, being quoted in the newspaper, and yes, blogging are all concrete ways to show funding agencies and review committees that what we do matters. The way to get exposure among those general audiences is to engage with what interests them — like the end of the world. Dec. 21, 2012 has become an internet meme. Many online references to it are debunkings or tongue-in-cheek. Newspaper articles on unrelated topics make passing references in jest, stores offer just-in-case-it’s-real sales, and people are planning parties. There seems to be more written to discredit the apocalypse, or make fun of it, than to prepare for it.

We need to remember that this non-believer attention has a purpose, and that purpose is not just (or even primarily) about convincing believers that nothing is going to happen. Rather, it serves to demonstrate something about non-believers themselves. “We” are sensible and logical, while “they” are superstitious and credulous. “We” value science and data, while “they” turn to astrology, misreadings of ancient texts, and esoteric spirituality. “We” remember the non-apocalypses of the past, while “they” have forgotten.

I would argue that discrediting the Mayan Apocalypse is part of an ongoing process of creating western modernity (cue Latour). That modernity requires an “other,” and here that “other” is defined primarily by religious/spiritual belief in the Mayan apocalypse. The more “other” these Apocalypse believers are, the more clearly they reflect the modernity of non-believers. (Of course, there are also the “others” of the Maya themselves, and I’ll address that issue in my next post.)

This returns us to the difference I drew in my first post between “Transitional Apocalyptic Expectations” (TAE) and “Catastrophic Apocalyptic Expectations” (CAE). I suspect the majority of believers are expecting something like a TAE-type event, but media attention focuses on discrediting CAE beliefs, such as a rogue planet hitting the Earth or massive floods. These would be dire catastrophes, but they are also far easier to disprove. We will all notice if a planet does or does not hit the Earth next week, but many of us — myself included — would miss a transformation in human consciousness among the enlightened.

By providing the (very real) scientific data to discredit the apocalypse, scholars are incorporated into this project of modernity.  Much of the scholarly work on this phenomenon is fascinating and subtle, but the press picks up on two main themes.  One is scientific proof that the apocalypse will not happen, such as astronomical data that Earth is not on a collision course with another planet, Mayan epigraphy that shows the Long Count does not really end, and ethnography that suggests most Maya themselves are not worried about any of this.  The other scholarly theme the press circulates is the long history of apocalyptic beliefs in the west.  In the logic of the metanarrative of western progress, this connects contemporary Apocalypse believers to the past, nonmodernity and “otherness.”

I now find myself in an uncomfortable position, although it is an intellectually interesting corner to be backed into. I agree with my colleagues that the world will not end, that Mayan ideas have been misappropriated, and that we have a responsibility to address public concerns. At the same time, I can’t help but feel we are being drawn, either reluctantly or willingly, into a larger project that extends far beyond next week.

*   *   *

2012, the movie we love to hate

by  on December 11th, 2012

The second in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first post is here.

Last summer, I traveled to Philadelphia to visit the Penn Museum exhibit “Maya: the Lords of Time.” It was, as one might expect given the museum collection and the scholars involved, fantastic. I want to comment on just the beginning of the exhibit, however. On entering, one is immediately greeted by a wall crowded with TV screens, all showing different clips of predicted disasters and people talking fearfully about the end of the world. The destruction, paranoia, and cacophony create an ambiance of chaos and uncertainty. Turning the corner, these images are replaced by widely spaced Mayan artifacts and stelae. The effect is striking. One moves from media-induced insanity to serenity, from endless disturbing jump-cuts to the well-lit, quiet contemplation of beautiful art.

Among these images were scenes from Director Roland Emmerich’s blockbuster film 2012 (2009). This over-the-top disaster film is well used in that context.  Still, it is interesting how often 2012 is mentioned by academics and other debunkers — almost as often as they mention serious alternative thinkers about the Mayan calendar, such as Jose Arguelles (although the film receives less in-depth coverage than he does).

I find this interesting because 2012 is clearly not trying to convince us to stockpile canned goods or build boats to prepare for the end of the Maya Long Count, any more than Emmerich’s previous films were meant to prepare us for alien invasion (Independence Day, 1996) or the effects of global climate change (The Day After Tomorrow, 2004). Like Emmerich’s previous films, 2012 is a chance to watch the urban industrialized world burn (in that way, it has much in common with the currently popular zombie film genre). If you want to see John Cusack survive increasingly implausible crumbling urban landscapes, this film is for you.

The Maya, however, are barely mentioned in 2012. There are no Mayan characters, no one travels to Mesoamerica, there is no mention of the Long Count.  Emmerich’s goal for 2012 was, in his own words (here and here), “a modern retelling of Noah’s Ark.” In fact, he claims that the movie originally had nothing to do with the 2012 phenomenon at all.  Instead, he was convinced – reluctantly – to include the concept because of public interest in the Maya calendar.

This explains why the Maya only receive two passing mentions in 2012 — one is a brief comment that even “they” had been able to predict the end of the world, the other a short news report on a cult suicide in Tikal. The marketing aspect of the film emphasized these Maya themes (all of the film footage about the Maya is in the trailer, the movie website starts with a rotating image of the Maya calendar, and there are related extras on the DVD), but the movie itself had basically nothing to do with the Maya, the Mayan Long Count, or Dec 21.

Nevertheless, this film’s impact on public interest in Dec. 21 is measurable. Google Trends, which gives data on the number of times particular search terms are used, gives us a sense of the impact of this $200,000,000 film. I looked at a number of related terms, but have picked the ones that show the general pattern: there is a spike of interest in 2012 apocalyptic ideas when the 2012 marketing campaign starts (November 2008), a huge spike when the film is released (November 2009), and a higher baseline of interest from then until now. Since January, interest in the Mayan calendar/apocalypse has been steadily climbing (and in fact, is higher every time I check this link; it automatically updates). In other words, the 2012 movie both responded to, and reinforced, public interest in the 2012 phenomenon.

Here I return to Michael D. Gordin’s The Pseudoscience Wars (2012). This delightful book deals with the scientific response to Velikovsky, who believed that the miracles of the Old Testament and other ancient myths documented the emergence of a comet from Jupiter, its traumatic interactions with Earth, and its eventual settling into the role of the planet Venus. (The final chapter also discusses the 2012 situation.) Gordin’s main focus is understanding why Velikovsky — unlike others labeled “crackpots” before him — stirred the public ire of astronomers and physicists. Academics’ real concern was not Velikovsky’s ideas per se, but how much attention he received by being published by Macmillan — a major publisher of science textbooks — which implied the book had scientific legitimacy. Velikovsky’s Worlds in Collision was a major bestseller when it was released in 1950, and academics felt the ideas had to be addressed so that the public would not be misled.

With the Mayan Apocalypse, no major academic publisher is lending legitimacy to these theories. Books about expected events of 2012 (mainly TAE ideas) are published by specialty presses that focus on the spiritual counterculture, such as Evolver Editions, Inner Traditions/Bear & Company, Shambhala, and John Hunt Publishing. Instead, film media has become the battleground for public attention (perhaps because reading is declining?). The immense amount of money put into movies, documentaries, and TV shows about the Mayan Apocalypse is creating public interest today, and in some ways this parallels what Macmillan did for Velikovsky in the 1950s.

One example of this is the viral marketing campaign for 2012 conducted in November 2008. Columbia Pictures created webpages that were not clearly marked as advertising (these no longer appear to be available), promoting the idea that scientists really did know the world would end and were preparing. This type of advertising was not unique to this film, but in this case it reinforced already existing fears that the end really was nigh. NASA began responding to public fears about 2012 as a result of this marketing campaign, and many of the academics interested in addressing these concerns also published after this time.

Academics are caught in something of a bind here. Do we respond to public fears, in the hopes of debunking them, but no doubt also increasing public interest in the very ideas we wish to discredit? Should we respond in the hopes of selling a few more books or receiving a few more citations, thus generating interest in the rest of what our discipline does? As anthropologists we are certainly not immune to the lure of public interest (obviously I’m not — here I am, blogging away), nor should we be. Perhaps something good can come of the non-end-of-the-world. I’ll turn to this question next time.

*   *   *

The End is Nigh. Start blogging.

by Clare A. Sammells on December 4th, 2012

Savage Minds welcomes guest blogger Clare A. Sammells.

My thanks to the editors of Savage Minds for allowing me to guest blog this month. Hopefully I will not be among the last of Savage Minds’ guests, given that the End of the World is nigh.

You hadn’t heard? On or around Dec. 21, 2012, the Maya Long Count will mark the end of a 5,125-year cycle. Will this be a mere calendrical turn, no more inherently eventful than the transition from Dec. 31, 2012 to Jan. 1, 2013? Will it be a moment of astronomical alignments, fiery conflagrations, and social upheavals? Or will there be a shift in human consciousness, an opportunity for the prepared to improve their lives and achieve enlightenment?

I am going to bet with the house: I do not think the world is going to end in a few weeks.  That way, either the world doesn’t end — another victory for predictive anthropology! — or the world does end, and nothing I write here will matter much anyway. (More seriously, I don’t think our world is destined to end with a bang).

I am not a Mayanist, an archaeologist, or an astronomer. I won’t be discussing conflicting interpretations of Maya Long Count dates, astronomical observations, or Classical-era Maya stela inscriptions. Books by David Stuart, Anthony Aveni, and Matthew Restall and Amara Solari all provide detailed arguments using those data, and analyze the current phenomenon in light of the long history of western fascinations with End Times. Articles by John Hoopes, Kevin Whitesides, and Robert Sitler, among others, address “New Age” interpretations of the Maya. Many ethnographers have considered how Maya peoples understand their complex interactions with “New Age” spiritualists and tourists, among them Judith Maxwell, Quetzil Casteneda, and Walter Little.

My own interest lies in how indigenous timekeeping is interpreted in the Andes. I conducted ethnographic research focusing on tourism in Tiwanaku, Bolivia — a pre-Incan archaeological site near Lake Titicaca, and a contemporary Aymara village.  One of the first things I noticed was that every tour guide tells visitors about multiple calendars inscribed in the stones of the site, most famously in the Puerta del Sol.  These calendrical interpretations are meaningful to Bolivian visitors, foreign tourists, and local Tiwanakenos for understanding the histories, ethnicities, and politics centered in this place. I took a stab at addressing some of these ideas in a recent article, where I considered how interconnected archaeological theories and political projects of the 1930s fed into what is today accepted conventional knowledge about Tiwanakota calendars.  I’m now putting together a book manuscript about temporal intersections in Tiwanaku.  The parallels between that situation and the Maya 2012 Phenomena led me to consider the prophecies, expectations, YouTube videos, blog posts, scholarly debunkings, and tourist travels motivated by the end of the Maya Long Count.

A survey by the National Geographic Channel suggested that 27% of those in the United States think the Maya may have predicted a catastrophe for December 21. But it is important to note that there is no agreement, even among believers, about what will happen. I tend to think of these beliefs as collecting into two broad (and often overlapping) camps.

Many believe that “something” will happen on (or around) Dec 21, 2012, but do not anticipate world destruction. I think of these beliefs as “Transitional Apocalyptic Expectations” (TAE). Writers such as José Argüelles and John Major Jenkins, for example, believe that there will be a shift in human consciousness, and tend to view the end of the 13th baktun as an opportunity for human improvement.

On the other hand, there are those who believe that the world will end abruptly, in fire, flood, cosmic radiation, or collision with other planets. I think of these beliefs as “Catastrophic Apocalyptic Expectations” (CAE). While some share my belief that the number of serious CAE-ers is small, there are panics and survivalists reported by the press in Russia, France, and Los Angeles. Tragically, there has been at least one suicide. And of course, there has been a major Hollywood movie (2012), which I’ll be discussing more in my next post.

As anthropologists, we certainly should respond to public fears. But we should also wonder why this fear, out of so many possible fears, is the one to capture public imagination. Beliefs in paranormal activities, astrology, and the like are historically common, although the specifics change over time. Michael D. Gordin’s excellent book The Pseudoscience Wars (2012) convincingly suggests that there are larger societal reasons why some fringe theories attract scholarly and public attention while others go ignored. The Mayan Apocalypse has certainly attracted massive attention, from scholarly rebuttals by anthropologists, NASA, and others, to numerous popular parodies such as GQ’s survival tips, LOLcats, and my personal favorite, an advertisement for Mystic Mayan Power Cloaks.

There seems to be a general fascination with the Mayan calendar — even among those who know relatively little about the peoples that label refers to. Some are anxiously watching the calendar count down, others are trying to reassure them, and many more are simply watching, cracking jokes, or even selling supplies. But there is something interesting about the fact that so many in the United States and Europe are talking about it at all. I look forward to exploring these questions further with all of you.

Clare A. Sammells is Assistant Professor of Anthropology at Bucknell University. She is currently living in Madrid, where she is writing about concepts of time in Tiwanaku and conducting ethnographic research on food among Bolivian migrants.  She is not stockpiling canned goods.

Monbiot: The Gift of Death (The Guardian)

December 10, 2012

Pathological consumption has become so normalised that we scarcely notice it.

By George Monbiot, published in the Guardian 11th December 2012

There’s nothing they need, nothing they don’t own already, nothing they even want. So you buy them a solar-powered waving queen; a belly button brush; a silver-plated ice cream tub holder; a “hilarious” inflatable zimmer frame; a confection of plastic and electronics called Terry the Swearing Turtle; or – and somehow I find this significant – a Scratch Off World wall map.

They seem amusing on the first day of Christmas, daft on the second, embarrassing on the third. By the twelfth they’re in landfill. For thirty seconds of dubious entertainment, or a hedonic stimulus that lasts no longer than a nicotine hit, we commission the use of materials whose impacts will ramify for generations.

Researching her film The Story of Stuff, Annie Leonard discovered that of the materials flowing through the consumer economy, only 1% remain in use six months after sale(1). Even the goods we might have expected to hold onto are soon condemned to destruction through either planned obsolescence (breaking quickly) or perceived obsolescence (becoming unfashionable).

But many of the products we buy, especially for Christmas, cannot become obsolescent. The term implies a loss of utility, but they had no utility in the first place. An electronic drum-machine t-shirt; a Darth Vader talking piggy bank; an ear-shaped iPhone case; an individual beer can chiller; an electronic wine breather; a sonic screwdriver remote control; bacon toothpaste; a dancing dog: no one is expected to use them, or even look at them, after Christmas Day. They are designed to elicit thanks, perhaps a snigger or two, and then be thrown away.

The fatuity of the products is matched by the profundity of the impacts. Rare materials, complex electronics, the energy needed for manufacture and transport are extracted and refined and combined into compounds of utter pointlessness. When you take account of the fossil fuels whose use we commission in other countries, manufacturing and consumption are responsible for more than half of our carbon dioxide production(2). We are screwing the planet to make solar-powered bath thermometers and desktop crazy golfers.

People in eastern Congo are massacred to facilitate smart phone upgrades of ever diminishing marginal utility(3). Forests are felled to make “personalised heart-shaped wooden cheese board sets”. Rivers are poisoned to manufacture talking fish. This is pathological consumption: a world-consuming epidemic of collective madness, rendered so normal by advertising and the media that we scarcely notice what has happened to us.

In 2007, the journalist Adam Welz records, 13 rhinos were killed by poachers in South Africa. This year, so far, 585 have been shot(4). No one is entirely sure why. But one answer is that very rich people in Vietnam are now sprinkling ground rhino horn on their food or snorting it like cocaine to display their wealth. It’s grotesque, but it scarcely differs from what almost everyone in industrialised nations is doing: trashing the living world through pointless consumption.

This boom has not happened by accident. Our lives have been corralled and shaped in order to encourage it. World trade rules force countries to participate in the festival of junk. Governments cut taxes, deregulate business, manipulate interest rates to stimulate spending. But seldom do the engineers of these policies stop and ask “spending on what?”. When every conceivable want and need has been met (among those who have disposable money), growth depends on selling the utterly useless. The solemnity of the state, its might and majesty, are harnessed to the task of delivering Terry the Swearing Turtle to our doors.

Grown men and women devote their lives to manufacturing and marketing this rubbish, and dissing the idea of living without it. “I always knit my gifts”, says a woman in a television ad for an electronics outlet. “Well you shouldn’t,” replies the narrator(5). An advertisement for Google’s latest tablet shows a father and son camping in the woods. Their enjoyment depends on the Nexus 7’s special features(6). The best things in life are free, but we’ve found a way of selling them to you.

The growth of inequality that has accompanied the consumer boom ensures that the rising economic tide no longer lifts all boats. In the US in 2010 a remarkable 93% of the growth in incomes accrued to the top 1% of the population(7). The old excuse, that we must trash the planet to help the poor, simply does not wash. For a few decades of extra enrichment for those who already possess more money than they know how to spend, the prospects of everyone else who will live on this earth are diminished.

So effectively have governments, the media and advertisers associated consumption with prosperity and happiness that to say these things is to expose yourself to opprobrium and ridicule. Witness last week’s Moral Maze programme, in which most of the panel lined up to decry the idea of consuming less, and to associate it, somehow, with authoritarianism(8). When the world goes mad, those who resist are denounced as lunatics.

Bake them a cake, write them a poem, give them a kiss, tell them a joke, but for god’s sake stop trashing the planet to tell someone you care. All it shows is that you don’t.

http://www.monbiot.com

1. http://www.storyofstuff.org/movies-all/story-of-stuff/

2. It’s 57%. See http://www.monbiot.com/2010/05/05/carbon-graveyard/

3. See the film Blood in the Mobile. http://bloodinthemobile.org/

4. http://e360.yale.edu/feature/the_dirty_war_against_africas_remaining_rhinos/2595/

5. http://www.youtube.com/watch?v=i7VE2wlDkr8&list=UU25QbTq58EYBGf2_PDTqzFQ&index=9

6. http://www.ubergizmo.com/2012/07/commercial-for-googles-nexus-7-tablet-revealed/

7. Emmanuel Saez, 2nd March 2012. Striking it Richer: the Evolution of Top Incomes in the United States (Updated with 2009 and 2010 estimates). http://elsa.berkeley.edu/~saez/saez-UStopincomes-2010.pdf

8. http://www.bbc.co.uk/programmes/b01p424r

You Can Give a Boy a Doll, but You Can’t Make Him Play With It (The Atlantic)

By Christina Hoff Sommers

DEC 6 2012, 11:29 AM ET

The logistical and ethical problems with trying to make toys gender-neutral


Is it discriminatory and degrading for toy catalogs to show girls playing with tea sets and boys with Nerf guns? A Swedish regulatory group says yes. The Reklamombudsmannen (RO) has reprimanded Top-Toy, a licensee of Toys”R”Us and one of the largest toy companies in Northern Europe, for its “outdated” advertisements and has pressured it to mend its “narrow-minded” ways. After receiving “training and guidance” from RO equity experts, Top-Toy introduced gender neutrality in its 2012 Christmas catalogue. The catalog shows little boys playing with a Barbie Dream House and girls with guns and gory action figures. As its marketing director explains, “For several years, we have found that the gender debate has grown so strong in the Swedish market that we have had to adjust.”

Swedes can be remarkably thorough in their pursuit of gender parity. A few years ago, a feminist political party proposed a law requiring men to sit while urinating—less messy and more equal. In 2004, the leader of Sweden’s Left Party Feminist Council, Gudrun Schyman, proposed a “man tax”—a special tariff to be levied on men to pay for all the violence and mayhem wrought by their sex. In April 2012, following the celebration of International Women’s Day, the Swedes formally introduced the genderless pronoun “hen” to be used in place of he and she (han and hon).

Egalia, a new state-sponsored pre-school in Stockholm, is dedicated to the total obliteration of the male and female distinction. There are no boys and girls at Egalia—just “friends” and “buddies.” Classic fairy tales like Cinderella and Snow White have been replaced by tales of two male giraffes who parent abandoned crocodile eggs. The Swedish Green Party would like Egalia to be the norm: It has suggested placing gender watchdogs in all of the nation’s preschools. “Egalia gives [children] a fantastic opportunity to be whoever they want to be,” says one excited teacher. (It is probably necessary to add that this is not an Orwellian satire or a right-wing fantasy: This school actually exists.)

The problem with Egalia and gender-neutral toy catalogs is that boys and girls, on average, do not have identical interests, propensities, or needs. Twenty years ago, Hasbro, a major American toy manufacturing company, tested a playhouse it hoped to market to both boys and girls. It soon emerged that girls and boys did not interact with the structure in the same way. The girls dressed the dolls, kissed them, and played house. The boys catapulted the toy baby carriage from the roof. A Hasbro manager came up with a novel explanation: “Boys and girls are different.”

They are different, and nothing short of radical and sustained behavior modification could significantly change their elemental play preferences. Children, with few exceptions, are powerfully drawn to sex-stereotyped play. David Geary, a developmental psychologist at the University of Missouri, told me in an email this week, “One of the largest and most persistent differences between the sexes are children’s play preferences.” The female preference for nurturing play and the male propensity for rough-and-tumble hold cross-culturally and even cross-species (with a few exceptions—female spotted hyenas seem to be at least as aggressive as males). Among our close relatives such as vervet and rhesus monkeys, researchers have found that females play with dolls far more than their brothers, who prefer balls and toy cars. It seems unlikely that the monkeys were indoctrinated by stereotypes in a Top-Toy catalog. Something else is going on.

Biology appears to play a role. Several animal studies have shown that hormonal manipulation can reverse sex-typed behavior. When researchers exposed female rhesus monkeys to male hormones prenatally, these females later displayed male-like levels of rough-and-tumble play. Similar results are found in human beings. Congenital adrenal hyperplasia (CAH) is a genetic condition that results when the female fetus is subjected to unusually large quantities of male hormones—adrenal androgens. Girls with CAH tend to prefer trucks, cars, and construction sets over dolls and play tea sets. As psychologist Doreen Kimura reported in Scientific American, “These findings suggest that these preferences were actually altered in some way by the early hormonal environment.” They also cast doubt on the view that gender-specific play is primarily shaped by socialization.

Professor Geary does not have much hope for the new gender-blind toy catalogue: “The catalog will almost certainly disappear in a few years, once parents who buy from it realize their kids don’t want these toys.” Most little girls don’t want to play with dump trucks, as almost any parent can attest. Including me: When my granddaughter Eliza was given a toy train, she placed it in a baby carriage and covered it with a blanket so it could get some sleep.

Androgyny advocates like our Swedish friends have heard such stories many times, and they have an answer. They acknowledge that sex differences have at least some foundation in biology, but they insist that culture can intensify or diminish their power and effect. Even if Eliza is prompted by nature to interact with a train in a stereotypical female way, that is no reason for her parents not to energetically correct her. Hunter College psychologist Virginia Valian, a strong proponent of Swedish-style re-genderization, wrote in the book Why So Slow? The Advancement of Women, “We do not accept biology as destiny … We vaccinate, we inoculate, we medicate… I propose we adopt the same attitude toward biological sex differences.”

Valian is absolutely right that we do not have to accept biology as destiny. But the analogy is ludicrous: We vaccinate, inoculate, and medicate children against disease. Is being a gender-typical little boy or girl a pathology in need of a cure? Failure to protect children from smallpox, diphtheria, or measles places them in harm’s way. I don’t believe there is any such harm in allowing male/female differences to flourish in early childhood. As one Swedish mother, Tanja Bergkvist, told the Associated Press, “Different gender roles aren’t problematic as long as they are equally valued.” Gender neutrality is not a necessary condition for equality. Men and women can be different—but equal. And for most human beings, the differences are a vital source for meaning and happiness. Since when is uniformity a democratic ideal?

Few would deny that parents and teachers should expose children to a wide range of toys and play activities. But what the Swedes are now doing in some of their classrooms goes far beyond encouraging children to experiment with different toys and play styles—they are requiring it. And toy companies who resist the gender neutrality mandate face official censure. Is this kind of social engineering worth it? Is it even ethical?

To succeed, the Swedish parents, teachers and authorities are going to have to police—incessantly—boys’ powerful attraction to large-group rough-and-tumble play and girls’ affinity for intimate theatrical play. As Geary says, “You can change some of these behaviors with reinforcement and monitoring, but they bounce back once this stops.” But this constant monitoring can also undermine children’s healthy development.

Anthony Pellegrini, a professor of early childhood education at the University of Minnesota, defines the kind of rough-and-tumble play that boys favor as a behavior that includes “laughing, running, smiling, jumping, open-hand beating, wrestling, play fighting, chasing and fleeing.” This kind of play is often mistakenly regarded as aggression, but according to Pellegrini, it is the very opposite. In cases of schoolyard aggression, the participants are unhappy, they part as enemies, and there are often tears and injuries. Rough-and-tumble play brings boys together, makes them happy, and is a critical part of their social development.

Researchers Mary Ellin Logue (University of Maine) and Hattie Harvey (University of Denver) agree, and they have documented the benefits of boys’ “bad guy” superhero action narratives. Teachers tend not to like such play, say Logue and Harvey, but it improves boys’ conversation, creative writing skills, and moral imagination. Swedish boys, like American boys, are languishing far behind girls in school. In a 2009 study Logue and Harvey ask an important question the Swedes should consider: “If boys, due to their choices of dramatic play themes, are discouraged from dramatic play, how will this affect their early language and literacy development and their engagement in school?”

What about the girls? Nearly 30 years ago, Vivian Gussin Paley, a beloved kindergarten teacher at the Chicago Laboratory Schools and winner of a MacArthur “genius” award, published a classic book on children’s play entitled Boys & Girls: Superheroes in the Doll Corner. Paley wondered if girls are missing out by not partaking in boys’ superhero play, but her observations of the “doll corner” allayed her doubts. Girls, she learned, are interested in their own kind of domination. Boys’ imaginative play involves a lot of conflict and imaginary violence; girls’ play, on the other hand, seems to be much gentler and more peaceful. But as Paley looked more carefully, she noticed that the girls’ fantasies were just as exciting and intense as the boys’—though different. They were full of conflict, pesky characters and imaginary power struggles. “Mothers and princesses are as powerful as any superheroes the boys can devise.” Paley appreciated the benefits of gendered play for both sexes, and she had no illusions about the prospects for its elimination: “Kindergarten is a triumph of sexual self-stereotyping. No amount of adult subterfuge or propaganda deflects the five-year-old’s passion for segregation by sex.”

But subterfuge and propaganda appear to be the order of the day in Sweden. In their efforts to free children from the constraints of gender, the Swedish reformers are imposing their own set of inviolate rules, standards, and taboos. Here is how Slate author Nathalie Rothchild describes a gender-neutral classroom:

One Swedish school got rid of its toy cars because boys “gender-coded” them and ascribed the cars higher status than other toys. Another preschool removed “free playtime” from its schedule because, as a pedagogue at the school put it, when children play freely ‘stereotypical gender patterns are born and cemented. In free play there is hierarchy, exclusion, and the seed to bullying.’ And so every detail of children’s interactions gets micromanaged by concerned adults, who end up problematizing minute aspects of children’s lives, from how they form friendships to what games they play and what songs they sing.

The Swedes are treating gender-conforming children the way we once treated gender-variant children. Formerly called “tomboy girls” and “sissy boys” in the medical literature, these kids are persistently attracted to the toys of the opposite sex. They will often remain fixated on the “wrong” toys despite relentless, often cruel pressure from parents, doctors, and peers. Their total immersion in sex-stereotyped culture—a non-stop Toys”R”Us indoctrination—seems to have little effect on their passion for the toys of the opposite sex. There was a time when a boy who displayed a persistent aversion to trucks and rough play and a fixation on frilly dolls or princess paraphernalia would have been considered a candidate for behavior modification therapy. Today, most experts encourage tolerance, understanding, and acceptance: just leave him alone and let him play as he wants. The Swedes should extend the same tolerant understanding to the gender identity and preferences of the vast majority of children.

When data prediction is a game, the experts lose out (New Scientist)

Specialist Knowledge Is Useless and Unhelpful

By Peter Aldhous | Posted Saturday, Dec. 8, 2012, at 7:45 AM ET


Jeremy Howard founded email company FastMail and the Optimal Decisions Group, which helps insurance companies set premiums. He is now president and chief scientist of Kaggle, which has turned data prediction into sport.

Peter Aldhous: Kaggle has been described as “an online marketplace for brains.” Tell me about it.
Jeremy Howard: It’s a website that hosts competitions for data prediction. We’ve run a whole bunch of amazing competitions. One asked competitors to develop algorithms to mark students’ essays. One that finished recently challenged competitors to develop a gesture-learning system for the Microsoft Kinect. The idea was to show the controller a gesture just once, and the algorithm would recognize it in future. Another competition predicted the biological properties of small molecules being screened as potential drugs.

PA: How exactly do these competitions work?
JH: They rely on techniques like data mining and machine learning to predict future trends from current data. Companies, governments, and researchers present data sets and problems, and offer prize money for the best solutions. Anyone can enter: We have nearly 64,000 registered users. We’ve discovered that creative data scientists can solve problems in every field better than experts in those fields can.

PA: These competitions deal with very specialized subjects. Do experts enter?
JH: Oh yes. Every time a new competition comes out, the experts say: “We’ve built a whole industry around this. We know the answers.” And after a couple of weeks, they get blown out of the water.

PA: So who does well in the competitions?
JH: People who can just see what the data is actually telling them without being distracted by industry assumptions or specialist knowledge. Jason Tigg, who runs a pretty big hedge fund in London, has done well again and again. So has Xavier Conort, who runs a predictive analytics consultancy in Singapore.

PA: You were once on the leader board yourself. How did you get involved?
JH: It was a long and strange path. I majored in philosophy in Australia, worked in management consultancy for eight years, and then in 1999 I founded two start-ups—one an email company, the other helping insurers optimize risks and profits. By 2010, I had sold them both. I started learning Chinese and building amplifiers and speakers because I hadn’t made anything with my hands. I travelled. But it wasn’t intellectually challenging enough. Then, at a meeting of statistics users in Melbourne, somebody told me about Kaggle. I thought: “That looks intimidating and really interesting.”

PA: How did your first competition go?
JH: Setting my expectations low, my goal was to not come last. But I actually won it. It was on forecasting tourist arrivals and departures at different destinations. By the time I went to the next statistics meeting I had won two out of the three competitions I entered. Anthony Goldbloom, the founder of Kaggle, was there. He said: “You’re not Jeremy Howard, are you? We’ve never had anybody win two out of three competitions before.”

PA: How did you become Kaggle’s chief scientist?
JH: I offered to become an angel investor. But I just couldn’t keep my hands off the business. I told Anthony that the site was running slowly and rewrote all the code from scratch. Then Anthony and I spent three months in America last year, trying to raise money. That was where things got really serious, because we raised $11 million. I had to move to San Francisco and commit to doing this full-time.

PA: Do you still compete?
JH: I am allowed to compete, but I can’t win prizes. In practice, I’ve been too busy.

PA: What explains Kaggle’s success in solving problems in predictive analytics?
JH: The competitive aspect is important. The more people who take part in these competitions, the better they get at predictive modeling. There is no other place in the world I’m aware of, outside professional sport, where you get such raw, harsh, unfettered feedback about how well you’re doing. It’s clear what’s working and what’s not. It’s a kind of evolutionary process, accelerating the survival of the fittest, and we’re watching it happen right in front of us. More and more, our top competitors are also teaming up with each other.

PA: Which statistical methods work best?
JH: One that crops up again and again is called the random forest. This takes multiple small random samples of the data and makes a “decision tree” for each one, which branches according to the questions asked about the data. Each tree, by itself, has little predictive power. But take an “average” of all of them and you end up with a powerful model. It’s a totally black-box, brainless approach. You don’t have to think—it just works.
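The idea Howard describes can be sketched in a few lines of Python. This is an illustrative toy only—bootstrap samples, one-split “stump” trees, and a majority vote—not Kaggle’s or any production implementation; real random forests grow deeper trees and also randomize which features each split may consider.

```python
# Toy illustration of the random-forest idea: fit many weak trees, each on
# a bootstrap sample of the data, then combine them by majority vote.
# One-split "stumps" stand in for full decision trees.
import random

def fit_stump(rows):
    """rows: list of (features, label) pairs with 0/1 labels.
    Try every (feature, threshold) split and keep the purest one."""
    best = None
    for f in range(len(rows[0][0])):
        for x, _ in rows:
            t = x[f]
            left = [y for xx, y in rows if xx[f] <= t]
            right = [y for xx, y in rows if xx[f] > t]
            if not right:  # threshold at the max value splits nothing off
                continue
            # score a split by how label-pure each side is
            purity = (max(left.count(0), left.count(1)) +
                      max(right.count(0), right.count(1)))
            if best is None or purity > best[0]:
                best = (purity, f, t,
                        int(left.count(1) > left.count(0)),   # left vote
                        int(right.count(1) > right.count(0)))  # right vote
    _, f, t, vote_l, vote_r = best
    return lambda x: vote_l if x[f] <= t else vote_r

def random_forest(rows, n_trees=25, seed=0):
    rng = random.Random(seed)
    # each stump is fit on a bootstrap sample (drawn with replacement)
    stumps = [fit_stump([rng.choice(rows) for _ in rows])
              for _ in range(n_trees)]
    # each stump alone has little predictive power; the vote of all of
    # them together is the model
    return lambda x: int(sum(s(x) for s in stumps) > n_trees / 2)

# toy data: two features in 0..9, label 1 when their sum exceeds 9
data = [((a, b), int(a + b > 9)) for a in range(10) for b in range(10)]
model = random_forest(data)
```

No single stump can represent the diagonal boundary in this toy data, but the vote over many stumps, each trained on a slightly different sample, approximates it—the “average of many weak trees” effect described above.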

PA: What separates the winners from the also-rans?
JH: The difference between the good participants and the bad is the information they feed to the algorithms. You have to decide what to abstract from the data. Winners of Kaggle competitions tend to be curious and creative people. They come up with a dozen totally new ways to think about the problem. The nice thing about algorithms like the random forest is that you can chuck as many crazy ideas at them as you like, and the algorithms figure out which ones work.

PA: That sounds very different from the traditional approach to building predictive models. How have experts reacted?
JH: The messages are uncomfortable for a lot of people. It’s controversial because we’re telling them: “Your decades of specialist knowledge are not only useless, they’re actually unhelpful; your sophisticated techniques are worse than generic methods.” It’s difficult for people who are used to that old type of science. They spend so much time discussing whether an idea makes sense. They check the visualizations and noodle over it. That is all actively unhelpful.

PA: Is there any role for expert knowledge?
JH: Some kinds of experts are required early on, when you’re trying to work out what problem you’re trying to solve. The expertise you need is strategic expertise in answering these questions.

PA: Can you see any downsides to the data-driven, black-box approach that dominates on Kaggle?
JH: Some people take the view that you don’t end up with a richer understanding of the problem. But that’s just not true: The algorithms tell you what’s important and what’s not. You might ask why those things are important, but I think that’s less interesting. You end up with a predictive model that works. There’s not too much to argue about there.

Doubling Down on Climate Change Denial (Slate)

By Phil Plait

Posted Monday, Dec. 3, 2012, at 8:00 AM ET

Oh, those wacky professional climate change deniers! Once again, they’ve banded together a passel of people, 90 percent of whom aren’t even climatologists, and had them sign a nearly fact-free opinion piece in the Financial Post, claiming global warming isn’t real. It’s an astonishing example of nonsense so ridiculous I would run out of synonyms for “bilge” before adequately describing it.

The Op-Ed is directed to U.N. Secretary General Ban Ki-Moon, who has recently, and thankfully, been vocal about the looming environmental catastrophe of global warming. The deniers’ letter takes him to task for this, but doesn’t come within a glancing blow of reality.

The letter itself is based on a single claim. So let’s be clear: If that claim is wrong, so is the rest of the letter.

Guess what? That claim is wrong. So blatantly wrong, in fact, it’s hard to imagine anyone could write it with a straight face. It says:

“The U.K. Met Office recently released data showing that there has been no statistically significant global warming for almost 16 years.”

This is simply, completely, and utterly false. The Met Office is the national weather service for the United Kingdom. In October 2012, they updated their database of global surface temperature measurements, a compendium of temperatures taken over time by weather stations around the planet. David Rose, a climate change denier who can charitably be said to have trouble with facts, cherry-picked this dataset and published a horrendously misleading graph in that bastion of scientific thought, the Daily Mail, saying the measurements show there’s been no global warming for the past 16 years.

But he did this by choosing a starting point on his graph that gave the result he wanted, a graph that looks like there’s been no warming since 1997. But if you show the data properly, you see there has been warming:

Global surface temperatures from the Met Office data. Top: Fiction. Bottom: Fact.

Image credit: David Rose/Daily Mail (top), Tamino (bottom).

The top graph is from Rose’s article, but the bottom graph shows what happens when you display the data going back a few more years. See the difference? What he did is like measuring how tall you are when you’re 25, doing it again when you’re 30, then claiming human beings never grow. That’s a big no-no in science. You have to choose starting and ending points that fairly represent the data, as in the bottom graph. When you do, you very clearly see the trend that the Earth is getting warmer. In fact, hammering home how patently ridiculous this claim is, nine of the 10 hottest years on record have been since 2000. On top of that, Rose was using global surface temperatures, which don’t really represent global overall heat content well; most of the heating is going into ocean waters. So the data he’s displaying so awfully isn’t even the right data to make his claim anyway!
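The start-date trick is easy to reproduce on made-up numbers. The sketch below uses a purely synthetic series—a steady upward trend plus a natural-looking oscillation, not the actual Met Office data—and shows that scanning for the start point that minimizes a short-window trend can manufacture a “decline” even though the full record clearly rises.

```python
# Synthetic demonstration of cherry-picking a trend's start point.
# The series is invented for illustration; it is NOT real temperature data.
import math

def slope(ys):
    # ordinary least-squares trend of ys against time steps 0..n-1
    n = len(ys)
    mx, my = (n - 1) / 2, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(ys))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

# steady warming (0.01 per step) plus a natural-looking oscillation
series = [0.01 * t + 0.15 * math.sin(t / 3) for t in range(60)]

full_trend = slope(series)  # the record as a whole is clearly rising

# cherry-pick: scan candidate start points and keep the one whose short
# 10-step window shows the least warming -- exactly the trick above
start = min(range(30, 50), key=lambda t: slope(series[t:t + 10]))
cherry = slope(series[start:start + 10])
```

A window that begins at a peak of the oscillation slopes downward even though the underlying trend never changed; the fair comparison is the trend over the whole record.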

So the very first basis of this denial letter is total garbage, and was such an egregious manipulation of the U.K. Met Office data that the Met Office itself issued a debunking of it! Yet here we are, months later, with the deniers still ignoring facts.

The letter is chock full of more such falsehoods. If you want the rundown, please go read the great article on Skeptical Science destroying this nonsense. Full disclosure: I had already written quite a bit more for this post before seeing the one at Skeptical Science, and decided it would be better to send readers there for more rather than debunk all the wrongness here. I’m pleased to note they found the same examples of misleading or outright false statements in the deniers’ article and debunked them the same way I had.

I do want to add something, though. I’ll note that it seems superficially impressive that they got 125 scientists, “qualified in climate-related matters” as they claim, to sign this letter.

Yeah, about that…

First, not everyone signing that letter is a scientist. Lord Monckton, for example, apparently has no formal scientific training, has some trouble with the truth, and oh, by the way, claims Obama’s birth certificate is a forgery. He’s the last guy I’d want signing a letter I was on. Yet he seems to pop up on every denialist list as a go-to guy.

Here’s another: The very first signatory, Habibullo Abdussamatov, claims that global warming is caused by the Sun, which is patently and provably false (see that Skeptical Science link for more). Many of the claims Abdussamatov makes (as listed on his Wikipedia page) are, um, not accepted by mainstream science, to be very charitable.

Going down the list of signatories I was struck by how many are not, in fact, climate scientists (again, for examples with references, see Skeptical Science); I counted a dozen who actually have climatology in their listed credentials. It’s kinda weird to write such a big letter and then have fewer than 10 percent of the signers actually be credentialed in the field.

Of course, I’m not a climatologist either, though I am an astronomer classically trained in science, and that means I know enough to rely on the combined research of actual climate scientists from around the world. And when thousands upon thousands of such scientists—in fact, 98 percent of actual, bona fide climate scientists—say global warming is real, well then, that strikes me as being somewhat more credible than a hundred or so politically and ideologically driven non-climate-scientists.

I’ll note this isn’t the first time a laughably-wrong article has been printed by right-leaning venues and signed by multiple, similarly-inappropriate authors. The Wall Street Journal posted one in January 2012 (while turning down an article supporting the reality of global warming signed by 255 actual scientists), and in April 2012, another made the rounds that was signed by 49 people, including some ex-NASA astronauts, but again, none who actually were climate scientists.

So we can expect to see more of this. Clearly, when you don’t have facts to support your claims, the best thing to do is make as much noise as possible to distract from reality. And that reality is that the world is getting hotter, and unless we do something, now, we’re facing a world of trouble.

Origin of intelligence and mental illness linked to ancient genetic accident (University of Edinburgh)

2-Dec-2012 – By Tara Womersley, University of Edinburgh

Scientists have discovered for the first time how humans – and other mammals – have evolved to have intelligence


Researchers have identified the moment in history when the genes that enabled us to think and reason evolved.

This point 500 million years ago provided our ability to learn complex skills, analyse situations and have flexibility in the way in which we think.

Professor Seth Grant, of the University of Edinburgh, who led the research, said: “One of the greatest scientific problems is to explain how intelligence and complex behaviours arose during evolution.”

The research, which is detailed in two papers in Nature Neuroscience, also shows a direct link between the evolution of behaviour and the origins of brain diseases.

Scientists believe that the same genes that improved our mental capacity are also responsible for a number of brain disorders.

“This ground breaking work has implications for how we understand the emergence of psychiatric disorders and will offer new avenues for the development of new treatments,” said John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust, one of the study funders.

The study shows that intelligence in humans developed as the result of an increase in the number of brain genes in our evolutionary ancestors.

The researchers suggest that a simple invertebrate animal living in the sea 500 million years ago experienced a ‘genetic accident’, which resulted in extra copies of these genes being made.

This animal’s descendants benefited from these extra genes, leading to behaviourally sophisticated vertebrates – including humans.

The research team studied the mental abilities of mice and humans, using comparative tasks that involved identifying objects on touch-screen computers.

Researchers then combined results of these behavioural tests with information from the genetic codes of various species to work out when different behaviours evolved.

They found that higher mental functions in humans and mice were controlled by the same genes.

The study also showed that when these genes were mutated or damaged, they impaired higher mental functions.

“Our work shows that the price of higher intelligence and more complex behaviours is more mental illness,” said Professor Grant.

The researchers had previously shown that more than 100 childhood and adult brain diseases are caused by gene mutations.

“We can now apply genetics and behavioural testing to help patients with these diseases”, said Dr Tim Bussey from Cambridge University, which was also involved in the study.

The study was funded by the Wellcome Trust, the Medical Research Council and European Union.

Current scientific knowledge does not substantiate Ban Ki-Moon assertions on weather and climate, say 125-plus scientists (Financial Post) + EANTH list reactions

OPEN CLIMATE LETTER TO UN SECRETARY-GENERAL: Current scientific knowledge does not substantiate Ban Ki-Moon assertions on weather and climate, say 125-plus scientists.

Special to Financial Post | Nov 29, 2012 8:36 PM ET | Last Updated: Nov 30, 2012 12:11 PM ET

Getty – UN Secretary-General Ban Ki-Moon

Policy actions that aim to reduce CO2 emissions are unlikely to influence future climate. Policies need to focus on preparation for, and adaptation to, all dangerous climatic events, however caused


Open Letter to the Secretary-General of the United Nations

H.E. Ban Ki-Moon, Secretary-General, United Nations. First Avenue and East 44th Street, New York, New York, U.S.A.

November 29, 2012

Mr. Secretary-General:

On November 9 this year you told the General Assembly: “Extreme weather due to climate change is the new normal … Our challenge remains, clear and urgent: to reduce greenhouse gas emissions, to strengthen adaptation to … even larger climate shocks … and to reach a legally binding climate agreement by 2015 … This should be one of the main lessons of Hurricane Sandy.”

On November 13 you said at Yale: “The science is clear; we should waste no more time on that debate.”

The following day, in Al Gore’s “Dirty Weather” Webcast, you spoke of “more severe storms, harsher droughts, greater floods”, concluding: “Two weeks ago, Hurricane Sandy struck the eastern seaboard of the United States. A nation saw the reality of climate change. The recovery will cost tens of billions of dollars. The cost of inaction will be even higher. We must reduce our dependence on carbon emissions.”

We the undersigned, qualified in climate-related matters, wish to state that current scientific knowledge does not substantiate your assertions.

The U.K. Met Office recently released data showing that there has been no statistically significant global warming for almost 16 years. During this period, according to the U.S. National Oceanic and Atmospheric Administration (NOAA), carbon dioxide (CO2) concentrations rose by nearly 9% to now constitute 0.039% of the atmosphere. Global warming that has not occurred cannot have caused the extreme weather of the past few years. Whether, when and how atmospheric warming will resume is unknown. The science is unclear. Some scientists point out that near-term natural cooling, linked to variations in solar output, is also a distinct possibility.

The “even larger climate shocks” you have mentioned would be worse if the world cooled than if it warmed. Climate changes naturally all the time, sometimes dramatically. The hypothesis that our emissions of CO2 have caused, or will cause, dangerous warming is not supported by the evidence.

The incidence and severity of extreme weather has not increased. There is little evidence that dangerous weather-related events will occur more often in the future. The U.N.’s own Intergovernmental Panel on Climate Change says in its Special Report on Extreme Weather (2012) that there is “an absence of an attributable climate change signal” in trends in extreme weather losses to date. The funds currently dedicated to trying to stop extreme weather should therefore be diverted to strengthening our infrastructure so as to be able to withstand these inevitable, natural events, and to helping communities rebuild after natural catastrophes such as tropical storm Sandy.

There is no sound reason for the costly, restrictive public policy decisions proposed at the U.N. climate conference in Qatar. Rigorous analysis of unbiased observational data does not support the projections of future global warming predicted by computer models now proven to exaggerate warming and its effects.

The NOAA “State of the Climate in 2008” report asserted that 15 years or more without any statistically-significant warming would indicate a discrepancy between observation and prediction. Sixteen years without warming have therefore now proven that the models are wrong by their creators’ own criterion.

Based upon these considerations, we ask that you desist from exploiting the misery of the families of those who lost their lives or properties in tropical storm Sandy by making unsupportable claims that human influences caused that storm. They did not. We also ask that you acknowledge that policy actions by the U.N., or by the signatory nations to the UNFCCC, that aim to reduce CO2 emissions are unlikely to exercise any significant influence on future climate. Climate policies therefore need to focus on preparation for, and adaptation to, all dangerous climatic events however caused.

Signed by:

  1. Habibullo I. Abdussamatov, Dr. Sci., mathematician and astrophysicist, Head of the Selenometria project on the Russian segment of the ISS, Head of Space Research of the Sun Sector at the Pulkovo Observatory of the Russian Academy of Sciences, St. Petersburg, Russia
  2. Syun-Ichi Akasofu, PhD, Professor of Physics, Emeritus and Founding Director, International Arctic Research Center of the University of Alaska, Fairbanks, Alaska, U.S.A.
  3. Bjarne Andresen, Dr. Scient., physicist, published and presents on the impossibility of a “global temperature”, Professor, Niels Bohr Institute (physics (thermodynamics) and chemistry), University of Copenhagen, Copenhagen, Denmark
  4. J. Scott Armstrong, PhD, Professor of Marketing, The Wharton School, University of Pennsylvania, Founder of the International Journal of Forecasting, focus on analyzing climate forecasts, Philadelphia, Pennsylvania, U.S.A.
  5. Timothy F. Ball, PhD, environmental consultant and former climatology professor, University of Winnipeg, Winnipeg, Manitoba, Canada
  6. James R. Barrante, Ph.D. (chemistry, Harvard University), Emeritus Professor of Physical Chemistry, Southern Connecticut State University, focus on studying the greenhouse gas behavior of CO2, Cheshire, Connecticut, U.S.A.
  7. Colin Barton, B.Sc., PhD (Earth Science, Birmingham, U.K.), FInstEng Aus Principal research scientist (ret.), Commonwealth Scientific and Industrial Research Organisation (CSIRO), Melbourne, Victoria, Australia
  8. Joe Bastardi, BSc, (Meteorology, Pennsylvania State), meteorologist, State College, Pennsylvania, U.S.A.
  9. Franco Battaglia, PhD (Chemical Physics), Professor of Physics and Environmental Chemistry, University of Modena, Italy
  10. Richard Becherer, BS (Physics, Boston College), MS (Physics, University of Illinois), PhD (Optics, University of Rochester), former Member of the Technical Staff – MIT Lincoln Laboratory, former Adjunct Professor – University of Connecticut, Areas of Specialization: optical radiation physics, coauthor – standard reference book Optical Radiation Measurements: Radiometry, Millis, MA, U.S.A.
  11. Edwin X. Berry, PhD (Atmospheric Physics, Nevada), MA (Physics, Dartmouth), BS (Engineering, Caltech), Certified Consulting Meteorologist, President, Climate Physics LLC, Bigfork, MT, U.S.A.
  12. Ian Bock, BSc, PhD, DSc, Biological sciences (retired), Ringkobing, Denmark
  13. Ahmed Boucenna, PhD, Professor of Physics (strong climate focus), Physics Department, Faculty of Science, Ferhat Abbas University, Setif, Algeria
  14. Antonio Brambati, PhD, Emeritus Professor (sedimentology), Department of Geological, Environmental and Marine Sciences (DiSGAM), University of Trieste (specialization: climate change as determined by Antarctic marine sediments), Trieste, Italy
  15. Stephen C. Brown, PhD (Environmental Science, State University of New York), District Agriculture Agent, Assistant Professor, University of Alaska Fairbanks, Ground Penetrating Radar Glacier research, Palmer, Alaska, U.S.A.
  16. Mark Lawrence Campbell, PhD (chemical physics; gas-phase kinetic research involving greenhouse gases (nitrous oxide, carbon dioxide)), Professor, United States Naval Academy, Annapolis, Maryland, U.S.A.
  17. Rudy Candler, PhD (Soil Chemistry, University of Alaska Fairbanks (UAF)), former agricultural laboratory manager, School of Agriculture and Land Resources Management, UAF, co-authored papers regarding humic substances and potential CO2 production in the Arctic due to decomposition, Union, Oregon, U.S.A.
  18. Alan Carlin, B.S. (California Institute of Technology), PhD (economics, Massachusetts Institute of Technology), retired senior analyst and manager, U.S. Environmental Protection Agency, Washington, DC, former Chairman of the Angeles Chapter of the Sierra Club (recipient of the Chapter’s Weldon Heald award for conservation work), U.S.A.
  19. Dan Carruthers, M.Sc., Arctic Animal Behavioural Ecologist, wildlife biology consultant specializing in animal ecology in Arctic and Subarctic regions, Turner Valley, Alberta, Canada
  20. Robert M. Carter, PhD, Professor, Marine Geophysical Laboratory, James Cook University, Townsville, Australia
  21. Uberto Crescenti, PhD, Full Professor of Applied Geology, Università G. d’Annunzio, Past President Società Geologica Italiana, Chieti, Italy
  22. Arthur Chadwick, PhD (Molecular Biology), Research Professor of Geology, Department of Biology and Geology, Southwestern Adventist University, Climate Specialties: dendrochronology (determination of past climate states by tree ring analysis), palynology (same but using pollen as a climate proxy), paleobotany and botany; Keene, Texas, U.S.A.
  23. George V. Chilingar, PhD, Professor, Department of Civil and Environmental Engineering of Engineering (CO2/temp. focused research), University of Southern California, Los Angeles, California, U.S.A.
  24. Ian D. Clark, PhD, Professor (isotope hydrogeology and paleoclimatology), Dept. of Earth Sciences, University of Ottawa, Ottawa, Ontario, Canada
  25. Cornelia Codreanova, Diploma in Geography, Researcher (Areas of Specialization: formation of glacial lakes) at Liberec University, Czech Republic, Zwenkau, Germany
  26. Michael Coffman, PhD (Ecosystems Analysis and Climate Influences, University of Idaho), CEO of Sovereignty International, President of Environmental Perspectives, Inc., Bangor, Maine, U.S.A.
  27. Piers Corbyn, ARCS, MSc (Physics, Imperial College London), FRAS, FRMetS, astrophysicist (Queen Mary College, London), consultant, founder WeatherAction long range weather and climate forecasters, American Thinker Climate Forecaster of The Year 2010, London, United Kingdom
  28. Richard S. Courtney, PhD, energy and environmental consultant, IPCC expert reviewer, Falmouth, Cornwall, United Kingdom
  29. Roger W. Cohen, B.S., M.S., PhD Physics, MIT and Rutgers University, Fellow, American Physical Society, initiated and managed for more than twenty years the only industrial basic research program in climate, Washington Crossing, Pennsylvania, U.S.A.
  30. Susan Crockford, PhD (Zoology/Evolutionary Biology/Archaeozoology), Adjunct Professor (Anthropology/Faculty of Graduate Studies), University of Victoria, Victoria, British Columbia, Canada
  31. Walter Cunningham, B.S., M.S. (Physics – Institute of Geophysics and Planetary Sciences, UCLA), AMP – Harvard Graduate School of Business, Colonel (retired) U.S. Marine Corps, Apollo 7 Astronaut, Fellow – AAS, AIAA; Member AGU, Houston, Texas, U.S.A.
  32. Joseph D’Aleo, BS, MS (Meteorology, University of Wisconsin),  Doctoral Studies (NYU), CMM, AMS Fellow, Executive Director – ICECAP (International Climate and Environmental Change Assessment Project), College Professor Climatology/Meteorology, First Director of Meteorology The Weather Channel, Hudson, New Hampshire, U.S.A.
  33. David Deming, PhD (Geophysics), Professor of Arts and Sciences, University of Oklahoma, Norman, Oklahoma, U.S.A.
  34. James E. Dent; B.Sc., FCIWEM, C.Met, FRMetS, C.Env., Independent Consultant (hydrology & meteorology), Member of WMO OPACHE Group on Flood Warning, Hadleigh, Suffolk, England, United Kingdom
  35. Willem de Lange, MSc (Hons), DPhil (Computer and Earth Sciences), Senior Lecturer in Earth and Ocean Sciences, The University of Waikato, Hamilton, New Zealand
  36. Silvia Duhau, Ph.D. (physics), Solar Terrestrial Physics, Buenos Aires University, Buenos Aires, Argentina
  37. Geoff Duffy, DEng (Dr of Engineering), PhD (Chemical Engineering), BSc, ASTCDip. (first chemical engineer to be a Fellow of the Royal Society in NZ), FIChemE, wide experience in radiant heat transfer and drying, chemical equilibria, etc. Has reviewed, analysed, and written brief reports and papers on climate change, Auckland, New Zealand
  38. Don J. Easterbrook, PhD, Emeritus Professor of Geology, Western Washington University, Bellingham, Washington, U.S.A.
  39. Ole Henrik Ellestad, former Research Director, applied chemistry SINTEF, Professor in physical chemistry, University of Oslo, Managing director Norsk Regnesentral and Director for Science and Technology, Norwegian Research Council, widely published in infrared spectroscopy, Oslo, Norway
  40. Per Engene, MSc, Biologist, Co-author – The Climate, Science and Politics (2009), Bø i Telemark, Norway
  41. Gordon Fulks, B.S., M.S., PhD (Physics, University of Chicago), cosmic radiation, solar wind, electromagnetic and geophysical phenomena, Portland, Oregon, U.S.A.
  42. Katya Georgieva, MSc (meteorology), PhD (solar-terrestrial climate physics), Professor, Space Research and Technologies Institute, Bulgarian Academy of Sciences, Sofia, Bulgaria
  43. Lee C. Gerhard, PhD, Senior Scientist Emeritus, University of Kansas, past director and state geologist, Kansas Geological Survey, U.S.A.
  44. Ivar Giaever PhD, Nobel Laureate in Physics 1973, professor emeritus at the Rensselaer Polytechnic Institute, a professor-at-large at the University of Oslo, Applied BioPhysics, Troy, New York, U.S.A.
  45. Albrecht Glatzle, PhD, ScAgr, Agro-Biologist and Executive Manager, Tropical pasture research and land use management, Scientific Director of INTTAS, Loma Plata, Paraguay
  46. Fred Goldberg, PhD, Adj Professor, Royal Institute of Technology (Mech, Eng.), Secretary General KTH International Climate Seminar 2006 and Climate analyst (NIPCC), Lidingö, Sweden
  47. Laurence I. Gould, PhD, Professor of Physics, University of Hartford, Past Chair (2004), New England Section of the American Physical Society, West Hartford, Connecticut, U.S.A.
  48. Vincent Gray, PhD, New Zealand Climate Coalition, expert reviewer for the IPCC, author of The Greenhouse Delusion: A Critique of Climate Change 2001, Wellington, New Zealand
  49. William M. Gray, PhD, Professor Emeritus, Dept. of Atmospheric Science, Colorado State University, Head of the Tropical Meteorology Project, Fort Collins, Colorado, U.S.A.
  50. Charles B. Hammons, PhD (Applied Mathematics), climate-related specialties: applied mathematics, modeling & simulation, software & systems engineering, Associate Professor, Graduate School of Management, University of Dallas; Assistant Professor, North Texas State University (Dr. Hammons found many serious flaws during a detailed study of the software, associated control files plus related email traffic of the Climate Research Unit temperature and other records and “adjustments” carried out in support of IPCC conclusions), Coyle, OK, U.S.A.
  51. William Happer, PhD, Professor, Department of Physics, Princeton University, Princeton, NJ, U.S.A.
  52. Hermann Harde, PhD, Professur f. Lasertechnik & Werkstoffkunde (specialized in molecular spectroscopy, development of gas sensors and CO2-climate sensitivity), Helmut-Schmidt-Universität, Universität der Bundeswehr Fakultät für Elektrotechnik, Hamburg, Germany
  53. Howard Hayden, PhD, Emeritus Professor (Physics), University of Connecticut, The Energy Advocate, Pueblo West, Colorado, U.S.A.
  54. Ross Hays, Meteorologist, atmospheric scientist, NASA Columbia Scientific Balloon Facility (currently working at McMurdo Station, Antarctica), Palestine, Texas, U.S.A.
  55. Martin Hovland, M.Sc. (meteorology, University of Bergen), PhD (Dr Philos, University of Tromsø), FGS, Emeritus Professor, Geophysics, Centre for Geobiology, University of Bergen, member of the expert panel: Environmental Protection and Safety Panel (EPSP) for the Ocean Drilling Program (ODP) and the Integrated ODP, Stavanger, Norway
  56. Ole Humlum, PhD, Professor of Physical Geography, Department of Physical Geography, Institute of Geosciences, University of Oslo, Oslo, Norway
  57. Craig D. Idso, PhD, Chairman of the Board of Directors of the Center for the Study of Carbon Dioxide and Global Change, Tempe, Arizona, U.S.A.
  58. Sherwood B. Idso, PhD, President, Center for the Study of Carbon Dioxide and Global Change, Tempe, Arizona, U.S.A.
  59. Larry Irons, BS (Geology), MS (Geology), Sr. Geophysicist at Fairfield Nodal (specialization: paleoclimate), Lakewood, Colorado, U.S.A.
  60. Terri Jackson, MSc (plasma physics), MPhil (energy economics), Director, Independent Climate Research Group, Northern Ireland and London (Founder of the energy/climate group at the Institute of Physics, London), United Kingdom
  61. Albert F. Jacobs, Geol.Drs., P. Geol., Calgary, Alberta, Canada
  62. Hans Jelbring, PhD Climatology, Stockholm University, MSc Electronic engineering, Royal Institute of Technology, BSc  Meteorology, Stockholm University, Sweden
  63. Bill Kappel, B.S. (Physical Science-Geology), B.S. (Meteorology), Storm Analysis, Climatology, Operation Forecasting, Vice President/Senior Meteorologist, Applied Weather Associates, LLC, University of Colorado, Colorado Springs, U.S.A.
  64. Olavi Kärner, Ph.D., Extraordinary Research Associate; Dept. of Atmospheric Physics, Tartu Observatory, Toravere, Estonia
  65. Leonid F. Khilyuk, PhD, Science Secretary, Russian Academy of Natural Sciences, Professor of Engineering (CO2/temp. focused research), University of Southern California, Los Angeles, California, U.S.A.
  66. William Kininmonth, MSc, MAdmin, former head of Australia’s National Climate Centre and a consultant to the World Meteorological Organization’s Commission for Climatology, Kew, Victoria, Australia
  67. Gerhard Kramm, Dr. rer. nat. (Theoretical Meteorology), Research Associate Professor, Geophysical Institute, Associate Faculty, College of Natural Science and Mathematics, University of Alaska Fairbanks, (climate specialties: atmospheric energetics, physics of the atmospheric boundary layer, physical climatology – see interesting paper by Kramm et al), Fairbanks, Alaska, U.S.A.
  68. Leif Kullman, PhD (Physical geography, plant ecology, landscape ecology), Professor, Physical geography, Department of Ecology and Environmental science, Umeå University, Areas of Specialization: Paleoclimate (Holocene to the present), glaciology, vegetation history, impact of modern climate on the living landscape, Umeå, Sweden
  69. Hans H.J. Labohm, PhD, Independent economist, author specialised in climate issues, IPCC expert reviewer, author of Man-Made Global Warming: Unravelling a Dogma and climate science-related Blog, The Netherlands
  70. Rune Berg-Edland Larsen, PhD (Geology, Geochemistry), Professor, Dep. Geology and Geoengineering, Norwegian University of Science and Technology (NTNU), Trondheim, Norway
  71. C. (Kees) le Pair, PhD (Physics Leiden, Low Temperature Physics), former director of the Netherlands Research Organization FOM (fundamental physics) and subsequently founder and director of The Netherlands Technology Foundation STW. Served the Dutch Government for many years as a member of its General Energy Council and of the National Defense Research Council. Royal Academy of Arts and Sciences Honorary Medal and honorary doctorate in all technical sciences of the Delft University of Technology, Nieuwegein, The Netherlands
  72. Douglas Leahey, PhD, meteorologist and air-quality consultant, past President – Friends of Science, Calgary, Alberta, Canada
  73. Jay Lehr, B.Eng. (Princeton), PhD (environmental science and ground water hydrology), Science Director, The Heartland Institute, Chicago, Illinois, U.S.A.
  74. Bryan Leyland, M.Sc., FIEE, FIMechE, FIPENZ, MRSNZ, consulting engineer (power), Energy Issues Advisor – International Climate Science Coalition, Auckland, New Zealand
  75. Edward Liebsch, B.A. (Earth Science, St. Cloud State University); M.S. (Meteorology, The Pennsylvania State University), former Associate Scientist, Oak Ridge National Laboratory; former Adjunct Professor of Meteorology, St. Cloud State University, Environmental Consultant/Air Quality Scientist (Areas of Specialization: micrometeorology, greenhouse gas emissions), Maple Grove, Minnesota, U.S.A.
  76. William Lindqvist, PhD (Applied Geology), Independent Geologic Consultant, Areas of Specialization: Climate Variation in the recent geologic past, Tiburon, California, U.S.A.
  77. Horst-Joachim Lüdecke, Prof. Dr., PhD (Physics), retired from the University of Applied Sciences HTW, Saarbrücken (Germany), atmospheric temperature research, speaker of the European Institute for Climate and Energy (EIKE), Heidelberg, Germany
  78. Anthony R. Lupo, Ph.D., Professor of Atmospheric Science, Department of Soil, Environmental, and Atmospheric Science, University of Missouri, Columbia, Missouri, U.S.A.
  79. Oliver Manuel, BS, MS, PhD, Post-Doc (Space Physics), Associate – Climate & Solar Science Institute, Emeritus Professor, College of Arts & Sciences University of Missouri-Rolla, previously Research Scientist (US Geological Survey) and NASA Principal Investigator for Apollo, Cape Girardeau, Missouri, U.S.A.
  80. Francis Massen, professeur-docteur en physique (PhD equivalent, Universities of Nancy (France) and Liège (Belgium)), Manager of the Meteorological Station of the Lycée Classique de Diekirch, specialising in the measurement of solar radiation and atmospheric gases, collaborator to the WOUDC (World Ozone and UV Radiation Data Center), Diekirch, Luxembourg
  81. Henri Masson, Prof. dr. ir., Emeritus Professor University of Antwerp (Energy & Environment Technology Management), Visiting professor Maastricht School of Management, specialist in dynamical (chaotic) complex system analysis, Antwerp, Belgium.
  82. Ferenc Mark Miskolczi, PhD, atmospheric physicist, formerly of NASA’s Langley Research Center, Hampton, Virginia, U.S.A.
  83. Viscount Monckton of Brenchley, Expert reviewer, IPCC Fifth Assessment Report, Quantification of Climate Sensitivity, Carie, Rannoch, Scotland
  84. Nils-Axel Mörner, PhD (Sea Level Changes and Climate), Emeritus Professor of Paleogeophysics & Geodynamics, Stockholm University, Stockholm, Sweden
  85. John Nicol, PhD (Physics, James Cook University), Chairman – Australian climate Science Coalition, Brisbane, Australia
  86. Ingemar Nordin, PhD, professor in philosophy of science (including a focus on “Climate research, philosophical and sociological aspects of a politicised research area”), Linköpings University, Sweden.
  87. David Nowell, M.Sc., Fellow of the Royal Meteorological Society, former chairman of the NATO Meteorological Group, Ottawa, Ontario, Canada
  88. Cliff Ollier, D.Sc., Professor Emeritus (School of Earth and Environment – see his Copenhagen Climate Challenge sea level article here), Research Fellow, University of Western Australia, Nedlands, W.A., Australia
  89. Oleg M. Pokrovsky, BS, MS, PhD (mathematics and atmospheric physics – St. Petersburg State University, 1970), Dr. in Phys. and Math. Sciences (1985), Professor in Geophysics (1995), principal scientist, Main Geophysical Observatory (RosHydroMet), Note: Dr. Pokrovsky analyzed long climate records and concluded that anthropogenic CO2 impact is not the main contributor to climate change, St. Petersburg, Russia
  90. Daniel Joseph Pounder, BS (Meteorology, University of Oklahoma), MS (Atmospheric Sciences, University of Illinois, Urbana-Champaign); Meteorological/Oceanographic Data Analyst for the National Data Buoy Center, formerly Meteorologist, WILL AM/FM/TV, Urbana, U.S.A.
  91. Brian Pratt, PhD, Professor of Geology (Sedimentology), University of Saskatchewan (see Professor Pratt’s article for a summary of his views), Saskatoon, Saskatchewan, Canada
  92. Harry N.A. Priem, PhD, Professore-emeritus isotope-geophysics and planetary geology, Utrecht University, past director ZWO/NOW Institute of Isotope Geophysical Research, Past-President Royal Netherlands Society of Geology and Mining, Amsterdam, The Netherlands
  93. Oleg Raspopov, Doctor of Science and Honored Scientist of the Russian Federation, Professor – Geophysics, Senior Scientist, St. Petersburg Filial (Branch) of N.V.Pushkov Institute of Terrestrial Magnetism, Ionosphere and Radiowaves Propagation of RAS (climate specialty: climate in the past, particularly the influence of solar variability), Editor-in-Chief of journal “Geomagnetism and Aeronomy” (published by Russian Academy of Sciences), St. Petersburg, Russia
  94. Curt G. Rose, BA, MA (University of Western Ontario), MA, PhD (Clark University), Professor Emeritus, Department of Environmental Studies and Geography, Bishop’s University, Sherbrooke, Quebec, Canada
  95. S. Jeevananda Reddy, M.Sc. (Geophysics), Post Graduate Diploma (Applied Statistics, Andhra University), PhD (Agricultural Meteorology, Australian University, Canberra), Formerly Chief Technical Advisor – United Nations World Meteorological Organization (WMO) & Expert – Food and Agriculture Organization (UN), Convener – Forum for a Sustainable Environment, author of 500 scientific articles and several books – here is one: “Climate Change – Myths & Realities”, Hyderabad, India
  96. Arthur Rorsch, PhD, Emeritus Professor, Molecular Genetics, Leiden University, former member of the board of management of the Netherlands Organization Applied Research TNO, Leiden, The Netherlands
  97. Rob Scagel, MSc (forest microclimate specialist), Principal Consultant – Pacific Phytometric Consultants, Surrey, British Columbia, Canada
  98. Chris Schoneveld, MSc (Structural Geology), PhD (Geology), retired exploration geologist and geophysicist, Australia and France
  99. Tom V. Segalstad, PhD (Geology/Geochemistry), Associate Professor of Resource and Environmental Geology, University of Oslo, former IPCC expert reviewer, former Head of the Geological Museum, and former head of the Natural History Museum and Botanical Garden (UO), Oslo, Norway
  100. John Shade, BS (Physics), MS (Atmospheric Physics), MS (Applied Statistics), Industrial Statistics Consultant, GDP, Dunfermline, Scotland, United Kingdom
  101. Thomas P. Sheahen, B.S., PhD (Physics, Massachusetts Institute of Technology), specialist in renewable energy, research and publication (applied optics) in modeling and measurement of absorption of infrared radiation by atmospheric CO2,  National Renewable Energy Laboratory (2005-2009); Argonne National Laboratory (1988-1992); Bell Telephone labs (1966-73), National Bureau of Standards (1975-83), Oakland, Maryland, U.S.A.
  102. S. Fred Singer, PhD, Professor Emeritus (Environmental Sciences), University of Virginia, former director, U.S. Weather Satellite Service, Science and Environmental Policy Project, Charlottesville, Virginia, U.S.A.
  103. Frans W. Sluijter, Prof. dr ir, Emeritus Professor of theoretical physics, Technical University Eindhoven, Chairman—Skepsis Foundation, former vice-president of the International Union of Pure and Applied Physics, former President of the Division on Plasma Physics of the European Physical Society and former bureau member of the Scientific Committee on Sun-Terrestrial Physics, Euvelwegen, the Netherlands
  104. Jan-Erik Solheim, MSc (Astrophysics), Professor, Institute of Physics, University of Tromsø, Norway (1971-2002), Professor (emeritus), Institute of Theoretical Astrophysics, University of Oslo, Norway (1965-1970, 2002- present), climate specialties: sun and periodic climate variations, scientific paper by Professor Solheim “Solen varsler et kaldere tiår“, Baerum, Norway
  105. H. Leighton Steward, Master of Science (Geology), Areas of Specialization: paleoclimates and empirical evidence that indicates CO2 is not a significant driver of climate change, Chairman, PlantsNeedCO2.org and CO2IsGreen.org, Chairman of the Institute for the Study of Earth and Man (geology, archeology & anthropology) at SMU in Dallas, Texas, Boerne, TX, U.S.A.
  106. Arlin B. Super, PhD (Meteorology – University of Wisconsin at Madison), former Professor of Meteorology at Montana State University, retired Research Meteorologist, U.S. Bureau of Reclamation, Saint Cloud, Minnesota, U.S.A.
  107. Edward (Ted) R. Swart, D.Sc. (physical chemistry, University of Pretoria), M.Sc. and Ph.D. (math/computer science, University of Witwatersrand). Formerly Director of the Gulbenkian Centre, Dean of the Faculty of Science, Professor and Head of the Department of Computer Science, University of Rhodesia, and past President of the Rhodesia Scientific Association. Set up the first radiocarbon dating laboratory in Africa. Most recently, Professor in the Department of Combinatorics and Optimization at the University of Waterloo and Chair of Computing and Information Science and Acting Dean at the University of Guelph, Ontario, Canada, now retired in Kelowna, British Columbia, Canada
  108. George H. Taylor, B.A. (Mathematics, U.C. Santa Barbara), M.S. (Meteorology, University of Utah), Certified Consulting Meteorologist, Applied Climate Services, LLC, Former State Climatologist (Oregon), President, American Association of State Climatologists (1998-2000), Corvallis, Oregon, U.S.A.
  109. J. E. Tilsley, P.Eng., BA Geol, Acadia University, 53 years of climate and paleoclimate studies related to development of economic mineral deposits, Aurora, Ontario, Canada
  110. Göran Tullberg, Civilingenjör i Kemi (equivalent to Masters of Chemical Engineering), Co-author – The Climate, Science and Politics (2009) (see here for a review), formerly instructor of Organic Chemistry (specialization in “Climate chemistry”), Environmental Control and Environmental Protection Engineering at University in Växjö; Falsterbo, Sweden
  111. Brian Gregory Valentine, PhD, Adjunct professor of engineering (aero and fluid dynamics specialization) at the University of Maryland, Technical manager at US Department of Energy, for large-scale modeling of atmospheric pollution, Technical referee for the US Department of Energy’s Office of Science programs in climate and atmospheric modeling conducted at American Universities and National Labs, Washington, DC, U.S.A.
  112. Bas van Geel, PhD, paleo-climatologist, Institute for Biodiversity and Ecosystem Dynamics, Research Group Paleoecology and Landscape Ecology, Faculty of Science, Universiteit van Amsterdam, Amsterdam, The Netherlands
  113. Gerrit J. van der Lingen, PhD (Utrecht University), geologist and paleoclimatologist, climate change consultant, Geoscience Research and Investigations, Nelson, New Zealand
  114. A.J. (Tom) van Loon, PhD, Professor of Geology (Quaternary Geology, specialism: Glacial Geology), Adam Mickiewicz University, former President of the European Association of Science Editors, Poznan, Poland
  115. Fritz Vahrenholt, B.S. (chemistry), PhD (chemistry), Prof. Dr., Professor of Chemistry, University of Hamburg, former Senator for environmental affairs of the State of Hamburg, former CEO of REpower Systems AG (wind turbines), author of the book Die kalte Sonne: warum die Klimakatastrophe nicht stattfindet (The Cold Sun: Why the Climate Crisis Isn’t Happening), Hamburg, Germany
  116. Michael G. Vershovsky, Ph.D. in meteorology (macrometeorology, long-term forecasts, climatology), Senior Researcher, Russian State Hydrometeorological University, works with, as he writes, “Atmospheric Centers of Action (cyclones and anticyclones, such as Icelandic depression, the South Pacific subtropical anticyclone, etc.). Changes in key parameters of these centers strongly indicate that the global temperature is influenced by these natural factors (not exclusively but nevertheless)”, St. Petersburg, Russia
  117. Gösta Walin, PhD and Docent (theoretical physics, University of Stockholm), Professor Emeritus in oceanography, Earth Science Center, Göteborg University, Göteborg, Sweden
  118. Anthony Watts, ItWorks/IntelliWeather, Founder, surfacestations.org and Watts Up With That, Chico, California, U.S.A.
  119. Carl Otto Weiss, Director and Professor at the Physikalisch-Technische Bundesanstalt, Visiting Professor at University of Copenhagen and Tokyo Institute of Technology, coauthor of “Multiperiodic Climate Dynamics: Spectral Analysis of…”, Braunschweig, Germany
  120. Forese-Carlo Wezel, PhD, Emeritus Professor of Stratigraphy (global and Mediterranean geology, mass biotic extinctions and paleoclimatology), University of Urbino, Urbino, Italy
  121. Boris Winterhalter, PhD, senior marine researcher (retired), Geological Survey of Finland, former professor in marine geology, University of Helsinki, Helsinki, Finland
  122. David E. Wojick, PhD,  PE, energy and environmental consultant, Technical Advisory Board member – Climate Science Coalition of America, Star Tannery, Virginia, U.S.A.
  123. George T. Wolff, Ph.D., Principal Atmospheric Scientist, Air Improvement Resource, Inc., Novi, Michigan, U.S.A.
  124. Thomas (Tom) Wysmuller – NASA (Ret.) ARC, GSFC, Hdq. – Meteorologist, Ogunquit, ME, U.S.A.
  125. Bob Zybach, PhD (Environmental Sciences, Oregon State University), climate-related carbon sequestration research, MAIS, B.S., Director, Environmental Sciences Institute Peer review Institute, Cottage Grove, Oregon, U.S.A.
  126. Milap Chand Sharma, PhD, Associate Professor of Glacial Geomorphology, Centre for the Study of Regional Development, Jawaharlal Nehru University, New Delhi, India
  127. Valentin A. Dergachev, PhD, Professor and Head of the Cosmic Ray Laboratory at Ioffe Physical-Technical Institute of Russian Academy of Sciences, St. Petersburg, Russia
  128. Vijay Kumar Raina, Ex-Deputy Director General, Geological Survey of India, Ex-Chairman Project Advisory and Monitoring Committee on Himalayan glacier, DST, Govt. of India and currently Member Expert Committee on Climate Change Programme, Dept. of Science & Technology, Govt. of India, author of 2010 MoEF Discussion Paper, “Himalayan Glaciers – State-of-Art Review of Glacial Studies, Glacial Retreat and Climate Change”, the first comprehensive study on the region.  Winner of the Indian Antarctica Award, Chandigarh, India
  129. Scott Chesner, B.S. (Meteorology, Penn State University), KETK Chief Meteorologist, KETK TV, previously Meteorologist with AccuWeather, Tyler, Texas, U.S.A.

*   *   *

Reactions (I will not mention names here; all are from emails in the EANTH list)

1) “Hmm, I clicked on a few links, googled a few names. Found that when one is listed as “author of x book”, said book doesn’t appear on Amazon, etc.

Many non-PhDs.

Many “consultants”. Lots of “adjuncts”, lots of professors emeriti. Someone is listed as an “Extraordinary Research Associate”.

Little actual data. Few peer-reviewed research reports.

Didn’t recognize most of the names. Did recognize some “suspicious” ones (e.g., Tim Ball, a lovely [sic] Canadian).

Misrepresentation of the SREX report (quotation is a minor comment on a single point of many – page 280 of 594).

Link is to a letter published in the Financial Post, the business section of the National Post, the more rightward leaning of Canada’s 2 national papers. To give an indication, on the day in 2007 when the Nobel was awarded to IPCC and Gore, the headline on front page was “A Coup for Junk Science: Gaffe riddled work undeserving”.

Conclusion: don’t bother to click the link.”

 

2) “A few of the names on the list are also contained in table 3 of the 2012 Heartland Institute Proposed Budget (pages 7-8). Namely:

Craig D. Idso

Anthony Lupo

Susan Crockford

Joseph D’Aleo

Fred Singer

Robert Carter

Link:

http://www.desmogblog.com/sites/beta.desmogblog.com/files/(1-15-2012)%202012%20Heartland%20Budget.pdf

 

3) “Is it that bad guys without phd and associations with the wrong institutions nullify the legitimacy of the good guys with proper credentials? Suppose you could not look them up. Would you be unable to judge the contents (with links to data) of the letter? You seem to require a certain kind of authority (defined by political means especially) to allow you to decide whether ideas are valuable. How sad. If every scientist were that intellectually timid there would be no learning. Thank goodness for the Feynmans of the world.”

 

4) “Short of being able to read, review and test all the science, a person has to make judgements based on additional criteria. My criteria include, but are not limited to, some things such as peer-review, credentials, reputation, availability of cited sources/affiliation/expertise, guilt-by-association, and so on. They are only part of the judgement of credibility. I looked up a book listed in the credentials of one “expert” and could not find it; I followed links, and so on.

The endpoint was when I looked for the quotation in the cited source (SREX) and evaluated it as misrepresentation. Since the IPCC was called on as an expert source by the so-called experts, yet it claimed other than what they claimed it claimed, the credibility of the letters and listed experts is to be disparaged.

That’s the character of science, and process of knowledge. Yup, that is how one really, really does judge which ideas are valuable.”

 

5) “Thank you for drawing attention to this open letter. I suspect quite a number of the people named in this letter are members of naysayer groups. From an Australian perspective, Prof Bob Carter is a member of the secretive Lavoisier Group. I have inside knowledge of this group, as my husband and I were approached to write a film script about climate change many years ago; we eventually pulled out after being told what they wanted to say about the science of climate change, which required a distortion of the facts. We had the impression that the money for the film ($6 million) was coming from America, and I wouldn’t mind betting that it was oil and mining interest finance. The person who set out to recruit us was a glaciologist who was also a member of the Lavoisier Group. For more information see the following:

Pearse, Guy, “High and Dry”, Viking/Penguin, Camberwell, Victoria, 2007.
Hamilton, Clive, “Scorcher: The Dirty Politics of Climate Change”, Black Inc. Agenda, Melbourne, Victoria, 2007.”

 

6) “I read a short and entertaining book that laid out a good process for deciding what to believe about climate change (or any other complex issue with lots of scientific research swirling around). It’s by Greg Craven and it’s called What’s the Worst That Could Happen?

Besides providing a way to cut through all the chatter, the book offers sound fundamentals for people interested in how scientific information comes to be accepted. I think it’s a great book for students, especially because the author (a physics teacher) tackles tough subjects with humor.

Here’s a link to it: http://www.amazon.com/Whats-Worst-That-Could-Happen/dp/0399535012/

The State of Climate Science (scienceprogress.org)

CLIMATE SCIENCE

A Thorough Review of the Scientific Literature on Global Warming

By Dr. James Powell | Thursday, November 15th, 2012

Polls show that many members of the public believe that scientists substantially disagree about human-caused global warming. The gold standard of science is the peer-reviewed literature. If there is disagreement among scientists, based not on opinion but on hard evidence, it will be found in the peer-reviewed literature.

I searched the Web of Science, an online science publication tool, for peer-reviewed scientific articles published between January 1, 1991, and November 9, 2012, that have the keyword phrases “global warming” or “global climate change.” The search produced 13,950 articles. See methodology.

I read whatever combination of titles, abstracts, and entire articles was necessary to identify articles that “reject” human-caused global warming. To be classified as rejecting, an article had to clearly and explicitly state that the theory of global warming is false or, as happened in a few cases, that some other process better explains the observed warming. Articles that merely claimed to have found some discrepancy, some minor flaw, some reason for doubt, I did not classify as rejecting global warming.

Articles about methods, paleoclimatology, mitigation, adaptation, and effects at least implicitly accept human-caused global warming and were usually obvious from the title alone. John Cook and Dana Nuccitelli also reviewed and assigned some of these articles; John provided invaluable technical expertise.

This work follows that of Oreskes (Science, 2004), who searched for articles published between 1993 and 2003 with the keyword phrase “global climate change.” She found 928, read the abstracts of each and classified them. None rejected human-caused global warming. Using her criteria and time span, I get the same result. Deniers attacked Oreskes and her findings, but they have held up.

Some articles on global warming may use other keywords, for example, “climate change” without the “global” prefix. But there is no reason to think that the proportion rejecting global warming would be any higher.

By my definition, 24 of the 13,950 articles, 0.17 percent or 1 in 581, clearly reject global warming or endorse a cause other than CO2 emissions for observed warming. The list of articles that reject global warming is here.

The 24 articles have been cited a total of 113 times over the nearly 21-year period, for an average of close to 5 citations each. That compares to an average of about 19 citations for articles answering to “global warming,” for example. Four of the rejecting articles have never been cited; four have citations in the double-digits. The most-cited has 17.
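The proportions above follow directly from the reported counts. As a quick check, a minimal sketch using the figures quoted in the article (not recomputed from Web of Science):

```python
# Counts reported in the article (not recomputed from Web of Science).
total_articles = 13_950            # peer-reviewed hits, Jan 1991 - Nov 2012
rejecting_articles = 24            # articles clearly rejecting human-caused warming
rejecting_citations = 113          # total citations of those 24 articles

share = rejecting_articles / total_articles
print(f"rejecting share: {share:.2%}")                                    # 0.17%
print(f"one in {round(total_articles / rejecting_articles)}")             # 1 in 581
print(f"mean citations: {rejecting_citations / rejecting_articles:.1f}")  # 4.7, i.e. close to 5
```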

Of one thing we can be certain: had any of these articles presented the magic bullet that falsifies human-caused global warming, that article would be on its way to becoming one of the most-cited in the history of science.

The articles have a total of 33,690 individual authors. The top ten countries represented, in order, are USA, England, China, Germany, Japan, Canada, Australia, France, Spain, and Netherlands. (The chart shows results through November 9th, 2012.)

Global warming deniers often claim that bias prevents them from publishing in peer-reviewed journals. But 24 articles in 18 different journals, collectively making several different arguments against global warming, expose that claim as false. Articles rejecting global warming can be published, but those that have been have earned little support or notice, even from other deniers.

A few deniers have become well known from newspaper interviews, Congressional hearings, conferences of climate change critics, books, lectures, websites and the like. Their names are conspicuously rare among the authors of the rejecting articles. Like those authors, the prominent deniers must have no evidence that falsifies global warming.

Anyone can repeat this search and post their findings. Another reviewer would likely have slightly different standards than mine and get a different number of rejecting articles. But no one will be able to reach a different conclusion, for only one conclusion is possible: Within science, global warming denial has virtually no influence. Its influence is instead on a misguided media, politicians all-too-willing to deny science for their own gain, and a gullible public.

Scientists do not disagree about human-caused global warming. It is the ruling paradigm of climate science, in the same way that plate tectonics is the ruling paradigm of geology. We know that continents move. We know that the earth is warming and that human emissions of greenhouse gases are the primary cause. These are known facts about which virtually all publishing scientists agree.

James Lawrence Powell is the author of The Inquisition of Climate Science. Powell is also the executive director of the National Physical Science Consortium, a partnership among government agencies and laboratories, industry, and higher education dedicated to increasing the number of American citizens with graduate degrees in the physical sciences and related engineering fields. This article is cross-posted with permission from the Columbia University Press blog.

This article is a cross-post with our partners at DeSmogBlog.

Mental health, another victim of climate change (IPS)

23/11/2012 – 10h05

by Patricia Grogg, IPS

Tension and anguish accompany anyone who goes through a disaster. Photo: Jorge Luis Baños/IPS

Santiago de Cuba, Cuba, 23/11/2012 – “The city looked as if it had been bombed. On my way to the office I passed people whose faces carried the same astonishment, I would say dramatic astonishment, as mine. We looked at each other and, without knowing one another, asked: how did it go for you? Did anything happen to your house? That affective solidarity was very important to me.” This testimony, given to IPS by a journalist from Santiago de Cuba, weighs one of the good sides of the collective reaction after a disaster like the one this city suffered in the early hours of October 25, when Hurricane Sandy, despite the meteorological alert and official warnings, caught a good part of its inhabitants by surprise.

The economic cost of the damage is still unknown today, as the easternmost part of the country heals wounds that are serious from every angle. But there is also the psychological impact, which is talked about less and can be seen in people’s eyes when they recount: “we lost our house with the furniture, the appliances, even the keepsakes.” “I was very afraid; I hid in the wardrobe when the wind tore the roof off my bedroom. My neighbors got me out of the house and helped me cross the street to where other families whose homes were in very bad shape had taken refuge,” Isabel da Cruz, 70, a resident of Guantánamo, another affected area, told IPS.

Depression, sadness, anguish, despair, uncertainty and aggressiveness: all are manifestations that accompany people after a disaster anywhere in the world. “Imagine, we went to bed with the beauty and woke up with the beast,” said a tourism worker whose hotel was completely destroyed. “People are depressed and disoriented. In many you can see the psychological imbalance caused by the losses they suffered,” Catholic priest Eugenio Castellanos, rector of the Sanctuary of the Caridad del Cobre, Cuba’s patron virgin, told IPS. The priest estimates that 90% of the houses in El Cobre, a town neighboring this city, were hit by Sandy.

Juan González Pérez, in turn, told IPS that in the days before the hurricane there were outbreaks of violence in some places, especially when it came to buying scarce goods. “We went many days without electricity, and they began selling ‘luz brillante’ (kerosene) for cooking. Although there was enough for everyone, there were arguments and fights in the line. When people get desperate, they tend to turn aggressive,” observed Pérez, better known as Madelaine, a leader of “muertero” crossed spiritism, an expression of popular religiosity in that place. He said he advises his followers “to stick together, to wash well, to give to those who have nothing, and not to despair.”

In Mar Verde, the beach where Sandy made landfall on Cuban territory 15 kilometers from Santiago, physician Elizabeth Martínez cares for more than one hundred people sheltered in vacation cabins that, being farther from the sea, were spared by the disaster. “The psychological impact is great, but there were no deaths and we have no one sick,” she said. A little over a week after the hurricane passed, health efforts were focused mainly on containing epidemic outbreaks. “We are giving residents health information, teaching them how to deal with communicable diseases and about the importance of decontaminating water before drinking it,” the doctor reported.

According to specialized sources, it is estimated that between one third and one half of a population exposed to a disaster suffers some kind of psychological problem, although in most cases these should be understood as normal reactions to extreme events, which under the impact of climate change threaten to grow in intensity.

“When I found my neighbors in the shelter, we were in shock. But someone said: let’s clear the entrance, it’s blocked by those fallen trees. So we got to work, although at first nobody spoke,” recounted a woman from the tourism sector. In the first days, many people could be seen clearing debris and sweeping the streets of their neighborhoods.

Faced with more frequent and more intense tropical cyclones, health authorities began in the 1990s to worry about the psychological impact of the disasters caused by these and other natural phenomena. In 2008, when the country suffered three hurricanes, a ministerial directive strengthened the inclusion of the topic in health plans. In an article on the subject, Cuban physician Alexis Lorenzo Ruiz explains that the psychosocial aspects of disasters are addressed both in staff training and in the organization of programs that reach the whole country and emphasize attention to the most vulnerable groups, such as minors, adolescents and the elderly.

From the standpoint of mental health, in a disaster the entire population “suffers tension and anguish to a greater or lesser degree, directly or indirectly,” stated Katia Villamil and Orlando Fleitas, who cautioned that the impact in these circumstances is more pronounced among low-income populations. These professionals report that the most frequent reactions range from those considered normal, such as controllable anxiety, mild depression or “hysteriform” episodes, to “peritraumatic” stress, emotional numbing, reduced attention, decompensation of pre-existing psychiatric disorders, and “collective agitation reactions.”

Hurricane Sandy wreaked havoc not only in Santiago de Cuba but also in the provinces of Guantánamo and Holguín, leaving 11 dead. Raúl Castro’s government has not yet disclosed the economic losses, although preliminary and incomplete data from the first days pointed to an estimate of US$ 88 million.

Call to Modernize Antiquated Climate Negotiations (Science Daily)

ScienceDaily (Nov. 18, 2012) — The structure and processes of United Nations climate negotiations are “antiquated,” unfair and obstruct attempts to reach agreements, according to research published November 18.

The findings come ahead of the 18th UN Climate Change Summit, which starts in Doha on November 26.

The study, led by Dr Heike Schroeder from the University of East Anglia (UEA) and the Tyndall Centre for Climate Change Research, argues that the consensus-based decision making used by the United Nations Framework Convention on Climate Change (UNFCCC) stifles progress and contributes to negotiating deadlocks, which ultimately hurts poor countries more than rich countries.

It shows that delegations from some countries taking part have increased in size over the years, while others have decreased, limiting poor countries’ negotiating power and making their participation less effective.

Writing in the journal Nature Climate Change, Dr Schroeder, Dr Maxwell Boykoff of the University of Colorado and Laura Spiers of Pricewaterhouse Coopers, argue that changes are long overdue if demands for climate mitigation and adaptation agreements are to be met.

They recommend that countries consider capping delegation numbers at a level that allows broad representation across government departments and sectors of society, while maintaining a manageable overall size.

Dr Schroeder, of UEA’s School of International Development, will be attending COP18. She said: “The UN must recognize that these antiquated structures serve to constrain rather than compel co-operation on international climate policy. The time is long overdue for changes to institutions and structures that do not support decision-making and agreements.

“Poor countries cannot afford to send large delegations and their level of expertise usually remains significantly below that of wealthier countries. This limits poor countries’ negotiating power and makes their participation in each session less effective.”

The researchers found that attendance has changed in terms of the number and diversity of representatives. The number of delegates went from 757 representing 170 countries at the first COP in 1995 to 10,591 individuals from 194 countries attending COP15 in 2009, a nearly 14-fold increase. At COP15 there were also 13,500 delegates from 937 non-government Observer organisations.
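The growth arithmetic can be verified in a couple of lines; a minimal sketch using the delegate counts quoted above:

```python
cop1_delegates = 757       # COP1, Berlin, 1995
cop15_delegates = 10_591   # COP15, Copenhagen, 2009

growth = cop15_delegates / cop1_delegates
print(f"{growth:.1f}-fold growth")              # 14.0-fold growth
print(f"{(growth - 1) * 100:.0f}% increase")    # 1299% increase
```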

Small developing countries have down-sized their delegations while G-7 and +5 countries (Brazil, China, India, Mexico, and South Africa) have increased theirs. The exception is the United States, which after withdrawing from the Kyoto Protocol started to send fewer delegates to COPs.

The study also looked at the make-up of the delegations and found an increase in participation by environmental, campaigning, academic and other non-Governmental organisations.

“Our work shows an increasing trend in the size of delegations on one side and a change in the intensity, profile and politicization of the negotiations on the other,” explained Dr Schroeder. “These variations suggest the climate change issue and its associated interests are framed quite differently across countries. NSAs are well represented on national delegations but clearly the government decides who is included and who is not, and what the official negotiating position of the country and its level of negotiating flexibility are.”

Some countries send large representations from business associations (Brazil), local government (Canada) or science and academia (Russia). For small developing countries such as Bhutan and Gabon, the majority of government representatives come from environment, forestry and agriculture. The UK has moved from mainly environment, forestry and agriculture to energy and natural resources. The US has shifted from these more conventional areas to an overwhelming representation from the US Congress at COP15.

Journal Reference:

  1. Heike Schroeder, Maxwell T. Boykoff, Laura Spiers. Equity and state representations in climate negotiations. Nature Climate Change, 2012; DOI: 10.1038/nclimate1742

Government, Industry Can Better Manage Risks of Very Rare Catastrophic Events, Experts Say (Science Daily)

ScienceDaily (Nov. 15, 2012) — Several potentially preventable disasters have occurred during the past decade, including the recent outbreak of rare fungal meningitis linked to steroid shots given to 13,000 patients to relieve back pain. Before that, the 9/11 terrorist attacks in 2001, the Space Shuttle Columbia explosion in 2003, the financial crisis that started in 2008, the Deepwater Horizon accident in the Gulf of Mexico in 2010, and the Fukushima tsunami and ensuing nuclear accident in 2011 were among rare and unexpected disasters that were considered extremely unlikely or even unthinkable.

A Stanford University engineer and risk management expert has analyzed the phenomenon of government and industry waiting for rare catastrophes to happen before taking risk management steps. She concluded that a different approach to these events would go far towards anticipating them, preventing them or limiting the losses.

To examine the risk management failures discernible in several major catastrophes, the research draws upon the combination of systems analysis and probability as used, for example, in engineering risk analysis. When relevant statistics are not available, it discusses the powerful alternative of systemic risk analysis to try to anticipate and manage the risks of highly uncertain, rare events. The paper by Stanford University researcher Professor Elisabeth Paté-Cornell recommends “a systematic risk analysis anchored in history and fundamental knowledge” as opposed to both industry and regulators sometimes waiting until after a disaster occurs to take safety measures, as was the case, for example, with the Deepwater Horizon accident in 2010. Her paper, “On ‘Black Swans’ and ‘Perfect Storms’: Risk Analysis and Management When Statistics Are Not Enough,” appears in the November 2012 issue of Risk Analysis, published by the Society for Risk Analysis.

Paté-Cornell’s paper draws upon two commonly cited images representing different types of uncertainty — “black swans” and “perfect storms” — that are used both to describe extremely unlikely but high-consequence events and often to justify inaction until after the fact. The uncertainty in “perfect storms” derives mainly from the randomness of rare but known events occurring together. The uncertainty in “black swans” stems from the limits of fundamental understanding of a phenomenon, including in extreme cases, a complete lack of knowledge about its very existence.

Given these two extreme types of uncertainties, Paté-Cornell asks what has been learned about rare events in engineering risk analysis that can be incorporated in other fields such as finance or medicine. She notes that risk management often requires “an in-depth analysis of the system, its functions, and the probabilities of its failure modes.” The discipline confronts uncertainties by systematic identification of failure “scenarios,” including rare ones, using “reasoned imagination,” signals (new intelligence information, medical alerts, near-misses and accident precursors) and a set of analytical tools to assess the chances of events that have not happened yet. A main emphasis of systemic risk analysis is on dependencies (of failures, human errors, etc.) and on the role of external factors, such as earthquakes and tsunamis that become common causes of failure.

The “risk of no risk analysis” is illustrated by the case of the 14 meter Fukushima tsunami resulting from a magnitude 9 earthquake. Historical records showed that large tsunamis had occurred at least twice before in the same area. The first time was the Sanriku earthquake in the year 869, which was estimated at magnitude 8.6 with a tsunami that penetrated 4 kilometers inland. The second was the Sanriku earthquake of 1611, estimated at magnitude 8.1 that caused a tsunami with an estimated maximum wave height of about 20 meters. Yet, those previous events were not factored into the design of the Fukushima Dai-ichi nuclear reactor, which was built for a maximum wave height of 5.7 meters, simply based on the tidal wave caused in that area by the 1960 earthquake in Chile. Similar failures to capture historical data and various “signals” occurred in the cases of the 9/11 attacks, the Columbia Space Shuttle explosion and other examples analyzed in the paper.

The risks of truly unimaginable events that have never been seen before (such as the AIDS epidemics) cannot be assessed a priori, but careful and systematic monitoring, signals observation and a concerted response are keys to limiting the losses. Other rare events that place heavy pressure on human or technical systems are the result of convergences of known events (“perfect storms”) that can and should be anticipated. Their probabilities can be assessed using a set of analytical tools that capture dependencies and dynamics in scenario analysis. Given the results of such models, there should be no excuse for failing to take measures against rare but predictable events that have damaging consequences, and to react to signals, even imperfect ones, that something new may be unfolding.

Journal Reference:

  1. Elisabeth Paté-Cornell. On “Black Swans” and “Perfect Storms”: Risk Analysis and Management When Statistics Are Not Enough. Risk Analysis, 2012; DOI: 10.1111/j.1539-6924.2011.01787.x

Study improves the precision of climate simulations (Folha de São Paulo)

JC e-mail 4621, November 9, 2012.

Americans have developed an indirect method to account for the role of clouds in the planet’s warming. Their methodology indicates that the most likely outcome this century is an increase in mean temperature of close to 4°C.

American researchers have just found a way to determine which climate change models appear to be the most accurate. And the bad news: the best ones are those that predict the most drastic changes in the climate over the coming decades.

The key to the work, led by John Fasullo and Kevin Trenberth of the National Center for Atmospheric Research in Boulder, Colorado, was to focus on what could be seen, in this case relative humidity in subtropical regions, in order to understand what is much harder to measure: cloud dynamics.

Clouds are one of the key elements in interpreting global warming, because they have a twofold effect. On the one hand, being bright, they reflect sunlight back into space, resulting in cooling. On the other, the water vapor in them is a powerful greenhouse gas and can produce warming.

Computer models have difficulty dealing with clouds and their role in the evolution of the climate.

Uncertainty, up to a point – It is already possible to simulate their effect, at least in part, and there is a more or less clear consensus that the sum of everything they do results in mild cooling. However, there is still much uncertainty about what this means for the future.

That uncertainty is the great affliction of global warming science. Detractors tend to point to it as proof that fear of climate change is much more an ideological movement than an inescapable scientific conclusion.

By all indications, however, the uncertainty concerns the level of warming over the coming decades, not the phenomenon itself. Some models suggest that over the next hundred years we will see an increase in mean temperature on the order of 4.5 degrees Celsius. The more modest ones predict no more than a variation of 1.5 degrees Celsius.

That is where Fasullo and Trenberth’s insight came in. Since it is difficult to observe cloud properties directly and compare them with what the models offer, they decided to study the relative humidity of the air, especially in the generally drier subtropical regions.

The advantage is that relative humidity data are obtained reliably from satellites, so the models’ predictions for the present can be checked against real observations. There is also a strong correlation between relative humidity and cloud formation, so from one it is possible to infer the effect of the other. The awkward part is that the models that appear to be most correct are precisely those that predict the strongest changes, on the order of 4.5°C.

The cloud question, however, is not the only source of uncertainty. “This work is just one piece of the climate sensitivity puzzle,” says Karen Shell of Oregon State University (USA), who commented on the research in the same issue of the journal Science in which the results appeared.

Natural disasters are an obstacle to Brazil’s development (O Globo)

JC e-mail 4621, November 9, 2012.

World Bank specialist Joaquín Toro says the floods of the last five years cost R$ 15 billion; the problem is expected to worsen with climate change.

Brazil likes to imagine itself as a country free of natural disasters. Is that true?
Brazil does not have catastrophic events that affect the whole country, such as tsunamis, earthquakes or hurricanes. That is, not with great intensity. In fact we do have earthquakes, there are seismic zones in Minas and in the Northeast, and tropical cyclones: there were two in the last ten years, though not very large. There is a perception in the country that there are no catastrophic events. But when we look state by state, we see great losses, both human and economic.

Which was the worst of them?
In the last five years we had four major events. The first, in 2008, was the flooding of the Itajaí Valley in Santa Catarina. We also had floods in Pernambuco and Alagoas in 2010, and the flash floods in Rio’s Serrana Region at the beginning of last year. Which was the worst, which had the greatest impact, depends on what is taken into account. In terms of lives lost, Rio de Janeiro’s was the worst in Brazil in recent times, with around a thousand dead. But if we consider the economic impact relative to the state’s GDP, for example, Alagoas was hit hardest: almost 8% of GDP.

Why carry out the studies now?
A systematic assessment of the economic impact of disasters had never been done in Brazil. It involves not only direct losses, such as the destruction of a bridge, a school, infrastructure, but also the impact of losing that bridge on economic output. That kind of assessment was not very systematized. There was a culture of paying for the disaster. Since in general not many people die, the perception is that the disaster was not large. But economically it was catastrophic.

Even compared with Hurricane Sandy in the US?
The hurricane had an economic impact of US$ 50 billion, equivalent to 2% of the affected region’s GDP. In Alagoas, the loss was 8% of GDP. Of course, Alagoas is one of the poorest states in Brazil, so any impact will be large. But what we want to demonstrate is that this can be an obstacle to development.

How does that happen?
Generally, to pay for the disaster and the reconstruction, resources have to be found somewhere. First, the municipality starts using every resource it has. Its development plans, social programs, education and health spending all go to cover the reconstruction. Then come the state and federal transfers, which also come out of some budget, because there is no emergency fund. Other states end up being affected.

How much development was lost because of these events?
We do not have that number, but the economic impact of natural disasters over the last five years was R$ 15 billion. The question is: what could we have done with R$ 15 billion?

Is it more expensive to rebuild?
Much more expensive. Studies show that each dollar invested in prevention or risk reduction represents a saving of 5 to 7 dollars in recovery.

Why is there no prevention?
On the one hand, we did not have much knowledge of the risk; we did not understand the problem. There is no culture of prevention, and people forget very quickly what happened five or ten years ago. But things are changing. There is a new risk-reduction policy.

Does it get worse with global warming?
The question is what we will do to prevent the disorderly growth of cities. With 10% to 20% more rain but well-managed cities, the impact will be much smaller. But if we cannot adapt, it will be even harder. We will have more rain and more droughts, and high climate variability.