Jul 3, 1:10 PM EDT
By SETH BORENSTEIN
AP Science Writer
WASHINGTON (AP) — Is it just freakish weather or something more? Climate scientists suggest that if you want a glimpse of some of the worst of global warming, take a look at U.S. weather in recent weeks.
Horrendous wildfires. Oppressive heat waves. Devastating droughts. Flooding from giant deluges. And a powerful freak wind storm called a derecho.
These are the kinds of extremes experts have predicted will come with climate change, although it’s far too early to say that is the cause. Nor will they say global warming is the reason 3,215 daily high temperature records were set in the month of June.
Scientifically linking individual weather events to climate change takes intensive study, complicated mathematics, computer models and lots of time. Sometimes it isn’t caused by global warming. Weather is always variable; freak things happen.
And this weather has been local. Europe, Asia and Africa aren’t having similar disasters now, although they’ve had their own extreme events in recent years.
But since at least 1988, climate scientists have warned that climate change would bring, in general, increased heat waves, more droughts, more sudden downpours, more widespread wildfires and worsening storms. In the United States, those extremes are happening here and now.
So far this year, more than 2.1 million acres have burned in wildfires, more than 113 million people in the U.S. were in areas under extreme heat advisories last Friday, two-thirds of the country is experiencing drought, and earlier in June, deluges flooded Minnesota and Florida.
“This is what global warming looks like at the regional or personal level,” said Jonathan Overpeck, professor of geosciences and atmospheric sciences at the University of Arizona. “The extra heat increases the odds of worse heat waves, droughts, storms and wildfire. This is certainly what I and many other climate scientists have been warning about.”
Kevin Trenberth, head of climate analysis at the National Center for Atmospheric Research in fire-charred Colorado, said these are the very record-breaking conditions he has said would happen, but many people wouldn’t listen. So it’s “I told you so” time, he said.
As recently as March, a special report on extreme events and disasters by the Nobel Prize-winning Intergovernmental Panel on Climate Change warned of “unprecedented extreme weather and climate events.” Its lead author, Chris Field of the Carnegie Institution and Stanford University, said Monday, “It’s really dramatic how many of the patterns that we’ve talked about as the expression of the extremes are hitting the U.S. right now.”
“What we’re seeing really is a window into what global warming really looks like,” said Princeton University geosciences and international affairs professor Michael Oppenheimer. “It looks like heat. It looks like fires. It looks like this kind of environmental disasters.”
Oppenheimer said that on Thursday. That was before the East Coast was hit with triple-digit temperatures and before a derecho – a large, powerful and long-lasting straight-line wind storm – blew from Chicago to Washington. The storm and its aftermath killed more than 20 people and left millions without electricity. Experts say it had energy readings five times that of normal thunderstorms.
Fueled by the record high heat, this was among the strongest of this type of storm in the region in recent history, said research meteorologist Harold Brooks of the National Severe Storms Laboratory in Norman, Okla. Scientists expect “non-tornadic wind events” like this one and other thunderstorms to increase with climate change because of the heat and instability, he said.
Such patterns haven’t happened only in the past week or two. The spring and winter in the U.S. were the warmest on record and among the least snowy, setting the stage for the weather extremes to come, scientists say.
Since Jan. 1, the United States has set more than 40,000 hot temperature records, but fewer than 6,000 cold temperature records, according to the National Oceanic and Atmospheric Administration. Through most of last century, the U.S. used to set cold and hot records evenly, but in the first decade of this century America set two hot records for every cold one, said Jerry Meehl, a climate extreme expert at the National Center for Atmospheric Research. This year the ratio is about 7 hot to 1 cold. Some computer models say that ratio will hit 20-to-1 by midcentury, Meehl said.
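The ratio Meehl describes can be checked with a quick back-of-the-envelope calculation. The record counts below are the approximate figures from the article; the 20-to-1 midcentury figure is a model projection, not something this arithmetic produces.

```python
# Hot vs. cold record counts reported for 2012 through early July (approximate).
hot_records_2012 = 40_000
cold_records_2012 = 6_000

# Ratio of hot records to cold records — the article's "about 7 hot to 1 cold."
ratio = hot_records_2012 / cold_records_2012
print(f"2012 hot-to-cold record ratio: about {ratio:.0f} to 1")  # → about 7 to 1
```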
“In the future you would expect larger, longer, more intense heat waves, and we’ve seen that in the last few summers,” NOAA Climate Monitoring chief Derek Arndt said.
The 100-degree heat, drought, early snowpack melt and beetles waking from hibernation early to strip trees all combined to set the stage for the current unusual spread of wildfires in the West, said University of Montana ecosystems professor Steven Running, an expert on wildfires.
While at least 15 climate scientists told The Associated Press that this long hot U.S. summer is consistent with what is to be expected in global warming, history is full of such extremes, said John Christy at the University of Alabama in Huntsville. He’s a global warming skeptic who says, “The guilty party in my view is Mother Nature.”
But the vast majority of mainstream climate scientists, such as Meehl, disagree: “This is what global warming is like, and we’ll see more of this as we go into the future.”
Intergovernmental Panel on Climate Change report on extreme weather: http://ipcc-wg2.gov/SREX/
U.S. weather records:
Seth Borenstein can be followed at http://twitter.com/borenbears
* * *
July 3, 2012
To Predict Environmental Doom, Ignore the Past
By Todd Myers
The information presented here cannot be used directly to calculate Earth’s long-term carrying capacity for human beings because, among other things, carrying capacity depends on both the affluence of the population being supported and the technologies supporting it. – Paul Ehrlich, 1986
One would expect scientists to pause when they realize their argument about resource collapse makes the king of environmental catastrophe, Paul Ehrlich, look moderate by comparison. Ehrlich is best known for a 40-year series of wildly inaccurate predictions of looming environmental disaster. Yet he looks positively reasonable compared to a paper recently published in the scientific journal Nature titled “Approaching a state shift in Earth’s biosphere.”
The paper predicts we are rapidly approaching a moment of “planetary-scale critical transition,” due to overuse of resources, climate change and other human-caused environmental damage. As a result, the authors conclude, this will “require reducing world population growth and per-capita resource use; rapidly increasing the proportion of the world’s energy budget that is supplied by sources other than fossil fuels,” and a range of other drastic policies. If these sound much like the ideas proposed in the 1970s by Ehrlich and others, like The Club of Rome, it is not a coincidence. The Nature paper is built on Ehrlich’s assumptions and cites his work more than once.
The Nature article, however, suffers from numerous simple statistical errors and assumptions rather than evidence. Its authors do nothing to deal with the fundamental mistakes that led Ehrlich and others like him down the wrong path so many times. Instead, the paper simply argues that with improved data, this time their predictions of doom are correct.
Ultimately, the piece is a good example of the great philosopher of science Thomas Kuhn’s hypothesis, written 50 years ago, that scientists often attempt to fit the data to conform to their particular scientific paradigm, even when that paradigm is obviously flawed. When confronted with failure to explain real-world phenomena, the authors of the Nature piece have, as Kuhn described in The Structure of Scientific Revolutions, devised “numerous articulations and ad hoc modifications of their theory in order to eliminate any apparent conflict.” Like scientists blindly devoted to a failed paradigm, the Nature piece simply tries to force new data to fit a flawed concept.
“Assuming this does not change”
During the last half-century, the world has witnessed a dramatic increase in food production. According to the U.N.’s Food and Agriculture Organization, yields per acre of rice have more than doubled, corn yields are more than one-and-a-half times larger than 50 years ago, and wheat yields have almost tripled. As a result, even as human population has increased, worldwide hunger has declined.
Despite these well-known statistics, the authors of the Nature study assume not only no future technological improvements, but that none have occurred over the last 200 years. The authors simply choose one data point and then project it both into the past and into the future. The authors explain the assumption that underlies their thesis in the caption to a graphic showing the Earth approaching environmental saturation. They write:
“The percentages of such transformed lands… when divided by 7,000,000,000 (the present global human population) yield a value of approximately 2.27 acres (0.92 ha) of transformed land for each person. That value was used to estimate the amount of transformed land that probably existed in the years 1800, 1900 and 1950, and which would exist in 2025 and 2045 assuming conservative population growth and that resource use does not become any more efficient.” (emphasis added)
In other words, the basis for their argument ignores the easily accessible data from the last half century. They take a snapshot in time and mistake it for a historical trend. In contrast to their claim of no change in the efficient use of resources, it would be difficult to find a time period in the last millennium when resource use did not become more efficient.
Ironically, this is the very error Ehrlich warns against in his 1986 paper – a paper the authors themselves cite several times. Despite Ehrlich’s admonition that projections of future carrying capacity are dependent upon technological change, the authors of the Nature article ignore history to come to their desired conclusion.
A Paradigm of Catastrophe
What would lead scientists to make such simplistic assumptions and flat-line projections? Indeed, what would lead Nature editors to print an article whose statistical underpinnings are so flawed? The simple belief in the paradigm of inevitable environmental catastrophe: humans are doing irreparable damage to the Earth, and every bit of resource use moves us closer to that catastrophe. The catastrophe paradigm rests on a simple model: eventually we will run out of space and resources, and determining the date of ultimate doom is just a matter of doing the math.
Believing in this paradigm also justifies exaggeration in order to stave off the serious consequences of collapse. Thus, the authors describe the United Nations’ likely population estimate for 2050 as “the most conservative,” without explaining why. They claim “rapid climate change shows no signs of slowing” without providing a source citation for the claim, and despite an actual slowing of climate change over the last decade.
The need to avoid perceived global catastrophe also encourages the authors to blow past warning signs that their analysis is not built on solid foundations – as if the poor history of such projections were not already warning enough. Even as they admit the interactions “between overlapping complex systems, however, are proving difficult to characterize mathematically,” they base their conclusions on the simplest linear mathematical estimate that assumes nothing will change except population over the next 40 years. They then draw a straight line, literally, from today to the environmental tipping point.
Why is such an unscientific approach allowed to pass for science in a respected international journal? Because whatever the argument does not supply, the paradigm conveniently fills in. Even if the math isn’t reliable and there are obvious counterarguments, “everyone” understands and believes in the underlying truth – we are nearing the limits of the planet’s ability to support life. In this way the conclusion is not proven but assumed, making the supporting argument an impenetrable tautology.
Such a circumstance creates the conditions of scientific revolutions, where the old paradigm fails to explain real-world phenomena and is replaced by an alternative. Given the record of failure of the paradigm of resource catastrophe, dating back to the 1970s, one would hope we are moving toward such a change. Unfortunately, Nature and the authors of the piece are clinging to the old resource-depletion model, simply trying to re-work the numbers.
Let us hope policymakers recognize the failure of that paradigm before they make costly and dangerous policy mistakes that impoverish billions in the name of false scientific assumptions.
* * *
The Washington Policy Center labels itself as a non-partisan think tank. That’s a mischaracterization, to say the least, but it is their bread and butter. Based in Seattle, with a director in Spokane, the WPC’s mission is to “promote free-market solutions through research and education.” It makes sense that they have an environmental director in the form of Todd Myers, who has a new book called “Eco-Fads: How The Rise Of Trendy Environmentalism Is Harming The Environment.” You know, since polar bears love to swim.
From the WPC’s newsletter:
Wherever we turn, politicians, businesses and activists are promoting the latest fashionable “green” policy or product. Green buildings, biofuels, electric cars, compact fluorescent lightbulbs and a variety of other technologies are touted as the next key step in protecting the environment and promoting a sustainable future. Increasingly, however, scientific and economic information regarding environmental problems takes a back seat to the social and personal value of being seen and perceived as “green.”
As environmental consciousness has become socially popular, eco-fads supplant objective data. Politicians pick the latest environmental agenda in the same way we choose the fall fashions – looking for what will yield the largest benefit with our public and social circles.
Eco-Fads exposes the pressures that cause politicians, businesses, the media and even scientists to fall for trendy environmental fads. It examines why we fall for such fads, even when we should know better. The desire to “be green” can cloud our judgment, causing us to place things that make us appear green ahead of actions that may be socially invisible yet environmentally responsible.
By recognizing the range of forces that have taken us in the wrong direction, Eco-Fads shows how we can begin to get back on track, creating a prosperous and sustainable legacy for our planet’s future. Order Eco-Fads today for $26.95 (tax and shipping included).
This is what the newsletter doesn’t tell you about Todd Myers.
Myers has spoken at the Heartland Institute’s International Conference on Climate Change. In case you didn’t know, the Heartland Institute has received significant funding from ExxonMobil, Philip Morris and numerous other corporations and conservative foundations with vested interest in the so-called debate around climate change. That conference was co-sponsored by numerous prominent climate change denier groups, think tanks and lobby groups, almost all of which have received money from the oil industry.
Why not just call it the Washington Fallacy Center? For a little more background, including ties back to the Koch Brothers, go HERE. In fact, Jack Kemp calls it “The Heritage Foundation of the Northwest.”
* * *
Photo by USAF.
The wildfires raging through Colorado and the West are unbelievable. As of yesterday there were 242 fires burning, according to the National Interagency Fire Center. Almost 350 homes have been destroyed in Colorado Springs, where 36,000 people have been evacuated from their homes. President Obama is visiting today to assess the devastation for himself.
Obviously the priority is containing the fires and protecting people. But inevitably the question is going to come up: Did climate change “cause” the fires? Regular readers know that this question drives me a little nuts. Pardon the long post, but I want to try to tackle this causation question once and for all.
What caused the Colorado Springs fire? Well, it was probably a careless toss of a cigarette butt, or someone burning leaves in their backyard, or a campfire that wasn’t properly doused. [UPDATE: Turns out it was lightning.] That spark, wherever it came from, is what triggered the cascading series of events we call “a fire.” It was what philosophers call the proximate cause, the most immediate, the closest.
All the other factors being discussed — the intense drought covering the state, the dead trees left behind by bark beetles, the high winds — are distal causes. Distal causes are less tightly connected to their effects. The dead trees didn’t make any particular fire inevitable; there can be no fire without a spark. What they did is make it more likely that a fire would occur. Distal causes are like that: probabilistic. Nonetheless, our intuitions tell us that distal causes are in many ways more satisfactory explanations. They tell us something about the meaning of events, not just the mechanisms, which is why they’re also called “ultimate” causes. It’s meaning we usually want.
When we say, “the fires in Colorado were caused by unusually dry conditions, high winds, and diseased trees,” no one accuses us of error or imprecision because it was “really” the matches or campfires that caused them. We are not expected to say, “no individual fire can be definitively attributed to hot, windy conditions, but these are the kinds of fires we would expect to see in those conditions.” Why waste the words? We are understood to be talking about distal causes.
When we talk about, not fires themselves, but the economic and social impacts of fires, the range of distal causes grows even broader. For a given level of damages, it’s not enough to have dry conditions and dead trees, not even enough to have fire — you also have to take into account the density of development, the responsiveness of emergency services, and the preparedness of communities for prevention or evacuation.
So if we say, “the limited human toll of the Colorado fires is the result of the bravery and skill of Western firefighters,” no one accuses us of error or imprecision because good firefighting was only one of many contributors to the final level of damages. Everything from evacuation plans to the quality of the roads to the vagaries of the weather contributed in some way to that state of affairs. But we are understood to be identifying a distal cause, not giving a comprehensive account of causation.
What I’m trying to say is, we are perfectly comfortable discussing distal causes in ordinary language. We don’t require scientistic literalism in our everyday talk.
The reason I’m going through all this, you won’t be surprised, is to tie it back to climate change. We know, of course, that climate change was not the proximate cause of the fires. It was a distal cause; it made the fires more likely. That much we know with a high degree of confidence, as this excellent review of the latest science by Climate Communication makes clear.
One can distinguish between distal causes by their proximity to effects. Say the drought made the fires 50 percent more likely than average June conditions in Colorado. (I’m just pulling these numbers out of my ass to illustrate a point.) Climate change maybe only made the fires 1 percent more likely. As a cause, it is more distal than the drought. And there are probably causes even more distal than climate change. Maybe the exact tilt of the earth’s axis this June made the fires 0.0001 percent more likely. Maybe the location of a particular proton during the Big Bang made them 0.000000000000000001 percent more likely. You get the point.
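The spectrum of distal causes described above can be sketched as a chain of likelihood multipliers. To be clear, these numbers are the same made-up illustrative ones as in the text, not real attribution estimates, and the baseline risk is an arbitrary placeholder.

```python
# Illustrative only: multipliers echo the text's admittedly invented numbers.
baseline_fire_risk = 0.010  # hypothetical chance of a fire in an average June

# Each distal cause scales the likelihood; the more proximate the cause,
# the larger its multiplier.
causes = [
    ("drought",        1.50),       # 50% more likely — relatively proximate
    ("climate change", 1.01),       # 1% more likely — more distal
    ("axis tilt",      1.000001),   # vanishingly distal
]

risk = baseline_fire_risk
for name, multiplier in causes:
    risk *= multiplier
    print(f"after {name}: {risk:.9f}")
```

The point the exercise makes is that every cause in the chain "caused" the fire in the yes-or-no sense, while the multipliers capture how proximate each one is.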
With this in mind, it’s clear that the question as it’s frequently asked — “did climate change cause the fires?” — is not going to get us the answer we want. If it’s yes or no, the answer is “yes.” But that doesn’t tell us much. What people really want to know when they ask that question is, “how proximate a cause is climate change?”
When we ask the question like that, we start to see why climate is such a wicked problem. Human beings, by virtue of their evolution, physiology, and socialization, are designed to heed causes within a particular range between proximate and distal. If I find my kid next to an overturned glass and a puddle of milk and ask him why the milk is spilled, I don’t care about the neurons firing and the muscles contracting. That’s too proximate. I don’t care about humans evolving with poor peripheral vision. That’s too distal. I care about my kid reaching for it and knocking it over. That’s not the only level of causal explanation that is correct, but it’s the level of causal explanation that is most meaningful to me.
For a given effect — a fire, a flood, a dead forest — climate change is almost always too distal a cause to make a visceral impression on us. We’re just not built to pay heed to those 1 percent margins. It’s too abstract. The problem is, wildfires being 1 percent more likely averaged over the whole globe actually means a lot more fires, a lot more damage, loss, and human suffering. Part of managing the Anthropocene is finding ways of making distal causes visceral, giving them a bigger role in our thinking and institutions.
That’s what the “did climate change cause XYZ?” questions are always really about: how proximate a cause climate change is, how immediate its effects are in our lives, how close it is.
There is, of course, a constant temptation among climate hawks to exaggerate how proximate it is, since, all things being equal, proximity = salience. But I don’t think that simply saying “climate change caused the fires” is necessarily false or exaggerated, any more than saying “drought caused the fires” is. The fact that the former strikes many people as suspect while the latter is immediately understood mostly just means that we’re not used to thinking of climate change as a distal cause among others.
That’s why we reach for awkward language like, “fires like this are consonant with what we would expect from climate change.” Not because that’s the way we discuss all distal causes — it’s clearly not — but simply because we’re unaccustomed to counting climate change among those causes. It’s an unfamiliar habit. As it grows more familiar, I suspect we’ll quit having so many of these tedious semantic disputes.
And I’m afraid that, in coming years, it will become all-too familiar.
* * *
Perspective On The Hot and Dry Continental USA For 2012 Based On The Research Of Judy Curry and Of McCabe Et Al 2004
Photo is from June 26, 2012, showing the start of the Flagstaff fire near Boulder, Colorado
I was alerted to an excellent presentation by Judy Curry [h/t to Don Bishop] which provides an informative explanation of the current hot and dry weather in the USA. The presentation is titled
Climate Dimensions of the Water Cycle by Judy Curry
First, there is an insightful statement by Judy where she writes in slide 5
CMIP century scale simulations are designed for assessing sensitivity to greenhouse gases using emissions scenarios. They are not fit for the purpose of inferring decadal scale or regional climate variability, or assessing variations associated with natural forcing and internal variability. Downscaling does not help.
We need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses, etc). Permit creatively constructed scenarios as long as they can’t be falsified as incompatible with background knowledge.
With respect to the current hot and dry weather, the paper referenced by Judy in her Powerpoint talk
Gregory J. McCabe, Michael A. Palecki, and Julio L. Betancourt, 2004: Pacific and Atlantic Ocean influences on multidecadal drought frequency in the United States. PNAS 2004 101 (12) 4136-4141; published ahead of print March 11, 2004, doi:10.1073/pnas.0306738101
has the abstract [highlight added]
More than half (52%) of the spatial and temporal variance in multidecadal drought frequency over the conterminous United States is attributable to the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). An additional 22% of the variance in drought frequency is related to a complex spatial pattern of positive and negative trends in drought occurrence possibly related to increasing Northern Hemisphere temperatures or some other unidirectional climate trend. Recent droughts with broad impacts over the conterminous U.S. (1996, 1999–2002) were associated with North Atlantic warming (positive AMO) and northeastern and tropical Pacific cooling (negative PDO). Much of the long-term predictability of drought frequency may reside in the multidecadal behavior of the North Atlantic Ocean. Should the current positive AMO (warm North Atlantic) conditions persist into the upcoming decade, we suggest two possible drought scenarios that resemble the continental-scale patterns of the 1930s (positive PDO) and 1950s (negative PDO) drought.
They also present the figure below with the title “Impact of AMO, PDO on 20-yr drought frequency (1900-1999)”. The figures correspond to A: Warm PDO, cool AMO; B: Cool PDO, cool AMO; C: Warm PDO, warm AMO and D: Cool PDO, warm AMO
The current Drought Monitor analysis shows a remarkable agreement with D, as shown below
As Judy shows in her talk (slide 8) since 1995 we have been in a warm phase of the AMO and have entered a cool phase of the PDO. This corresponds to D in the above figure. Thus the current drought and heat is not an unprecedented event but part of the variations in atmospheric-ocean circulation features that we have seen in the past. This reinforces what Judy wrote that
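The McCabe et al. phase-combination logic described above amounts to a simple lookup: a (PDO, AMO) phase pair selects one of the four drought-frequency panels. A minimal sketch, using the panel labels A–D exactly as quoted from the figure titles:

```python
# Panel labels follow the McCabe et al. (2004) figure quoted in the post.
drought_regimes = {
    ("warm", "cool"): "A: Warm PDO, cool AMO",
    ("cool", "cool"): "B: Cool PDO, cool AMO",
    ("warm", "warm"): "C: Warm PDO, warm AMO",
    ("cool", "warm"): "D: Cool PDO, warm AMO",
}

def drought_regime(pdo_phase: str, amo_phase: str) -> str:
    """Return the drought-frequency panel matching the (PDO, AMO) phase pair."""
    return drought_regimes[(pdo_phase, amo_phase)]

# Per the post: warm AMO since ~1995, recently a cool PDO — panel D.
print(drought_regime("cool", "warm"))  # → D: Cool PDO, warm AMO
```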
[w]e need a much broader range of scenarios for regions (historical data, simple models, statistical models, paleoclimate analyses
in our assessment of risks to key resources due to climate. Insightful discussions of the importance of these circulation features are also presented, as just a few excellent examples, by Joe D’Aleo and Joe Bastardi on ICECAP, by Bob Tisdale at Bob Tisdale – Climate Observations, and in posts on Anthony Watts’s weblog Watts Up With That.
Hotter summers could be a part of Washington’s future
Are unbroken weeks of sweltering weather becoming the norm rather than the exception?
The answer, however, is a little more complicated.
Call it a qualified yes.
“Trying to wrap an analysis around it in real time is like trying to diagnose a car wreck as the cars are still spinning,” said Deke Arndt, chief of climate monitoring at the National Climatic Data Center in Asheville, N.C. “But we had record heat for the summer season on the Eastern Seaboard in 2010. We had not just record heat, but all-time record heat, in the summer season in 2011. And then you throw that on top of this [mild] winter and spring and the year to date so far, it’s very consistent with what we’d expect in a warming world.”
Nothing dreadfully dramatic is taking place — the seasons are not about to give way to an endless summer.
Heat-trapping greenhouse gases pumped into the atmosphere may be contributing to unusually hot and long heat waves — the kind of events climate scientists have long warned will become more common. Many anticipate a steady trend of ever-hotter average temperatures as human activity generates more and more carbon pollution.
To some, the numbers recorded this month and in recent years fit together to suggest a balmy future.
“We had a warm winter, a cold spring and now a real hot summer,” said Jessica Miller, 21, a visitor from Ohio, as she sat on a bench beneath the trees in Lafayette Square. “I think the overall weather patterns are changing.”
Another visitor, who sat nearby just across from the White House, shared a similar view.
“I think it’s a natural changing of the Earth’s average temperatures,” said Joe Kaufman, a Pennsylvanian who had just walked over from Georgetown.
Arndt said he expects data for the first half of this year will show that it was the warmest six months on record. Experts predict that average temperatures will rise by 3 to 5 degrees by mid-century and by 6 to 10 degrees by the end of the century.
If that worst prediction comes true, 98 degrees will become the new normal at this time of year in Washington 88 years from now.
Will every passing year till then break records?
“Not so much record-breaking every year,” Arndt said. “But we’ll break records on the warm end more often than on the cold end, that’s for sure. As we continue to warm, we will be flirting with warm records much more than with cold records, and that’s what’s played out over much of the last few years.”
If the present is our future, it may be sizzling. The current heat wave has had eight consecutive days of 95-degree weather. The temperature may reach 106 on Saturday, and the first break will come Monday, when a few days of more seasonable highs in the upper 80s are expected.
The hot streak began June 28 and peaked the next day with a 104-degree record-breaker, the hottest temperature ever recorded here in June. That broke a record of 102 set in 1874 and matched in June 2011.