For the first time, national science standards will include guidelines on how to teach climate change to kindergarten through 12th-grade students — but how will teachers incorporate the subject into the curriculum?
We had more on this struggle Wednesday on the NewsHour, as part of our Coping with Climate Change series.
On Thursday, Hari Sreenivasan chatted here with some of those featured in the broadcast piece. The participants included:
Cheryl Manning, who teaches honors earth science and Advanced Placement environmental science at Evergreen High School in Colorado.
A health worker explains methods of contraception during a reproductive health fair held to mark World Population Day in Quezon City, Metro Manila, Philippines, July 11, 2009. REUTERS/John Javellana
By Lisa Anderson
NEW YORK (AlertNet) – Finding a way to put the environmental impact of population and women’s reproductive health more prominently on the climate change agenda is increasingly urgent, experts said in Washington this week.
Suggesting a strong connection between family planning and the environment often risks an explosion in the highly charged political landscape of climate talks, meaning the word “population” is rarely heard, observed speakers on a panel assembled by the Wilson Center’s Environmental Change and Security Program (ECSP).
Kavita Ramdas, executive director of Stanford University’s social entrepreneurship program, calls making the link between population and the environment “the last taboo”.
“This connection … needs to be in a place where we can talk thoughtfully about the fact that yes, more people on this planet – and we’ve just crossed 7 billion – does actually put pressure on the planet. And no, it is not just black women or brown women or Chinese women who create that problem,” she told a session on women’s health and climate adaptation strategies.
“In fact, the issues around consumption in the more developed part of the world are profoundly significant. And when you know that every American baby born consumes 40 times as much as every Indian baby born, clearly there is a need to be able to tie those issues together,” she added.
Daniel Schensul, a technical specialist in the climate change, population and development branch of the United Nations Population Fund (UNFPA), noted that adapting to a shifting climate amounts to building resilience in the face of change. “Women’s ability to control fertility, I think, is at the very centre of this,” he said.
Kathleen Mogelgaard, a consultant on the Wilson Center’s ECSP, described universal access to reproductive health as “a win-win opportunity for climate change adaptation”. Compared with other adaptation strategies, family planning is already in demand among women around the world, although many lack access to it, she said.
And it’s relatively inexpensive, she added, requiring only an additional $3.6 billion a year to fully meet women’s reproductive health needs.
FEAR OF LIMITING RIGHTS
Nonetheless, social and political barriers to including population in climate discussions persist, Stanford University’s Ramdas said. Climate experts avoid talking about population issues out of fear they will be labelled racists or eugenicists, and in an effort “not to muddy the waters” surrounding the already delicate subject of climate change, she said.
“At the same time women’s rights activists also have been reluctant to jump into the argument. You can’t discuss contraception without being drawn into a debate about abortion,” she added.
The ECSP’s Mogelgaard noted that population is rarely included in assessments of climate change vulnerability and adaptation. In her experience, climate specialists have a limited understanding of population dynamics and the scale of coming demographic change – such as populations tripling in countries like Malawi by 2050.
And, if they do grasp the issues, they “assume that doing something about population means limiting people’s rights,” she said. “What this says to me is that there is a real need for raising awareness of the connection between population, climate change and reproductive health.”
More academic evidence supporting the connection would help get population considered as a legitimate issue in the climate community, the experts argued. “There hasn’t been enough work that directly shows us that, when a woman’s need for reproductive health is met, how that impacts on adaptation,” Mogelgaard said.
She knows of only one study – “Linking Population, Fertility and Family Planning with Adaptation to Climate Change: Views from Ethiopia”, issued by Population Action International (PAI) in October 2009 – that “shows that when women have access to reproductive health they say they are better able to cope with climate change”.
Schensul said UNFPA wants to see population and reproductive health on the June agenda of Rio+20, the U.N. Conference on Sustainable Development. To that end, it is working with partners to “establish a nuanced, evidence-based and human rights-based perspective on the operational links between population, reproductive health and climate change”.
If these inter-related factors remain neglected in climate discussions, “silence around this issue will continue to leave us in a space where the planet and her women will continue to have no voice,” Ramdas warned.
A Perfect Moral Storm: The Ethical Tragedy of Climate Change by Stephen Gardiner
Oxford, 512 pp, £22.50, July 2011, ISBN 978 0 19 537944 0
For the benefit of anyone who has spent the past decade or so on a different planet, the most frequently asked questions about climate change on this one are as follows. Is it getting warmer? Yes, surface temperatures have risen by 0.8°C from pre-industrial levels. Are humans causing it? Almost certainly. The gases produced by industrialisation and agriculture are known to have an insulating effect, and their concentration in the earth’s atmosphere has increased in line with rising temperatures, while natural causes of global warming have remained constant. Will it get warmer still? Very probably, though no one can accurately predict when or by how much. The 2007 Intergovernmental Panel on Climate Change (IPCC) Report offers a range of projections within which its best estimates are for a temperature rise of somewhere between 1.8°C and 4°C over the course of the 21st century, depending on the level of greenhouse emissions. Is there anything we can do about it? Potentially, yes. If we were to keep emissions to the low end of that spectrum, global warming might just be kept at 2°C or below, and its impacts minimised.
Climate change sceptics are an assortment of cussed old men, mostly without relevant scientific training, who disagree with one or more of these answers. Their aim is scattershot, but they do have some ammunition. The first decade of the 21st century may have been the hottest on record, but global temperatures did not get significantly hotter in the course of the decade as they had in the 1980s and 1990s. There are several possible explanations for this, one of which is the protective effect of sulphate aerosols, another result of industrialisation (Chinese in this case), which may also explain the flattening of the upward secular trend in temperatures from the 1940s to the 1970s. If that’s so, there is no reason to adjust the trend-line, for greenhouse gases stay in the atmosphere a lot longer, and sulphates mask rather than modify their effect.
That said, even though Chinese industrialisation was well advanced in the 1980s, its influence on the climate was not widely anticipated, and anyone looking back at the 1990 IPCC projections on global warming can see that they overestimate temperature rises in the 2000s by some margin (though not the associated environmental impact). This is also an indication of the difficulty of modelling future changes, and given that the range of the 2007 IPCC projections is sufficiently wide for the highest value in the low-emissions scenario (2.9°C) to be 0.5°C above the lowest in the high-emissions scenario (2.4°C), it’s clear that we are some way from quantifying all the variables involved.
Although they often have to give ground on the science, the sceptics have correctly spotted that there is something odd about the discourse around climate change. Public policy debates are rarely concerned with possibilities so remote in time and uncertain in outcome, and when they are, the policies that result are correspondingly tentative. The peculiarity of climate change is that the seemingly natural relationship of policy to time and certainty is inverted: it is precisely because climate change is so uncertain that we have to consider the possibility that it will bring disaster on a global scale, and it is precisely because its impact is long deferred that we must act decisively now.
Are these demands reasonable? They might be if – as James Hansen, one of the founders of climate science, has claimed – it is ‘our last chance to save humanity’. But is it? Any change in temperature will inevitably benefit some species and harm others, so it probably is the last chance to save those adapted only to specific ecological niches dependent on the existing climate. One pro-climate change website helpfully provides parallel columns of the positive and negative impacts: top of the list on the positive side is an increase in the numbers of chinstrap and gentoo penguins; on the negative, the extinction of the European land leech.
What about the impact on human beings? Here, too, the effects of climate change appear ambiguous. In terms of temperature change itself, the World Health Organisation estimates that climate change since the 1970s is already responsible for 140,000 deaths annually. That sounds terrible, but any temperature variation is going to result in excess deaths from either heat or cold, and it is far from clear that the net effect of an increase in temperature will in itself be harmful – it might even be beneficial. As for rises in sea level, the 2007 IPCC projections range from 18 to 59 centimetres – which is not enough to submerge anywhere other than the lowest-lying areas. And with regard to fresh water, everyone agrees that higher temperatures mean higher levels of precipitation, so there should be more water to go round. The 2007 IPCC report acknowledged that climate change reduces per capita water stress, and one recent study suggests that, with a temperature rise of around 2.4°C, water stress would increase for 1.2 billion people by 2100 but decrease for three billion others.
So what is the problem? There are two: differential impacts and high-end uncertainty. Most of the negative consequences will be felt in the earth’s mid-latitudes, already the poorest parts of the world, where secondary effects such as economic disruption, disease, famine and war will be experienced most acutely. Climate change is therefore likely to have a disproportionate impact on the vulnerable and exacerbate existing inequalities. A mid-range increase in global temperatures, which might be quite pleasant in Canada, is potentially disastrous for the population of Bangladesh or Somalia. Rises in sea level will not affect most populations at all, but even a mid-range increase would make the habitats of between sixty and a hundred million additional people liable to flooding by the end of the century. There are millions of chinstrap penguins already, but the European land leech is exceedingly rare.
However, nobody can be confident that the effects of global warming will end there. The lowest value in the high-emissions scenario might be 2.4°C, but the highest is an alarming 6.4°C, and some scientists consider the IPCC unduly cautious. Positive feedback mechanisms – the earth’s reduced albedo (reflectivity), the transformation of carbon sinks into carbon sources, or the release of methane from thawing permafrost – could push temperatures towards the top of the range and so trigger irreversible non-linear changes such as the melting of the polar ice-sheets and the disruption of thermohaline circulation in the world’s oceans. Were all that to happen, much of the planet would be uninhabitable.
What is the rational response? The possibility that climate variation is not anthropogenic, or that it will not get much worse, or that some as yet unknown technological development will mitigate its effects, cannot be wholly discounted. All are unlikely, but each has a probability well above zero. How do these combined independent probabilities compare with the probability that global political initiatives in the next, say, twenty years will make a decisive positive difference to the outcome for future generations? That depends on several conditions being met: that climate change is anthropogenic (almost certain); that it is going to get worse (very probable); that decisive and timely global political action takes place (rather doubtful); that it is sufficiently sustained to be effective (unlikely, if the past twenty years are anything to go by).
Even someone who both accepted anthropogenic global warming and believed that it was possible to do something about it might look at the odds and think that fatalism was the most appropriate response. As long ago as the 1990s, Al Gore admitted that ‘the minimum that is scientifically necessary’ to combat global warming ‘far exceeds the maximum that is politically feasible’, and many now seem to agree. Aside from the spike created by the Copenhagen summit in 2009, newspaper coverage of climate change has been dropping since 2007. Perhaps we should just acknowledge the problem, try not to exacerbate it too much and hope for the best. That, after all, is what most people have decided to do about the nightmare of the previous generation, nuclear weapons, and there is no reliable means of quantifying whether nuclear war is more or less likely than severe climate change, or whether its effects would be more or less destructive.
The real question is whether such fatalism is ethically defensible. The moral argument for preventing further climate change is easily stated. It is not just a matter of protecting the vulnerable from harm, but of taking responsibility for a harm that we in the industrialised North have both caused and benefited from. However, the worst effects of climate change are likely to be experienced by beings from other times, places or species, and as Stephen Gardiner points out, this allows us to rationalise our obligations to suit our inclinations, rather in the way that, in Sense and Sensibility, John Dashwood and his wife Fanny gradually persuade themselves that the large sum of money John had promised to support his stepmother and half-sisters really ought, in the best interests of everyone involved, to be reduced to nothing at all.
Global surveys already show that people who live in countries with high per capita emissions are less inclined to believe that global warming is a serious problem than those who live in hotter, more vulnerable countries with low emissions. But in this case it is not necessarily just a matter of self-interest prevailing over honesty and virtue. Climate change creates what Gardiner calls ‘a perfect moral storm’, within which it is difficult to keep one’s bearings. The key elements of this storm, which he enumerates with admirable – if exhausting – clarity, are problems of agency, the temptation to intergenerational buck-passing, and the inapplicability of existing political theories.
It is no secret that the 1997 Kyoto Protocol, designed to bring the emissions of industrialised countries below their 1990 levels, has been unable to achieve its targets (or only with unexpected help from economic recessions), or that the Copenhagen summit of 2009 failed to reach any meaningful agreement at all. Such failures, according to Gardiner, reflect a fragmentation of agency: while it might be collectively rational for nations to co-operate on climate change, it is individually rational for them not to. Even greater difficulties are presented by what Gardiner calls the ‘pure intergenerational problem’. The current generation has nothing to gain from reducing emissions and every subsequent one has more at stake than its predecessor. In game-theoretical terms, this means that the current generation has no incentive to co-operate even if every other generation were willing to do so, and that the same will be true of the next generation if the present one has failed to co-operate and passed the buck instead. If successive generations were distinct in this way, it would never be rational to do anything about global warming. In practice, of course, they are not distinct, but even if future generations overlap with ours, they can do little for us or to us as far as climate change is concerned, so our relationship with them is effectively non-reciprocal.
How does the difficulty of achieving co-operation between nations relate to that of achieving co-operation across generations? Gardiner opposes the two, arguing that taking nation-states to represent the interests of their citizens in perpetuity effectively excludes the intergenerational aspect of the climate change problem. However, there are good reasons for thinking that the reverse is true. People routinely make sacrifices for their children and grandchildren, and both individuals and governments are far more likely to invest their resources for the benefit of people who are temporally remote but genetically or culturally proximate than they are for their spatially distant coevals. In these cases, the possibility of future-recognition (transmitted forward through family tradition or cultural memory) trumps that of future-reciprocity. And it is the nation, conceived as a community bound together by cross-generational ties that stretch into the future, that functions as the primary vehicle of such recognition.
Paradoxically, therefore, the intergenerational politics of climate change brings us back to the political form seemingly least able to cope with it: the nation-state. For while the fragmentation of space appears to call for supranational institutions to monitor and enforce agreement, fragmentation in time demands national institutions capable of identifying with and aggregating the interests of future generations. Nation-states could act as the self-appointed representatives of future generations of their own citizens, and then (alongside various NGOs like the WWF) lobby some supranational body on their behalf. In this scenario, what climate change most conspicuously undermines is not the nation-state but democracy, for it requires supranational institutions at a time when there is no supranational democracy, and allows that at a national level the interests of future generations might take precedence over those of the current one. Perhaps, as James Lovelock has argued, climate change means that ‘it may be necessary to put democracy on hold for a while.’
Gardiner acknowledges that it is doubtful whether democratic political institutions, with their short time horizons, have the capacity to deal with deferred climate impacts, but it does not occur to him that the ‘tyranny of the contemporary’ of which he complains might be coextensive with democracy itself. In the aftermath of the French Revolution, it was Edmund Burke who argued that society ‘is a partnership not only between those who are living, but between those who are living, those who are dead and those who are to be born’, and Tom Paine who, ‘contending for the rights of the living’, responded that ‘every generation is, and must be, competent to all the purposes which its occasions require.’ If the absolute rights of the living are a form of tyranny, then their freedom to choose their own government must be called into question as well.
That might sound bizarre, but although the dead and the unborn cannot make choices now, their interests could be registered through a form of what Burke called ‘virtual representation’, in which ‘there is a communion of interests, and a sympathy in feelings and desires between those who act in the name of any description of people, and the people in whose name they act, though the trustees are not actually chosen by them.’ The current generation may of necessity furnish the representatives, but it does not follow that it is in its entirety an appropriate virtual representative of other generations, for it is collectively liable to prefer its own interests to theirs. Other generations will be more adequately represented by that minority best equipped to act for them.
One version of this arrangement would be the Burkean one in which power resides with a natural aristocracy able to mediate between past and future by conserving what is best and passing it on. Its members are conscious of what is due to posterity precisely because they are mindful of what they have received from their ancestors, and do not think it ‘among their rights to cut off the entail or commit waste on the inheritance … hazarding to leave to those who come after them a ruin instead of a habitation’. Without this, according to Burke, ‘the whole chain and continuity of the commonwealth would be broken. No one generation could link with the other.’
As Paine observed, this version of inter-generational politics has a strong bias towards the past, allowing people to govern from the grave and bind future generations for ever. An alternative weighting would be closer to the Leninist idea of a vanguard. Articulated in opposition to those who wanted to fight only ‘for themselves and for their children, and not for some kind of socialism for some future generation’, Lenin’s account of the party as the vanguard of the proletariat was founded on the idea that it embodied their objective class interests in a way they could not yet do themselves. In this manner, as Georg Lukács puts it, ‘the party, on the basis of its knowledge of society in its totality, represents the interests of the whole proletariat (and in doing so mediates the interests of all the oppressed – the future of mankind).’
The virtual representatives of other generations will inevitably have to press their claims against those of the living. In respect of climate change, the way in which they do so will depend largely on the weighting given to past emissions, on the one hand, and future prosperity, on the other. Should the magnitude of past emissions (for which the United States and the EU nations are mostly responsible) have a positive or negative impact on the extent of emissions in the future? And should we discount the costs and benefits that accrue to future generations on the basis that economic growth will probably make them richer than we are? A Burkean would argue that past emissions are irrelevant, and that it is reasonable to discount the future to preserve the comparability and continuity across the generations; a Leninist might say that past emissions extracted value from the lives of future generations, and that any future discounting should be at a zero or negative rate. The Burkean move is liable to have the effect of entrenching the stranglehold of the past over the future: the Leninist creates a dictatorship of the future over the present.
Gardiner himself argues that past emissions do matter, and (it would appear, though he is very cautious here) that the future should not be discounted. But he gives little thought to the far-reaching political implications of these conclusions. Insofar as we move beyond the tyranny of the contemporary, we invite other forms of dictatorship, and the hard-won battle of democracy to exclude its ideological rivals by establishing the present as the temporal locus of sovereignty is under threat. Rather than being able to take its destiny in its own hands, as Paine advocated, the current generation is in danger of becoming the squeezed middle – a victim of the careless excess of the past, yet still obliged to save all its resources for the needs of those to come.
Should this shift in the temporality of political thinking be resisted, or is the need for it an indication that the political forms fostered by industrialisation have proved unsuited to dealing with its consequences, and are now obsolete? With its unavoidable reliance on virtual representation, and its insistence on appropriate deliberation about technical matters beyond the grasp of the uninformed, climate change politics suggests that technocratic government, the contemporary version of Burke’s natural elite, is the only appropriate solution. And yet, with its emphasis on the ‘future of mankind’ and its deployment of backcasting (working backwards from a desired future state to determine what measures are necessary to achieve it), climate change politics has, for all its apocalyptic rhetoric, a distinctively utopian form.
Is this because the emergence of concern about global warming coincided with the failure of Communism? As some climate change sceptics have noted, there was something suspicious about the way that Communism departed stage right moments before climate change entered stage left as the new nemesis of consumer capitalism. Perhaps we should think of climate change as an updated version of the chess-playing Turkish puppet that Walter Benjamin likened to historical materialism operated by the hidden hand of theology, save that historical materialism has now become the wizened hunchback that controls the puppet and has to keep out of sight.
That would be too simplistic. The recognition that actions are liable to have unintended negative consequences is a constant in human affairs, whether mediated through the discourse of theology, economics or environmental science. Such negative consequences provide the phantom opponents against whom we strive and from whom we try to learn. Counter-hegemonic movements invariably seek to harness the latent power of unintended negative consequences to challenge the status quo. But they are not alone in this. All morality is in part an effort to mobilise sentiment to pre-empt negative outcomes, and climate science is just the latest means through which our actions are amplified back to us to create a moral connection with their consequences.
One indication of the distinctively moral nature of the discourse around climate change is the concern Gardiner expresses about treating it as a purely physical problem susceptible to a technical resolution. Those sulphate aerosols, which may be responsible for the stabilisation of global temperatures in the 21st century, could in theory be pumped into the atmosphere indefinitely for the sole purpose of reducing global warming. Any state (or company or individual for that matter) with the requisite resources could do it unilaterally, thus changing the earth’s atmosphere for everyone else. Given that sulphates are themselves a pollutant, this would be a less desirable option than controlling greenhouse emissions, but in the absence of effective action on that front, it might well be a lesser evil than uncontrolled climate change.
Gardiner devotes an entire chapter to warning against any such solution. Lesser evils, he suggests, may still tarnish those who commit them and blight their lives and those of others, rather as Sophie’s life is destroyed by the sacrifice of one child in Sophie’s Choice. The analogy is absurd but revealing, for what Gardiner calls ‘marring evils’ are meta-ethical evils that arise not from the action itself, but from the resulting negative moral assessment of the agent. On this view, the moral failure threatened by sulphate injection, or other forms of geo-engineering, arises not so much from its result, as from the failure of the action as a moral response.
What this reveals is the extent to which climate change is now constructed not as a scientific problem that generates unexpected moral dilemmas, but as an ethical problem that necessarily requires moral solutions. The sceptics are understandably wary of this, and, as Björn Lomborg has argued, we are not generally as moral as climate change ethics assumes, for if we were we might not make climate change our top priority. If we were concerned about polar bears we would start by not shooting them, rather than worrying about how much ice they had left to stand on, and if we were really worried about the global poor, we could help them now rather than helping their descendants at the end of the century, who will probably be a lot better off anyway.
These are in many respects valid arguments, but they miss the point that were it not for climate change, we would be giving even less thought to polar bears, or to the global poor, and would see little connection between our actions and their fate. As Peter Unger’s Living High and Letting Die showed, our customary moral intuitions barely extend to poor foreigners, let alone to their descendants, or to Arctic fauna. It is thanks to climate change that an entire body of political thought has emerged which positions our everyday actions in direct relation to their most distant consequences.
Adam Smith once noted that we are less troubled by the prospect of a hundred million people dying as a result of an earthquake in some distant location than of losing our little finger, but would nevertheless be horrified by the idea we might allow them to die in order to save it. Climate change effectively transforms the former scenario into the latter, and so places unprecedented demands on our moral imagination. Almost every little thing we do contributes to our carbon footprint, which increases greenhouse gases, which could in turn ultimately threaten hundreds of millions of lives in some remote time and place – the uncertainty only adding to the sublime awfulness of our responsibilities.
Contrary to Gardiner’s concerns about moral corruption, climate change does not tempt us to be less moral than we might otherwise be; it invites us to be more moral than we could ever have imagined. Unlike the Dashwoods, we never knew how many relatives we had. Climate ethics is not morality applied but morality discovered, a new chapter in the moral education of mankind. It may tell us things we do not wish to know (about democracy, perhaps), but the future development of humanity may depend on what, if anything, it can teach us.
New billboards designed by the Heartland Institute compare climate scientists to the Unabomber, and other mass murderers. Climate scientists and other writers respond.
This billboard displayed in the Chicago area compared climate scientists to Ted Kaczynski, an anti-industrial mail bomber whose explosives murdered three and injured 23 more over two decades.
Image taken from heartland.org
Update, 5:23 p.m. Eastern Time: In a statement by Heartland president Joseph Bast, the organization announced that it will be taking down the Unabomber billboard after only 24 hours. Bast wrote that the billboard was an “experiment” meant to “turn the tables” on climate-change advocates.
“We know that our billboard angered and disappointed many of Heartland’s friends and supporters, but we hope they understand what we were trying to do with this experiment,” Bast wrote. “We do not apologize for running the ad, and we will continue to experiment with ways to communicate the ‘realist’ message on the climate.”
The “experiment” resulted in “uncivil name-calling and disparagement” from climate-change scientists and activists, Bast complained.
The billboards, paid for by the Heartland Institute, are designed to promote the organization’s International Congress on Climate Change in Chicago later this month. The Heartland Institute describes itself as a nonprofit devoted to promoting free-market solutions for social and economic problems.
Climate scientists are already reacting to the billboards, calling them “truly heinous” and the work of individuals who don’t understand real global-warming science. In addition, they say the billboards will only drive global-warming skeptics and those who accept the science further apart.
The first billboard, which went up along the Eisenhower Expressway in Maywood, Ill., today (May 4), according to a Heartland spokesperson, features a mug shot of Kaczynski with the words “I still believe in Global Warming. Do you?” and a Web address for the Heartland Institute. In a press release, the organization justified this juxtaposition by calling the support for human-caused global warming “nutty.”
“The point is that believing in global warming is not ‘mainstream,’ smart, or sophisticated,” the organization wrote. “In fact, it is just the opposite of those things.” [The Reality of Climate Change: 10 Myths Busted]
Climate scientists and mass murderers
Heartland further struck out at Peter Gleick, a prominent climate scientist who leaked internal Heartland documents online in February, revealing the Institute’s fundraising efforts and plans to spread doubt about climate change. Heartland claims that one of the documents was faked, referring to the occurrence as “fakegate” in their release.
Gleick says the documents were anonymously mailed to him and he sought the other documents to verify the information. The information in the disputed document is backed up in the other documents, the veracity of which Heartland has not disputed. Individuals named in these documents have confirmed that they were working with Heartland on the plans.
Nevertheless, Heartland has sought to portray itself as on the defensive. In its most recent statement, the organization writes that the leaked memo scandal “revealed that the leaders of the global warming movement are willing to break the law and the rules of ethics to shut down scientific debate and implement their left-wing agendas.”
“The people who still believe in man-made global warming are mostly on the radical fringe of society,” the statement reads. “This is why the most prominent advocates of global warming aren’t scientists. They are murderers, tyrants, and madmen.”
The target of their new campaign, Heartland spokesperson Jim Lakely said, is “people who aren’t otherwise following the global-warming debate.”
“Heartland is not usually in the provocation business, which is a common tactic of the global-warming alarmists,” Lakely told LiveScience. “The reaction to this billboard has been interesting.”
Scientists respond
Unsurprisingly, some of the scientists who research climate change took umbrage at this portrayal.
“This is only the latest in a long history of truly heinous actions by the Heartland Institute,” said Michael Mann, the Pennsylvania State University climate scientist who originally published the famous “hockey stick” graph showing a rise in average global temperatures after the industrial revolution.
“The only thing I can think of here is that they are acting out of true desperation,” Mann told LiveScience.
News of — and jokes about — the billboards quickly spread around the social-networking site Twitter.
“#Heartland Institute believes in gravity. SO DID HITLER,” wrote Kevin Borgia, the director of the Illinois Wind Energy Coalition.
“Ted Kaczynsk[i] believes the world is round, and the Heartland Institute tries to persuade people that the world is flat,” tweeted Ken Caldeira, an environmental scientist at the Carnegie Institution in Stanford, Calif.
Jason Samenow, a meteorologist at The Washington Post, gave his response in a blog post on the newspaper’s website.
“Their approach won’t help different perspectives find common ground and work towards the most appropriate path forward,” Samenow wrote. “But maybe that’s what Heartland, in reality, is fighting against …”
Editor’s Note: The article was updated at 2:11 p.m. to correct Jason Samenow’s professional affiliation.
May 3, 2012 – Billboards in Chicago paid for by The Heartland Institute point out that some of the world’s most notorious criminals say they “still believe in global warming” – and ask viewers if they do, too.
Heartland’s first digital billboard – along the inbound Eisenhower Expressway (I-290) in Maywood – is the latest effort by the free-market think tank to inform the public about what it views as the collapsing scientific, political, and public support for the theory of man-made global warming. It is also reminding viewers of the questionable ethics of global warming’s most prominent proponents.
“The most prominent advocates of global warming aren’t scientists,” said Heartland’s president, Joseph Bast. “They are Charles Manson, a mass murderer; Fidel Castro, a tyrant; and Ted Kaczynski, the Unabomber. Global warming alarmists include Osama bin Laden and James J. Lee (who took hostages inside the headquarters of the Discovery Channel in 2010).”
Bast added, “The leaders of the global warming movement have one thing in common: They are willing to use force and fraud to advance their fringe theory.” For more about the billboards and why Heartland says people should not still believe in global warming, click here.
Background
The Heartland Institute is widely recognized as a leading source of science and economics questioning claims that man-made global warming is a crisis. It has published two extensive volumes citing thousands of peer-reviewed studies: Climate Change Reconsidered 2009 (880 pages) and Climate Change Reconsidered: 2011 Interim Report (416 pages). Both reports are available online at www.nipccreport.org and www.globalwarmingheartland.org.
The Heartland Institute will host its Seventh International Conference on Climate Change from Monday, May 21 through Wednesday, May 23, 2012 at the Hilton Chicago Hotel, starting on the final day of the historic NATO Summit. The conference will feature more than 50 scientists and economists lecturing on their latest findings, as well as political leaders and dignitaries from around the world.
Vaclav Klaus, president of the Czech Republic, will deliver the first dinner address on May 21. More information about the conference — including registration information for the public and the media — can be found at climateconference.heartland.org. Videos from past conferences and describing the upcoming conference are also available on that site.
For more information, contact Director of Communications Jim Lakely at jlakely@heartland.org or 312/377-4000.
Do You Still Believe in Global Warming?
May 3, 2012 – Billboards in Chicago paid for by The Heartland Institute point out that some of the world’s most notorious criminals say they “still believe in global warming” – and ask viewers if they do, too. The first digital billboard – along the inbound Eisenhower Expressway (I-290) in Maywood – appeared today.
The Heartland Institute is widely recognized as a leading source of science and economics questioning claims that man-made global warming is a crisis. The rest of this page provides answers to some of the questions you might have about these billboards. For more information, contact Director of Communications Jim Lakely at jlakely@heartland.org or 312/377-4000.
1. Who appears on the billboards?
The billboard series features Ted Kaczynski, the infamous Unabomber; Charles Manson, a mass murderer; and Fidel Castro, a tyrant. Other global warming alarmists who may appear on future billboards include Osama bin Laden and James J. Lee (who took hostages inside the headquarters of the Discovery Channel in 2010).
These rogues and villains were chosen because they made public statements about how man-made global warming is a crisis and how mankind must take immediate and drastic actions to stop it.
2. Why did Heartland choose to feature these people on its billboards?
Because what these murderers and madmen have said differs very little from what spokespersons for the United Nations, journalists for the “mainstream” media, and liberal politicians say about global warming. They are so similar, in fact, that a Web site has a quiz that asks if you can tell the difference between what Ted Kaczynski, the Unabomber, wrote in his “Manifesto” and what Al Gore wrote in his book, Earth in the Balance.
The point is that believing in global warming is not “mainstream,” smart, or sophisticated. In fact, it is just the opposite of those things. Still believing in man-made global warming – after all the scientific discoveries and revelations that point against this theory – is more than a little nutty. In fact, some really crazy people use it to justify immoral and frightening behavior.
Of course, not all global warming alarmists are murderers or tyrants. But the Climategate scandal and the more recent Fakegate scandal revealed that the leaders of the global warming movement are willing to break the law and the rules of ethics to shut down scientific debate and implement their left-wing agendas.
Scientific, political, and public support for the theory of man-made global warming is collapsing. Most scientists and 60 percent of the general public (in the U.S.) do not believe man-made global warming is a problem. (Keep reading for proof of these statements.) The people who still believe in man-made global warming are mostly on the radical fringe of society. This is why the most prominent advocates of global warming aren’t scientists. They are murderers, tyrants, and madmen.
3. Why shouldn’t I still believe in global warming?
Because the best available science says about two-thirds of the warming in the 1990s was due to natural causes, not human activities; the warming trend of the second half of the twentieth century already has stopped and forecasts of future warming are unreliable; and the benefits of a moderate warming are likely to outweigh the costs. Global warming, in other words, is not a crisis. For a plain English introductory essay with lots of links to research that proves these points, see “Global Warming: Not a Crisis.”
Most people who still believe in global warming do so because they trust the United Nations, the so-called mainstream media, and leading political figures to be telling them the truth about a complicated scientific issue. That trust has been betrayed.
The government agency created by the United Nations to find a link between human activities and global warming did exactly what it was created and paid to do! By ignoring natural causes of climate variation, it claims to have found evidence of a human impact and an urgent need for the UN to be given more money and more power to solve the problem. See Robert Carter’s book, Climate: The Counter Consensus, for an excellent recent commentary on just how unreliable the IPCC has become.
The mainstream media are “in the tank” with environmental activists and big-government advocates, to the point that they deliberately and expressly censor dissenting views on climate. Even distinguished scientists who dissent from the global warming dogma, such as MIT’s Richard Lindzen and the University of Virginia’s S. Fred Singer, are regularly savaged and defamed by reporters for some of the largest-circulation newspapers in the country. See the Media Research Center’s 2008 report, “Global Warming Censored,” for a good account of media bias on this topic.
And nobody should believe politicians who say they want to raise taxes, give subsidies to their buddies, or regulate growing industries in the name of “global warming.” Politicians aren’t scientists, and they aren’t motivated by the search for scientific truth. Mostly, they want to raise taxes, redistribute wealth, and regulate industry because doing so increases their power and chances for reelection. Two good recent books that make this point are Climate Coup by Patrick Michaels and Eco-Tyranny by Brian Sussman.
4. But isn’t it true that 98 percent of climate scientists believe in global warming?
No, this is just a myth that gets repeated over and over by global warming advocates. The alleged sources of this claim are two studies. One is a survey that didn’t ask if global warming is bad or even how much of past warming was man-made. That survey also excluded all but 79 (not a typo!) of the thousands of people who responded to it in order to arrive at the 98 percent figure.
The other study reported the number of times global warming alarmists and realists appeared in academic journals, and found that a small group of alarmists appeared hundreds of times. That doesn’t mean they are more likely to be right. In fact, there are many reasons why realists appear to be published less often than alarmists.
A detailed analysis of these two studies appears in this essay: “The Myth of the 98%.”
More broadly, the claim that there is a “scientific consensus” that global warming is both man-made and a serious problem is untrue. Sources used to document this claim invariably fail to do so, while more reliable surveys and examinations of the literature reveal that most scientists do not believe in the key scientific claims upon which global warming alarmism rests. For example, most scientists do not believe computer models are sufficiently reliable to make long-term forecasts of climate temperatures.
That goes to the very heart of the alarmists’ predictions and worries. For a detailed analysis of the claim of a “scientific consensus” on global warming, see this essay: “You Call This Consensus?”
5. Are you saying anyone who believes in global warming is a mass murderer, tyrant, or terrorist?
Of course not. But we are saying that the ethics of many advocates of global warming are very suspect. Consider two recent scandals that exposed the way they think:
Climategate was the leak of emails from the Climatic Research Unit at the University of East Anglia in England in 2010 and 2011. The emails revealed a conspiracy to suppress debate, rig the peer review process to keep scientists skeptical of catastrophic man-caused global warming out of the leading academic journals, hide data, fudge research findings, and dodge Freedom of Information Act requests.
Fakegate was the theft in early 2012 of confidential corporate documents from The Heartland Institute by Dr. Peter Gleick, a leading climate scientist and president of the Pacific Institute for Studies in Development, Environment, and Security in Oakland, California. Gleick admitted on February 20 to using a false identity to steal the documents and then disseminating them – along with a fake memo purporting to be Heartland’s “climate strategy” – to sympathetic bloggers and journalists.
Megan McArdle wrote this about Fakegate in The Atlantic: “Gleick has done enormous damage to his cause and his own reputation, and it’s no good to say that people shouldn’t be focusing on it. If his judgement is this bad, how is his judgement on matters of science? For that matter, what about the judgement of all the others in the movement who apparently see nothing worth dwelling on in his actions?”
Robert Tracinski wrote this at Real Clear Politics: “The global warming alarmists are losing the argument, and the latest scandal–James Delingpole calls it Fakegate–shows just how desperate they have become.”
Poor judgement … believing the ends justify the means … desperation. Now do you see why we really shouldn’t be surprised to learn that Charles Manson, Fidel Castro, Ted Kaczynski, and other famous criminals believe in global warming?
6. Why should I believe The Heartland Institute?
We don’t think you should “believe” anyone. Do your own research. Come to your own conclusions. But since you ask …
The Heartland Institute has been conducting research into the real science and economics of climate change for more than 15 years. We have assembled hundreds of scientists to share their knowledge, participate in debates, and conduct peer review of our publications. Importantly, nobody here is paid to believe in global warming.
Heartland is a 28-year-old national nonprofit organization with offices in Chicago, Illinois and Washington, DC. Its mission is to discover, develop, and promote free-market solutions to social and economic problems. It is supported by approximately 1,800 individuals, foundations, and corporations. No corporation gives more than 5 percent of its annual budget.
Heartland has distributed millions of copies of books, booklets, videos, and reprints that examine the causes and consequences of climate change. It published two hefty volumes citing thousands of peer-reviewed studies: Climate Change Reconsidered 2009 (880 pages) and Climate Change Reconsidered: 2011 Interim Report (416 pages). Both reports are available online at NIPCCreport.org and GlobalWarmingHeartland.org.
Heartland has hosted six International Conferences on Climate Change attracting nearly 3,000 people. Many of the world’s leading scientists, economists, and political leaders have spoken at these conferences. Video of the presentations made at those events can be found online.
So if you are looking for objective research on climate change, we are a good place to start.
7. Should I attend the ICCC-7?
The Heartland Institute will host its Seventh International Conference on Climate Change from Monday, May 21 through Wednesday, May 23, 2012 at the Hilton Chicago Hotel, starting on the final day of the historic NATO Summit. The conference will feature more than 50 scientists and economists lecturing on their latest findings, as well as political leaders and dignitaries from around the world.
Vaclav Klaus, president of the Czech Republic, will deliver the first dinner address on Monday, May 21. More information about the conference – including registration information for the public and the media – can be found at climateconference.heartland.org. Videos from past conferences and describing the upcoming conference are also available on that site.
This year’s conference theme is “Real Science, Real Choices.” We will feature approximately 50 scientists and policy experts speaking at plenary sessions and on three tracks of concurrent panel sessions exploring what real climate science is telling us about the causes and consequences of climate change, and the real consequences of choices being made based on the current perceptions of the state of climate science.
Past conferences have taken place in New York City, Chicago, Washington DC, and Sydney, Australia and have attracted nearly 3,000 participants from 20 countries. The proceedings have been covered by ABC, CBS, NBC, Fox News, the BBC, the New York Times, the Washington Post, Le Monde, and most other leading media outlets.
The ostensibly large number of recent extreme weather events has triggered intensive discussions, both in- and outside the scientific community, on whether they are related to global warming. Here, we review the evidence and argue that for some types of extreme — notably heatwaves, but also precipitation extremes — there is now strong evidence linking specific events or an increase in their numbers to the human influence on climate. For other types of extreme, such as storms, the available evidence is less conclusive, but based on observed trends and basic physical concepts it is nevertheless plausible to expect an increase.
I sent the article around to some researchers working on these questions. Here are their reactions, along with another valuable assessment posted by Michael Tobis at Planet 3.0:
– Exaggerated language, and many unsubstantiated assertions. For instance, in what manner did the last decade experience an “unprecedented” number of extreme weather events? Note that the increase in heat waves was largely balanced by a decrease in cold waves.
– Overly simplistic view of the relation between damage, human suffering, and the extremes. Much more balanced arguments can be found in R. Pielke Jr.’s work that consider changes in society, communities, coastal development, etc. Also, a more useful perspective is found in the recent EOS article by Mike Wallace, titled “Weather and Climate Extreme Events: Teachable Moments.”
– Very few of the [cases of extreme weather listed in the paper] have undergone a scientific investigation of contributing factors, let alone human impacts. I believe that a read of the Lewis and Clark journals would reveal an impressive list of extreme weather also…. so what is one to make of this list for the 2001-2011 period provided in this Perspective by Coumou and Rahmstorf. The fact is that extremes happen, have happened, and will continue to happen. For some, their character, preferred phase, and intensity may be changing (aside from temperature extremes, the detection and attribution evidence to date is weak).
– I suspect that if one engaged in grand mitigation today (as useful as that would be for many other purposes), many of the extremes listed in [the paper] would happen anyway, and will likely happen again.
– The piece lacks all perspective on the human and technological elements contributing to greater observational capacity to sense extremes (radar, satellite), and it does not consider the reality of a heightened public interest in extremes, given recent public discourse.
– The matter of attribution, as raised in the second-to-last paragraph, is a much broader science than merely determining the change in probability due to greenhouse-gas forcing, which is an inherently difficult and uncertain undertaking. The piece ignores the broader context in which all manner of contributing factors is assessed to understand the magnitude of events and their temporal and regional specificity (e.g., why did the heat wave happen over Texas rather than Washington, why did it occur in 2011 and not 2009 or next year, and why did it break the previous records by a factor of 2). After all, the irony of extreme events is that the larger the magnitude, the smaller the fractional contribution by human climate change.
– Consistent with the policy-direct tone of this piece, hyperbole is used throughout. The piece often conflates apparent “effects” of apparent changes in extremes in the last decade with causes not expected to arise until the latter part of the 21st century.
My reactions to the article are very much along the same lines as Marty Hoerling’s. By exaggerating the influence of climate change on today’s weather and climate-related extreme events, a part of our community is painting itself into a rhetorical corner.
My opinion piece, “Weather and Climate-Related Extreme Events: Teachable Moments,” to which Hoerling refers, serves as a counterpoint to Coumou and Rahmstorf’s article. Before submitting it to Eos, as an experiment, I submitted it to Nature: Climate Change, where their article was published. I cannot say that I was surprised when the editors informed me that they would not be sending it out for review because “we are not persuaded that your article represents a sufficiently substantial contribution to the ‘climate change debate’ [my quotation marks] to justify publication in the journal”. Perhaps to ease the pain of rejection, the editor added, “more Commentaries are actively commissioned and […] we only rarely publish unsolicited contributions to the section”.
Although it may sound a bit like sour grapes, here’s the way that I’ve rationalized Nature’s editorial decision. I’ve become convinced that many of the editors of the high impact journals are inclined to cast opinion pieces as salvos in the ongoing war between climate change believers and skeptics. Articles like mine that take issue with the way in which the war is being waged are not particularly welcome. By soliciting opinion pieces and by selecting, from among the growing list of contributed articles, the very few that will be sent out for peer review, the editors promote their vision of what constitutes “groundbreaking” and “policy relevant” science. What if it is not the right vision?
By granting the editors of Nature and other high impact journals ever increasing power in deciding which of our articles should be singled out for emphasis in the news media, we risk losing control of the peer review process upon which our public image depends. The way to maintain control is to make a point of sending our most newsworthy scientific articles and opinion pieces to the journals of our own professional societies, in which the peer review process is editor-facilitated, rather than editor-directed. Dot.Earth could render our community a valuable service by ensuring that newsworthy articles published in our journals receive the public attention that they deserve.
Kerry Emanuel, longtime climate scientist at the Massachusetts Institute of Technology (focused on the impact of greenhouse-driven heating on hurricanes):
I read the piece differently from the way Mike and Martin read it. It was published as a “perspective” and I did not read it as a scientific paper or letter. It tries to draw attention to the point that weather extremes a) affect society more so than means, and b) require a different statistical approach to detect trends. This is certainly old hat to climate scientists, but there is so much literature on the mean temperature response that I believe there is room to draw attention to the problem of extremes. Thus I think the perspective piece is useful. The one criticism I would level, echoing to some extent what Martin and Mike have said, is that it is a bit heavy on weather anecdotes (this record broken here; that record there), which draws attention away from the central issue of the statistics of extremes.
It is vital that as a community we focus more attention on detecting changes in the tails of the distributions of weather events. To the extent that this perspective piece may draw scientists from other disciplines into this endeavor, it will have proven useful.
One last point: I completely agree with Mike that you could do science a service by getting journalists to pay more attention to our own professional journals and not focus so exclusively on the high profile journals, which often tend toward the sensational at the expense of solid advances.
Michael Tobis, a scientist, programmer and climate blogger from the University of Texas, posted a nice essay on the Coumou-Rahmstorf article and related issues. The piece, “Disequilibrium is Not Your Friend,” examines the consequences of disturbing a system in a state of complex equilibrium, whether it is an intricate Alexander Calder mobile sculpture or the climate. Here’s an excerpt:
It’s a general principle of complex equilibria that the more they are disturbed, the more complex the processes involved in restoring their equilibrium. The mobile sculpture is not unusual in this regard….
What makes the sculpture less predictable under forcing? Both the size and duration of the impact matter. If you moved the piece ten yards very gently, its behavior might be nothing out of the ordinary, while if you moved it an inch suddenly, a lot of complexity would emerge. (If you moved the piece ten yards suddenly, you would expect permanent alterations, with a whole new set of modes created and many of the old ones destroyed. Let’s hope we do not take the analogous experiment that far.)
While this in no way constitutes a mathematical proof for any given system, the underlying behavior is common and intuitively understandable. If a complex system acts otherwise, it would be something extraordinary that deserves explanation. As applied to the climate system, consider it a plausibility argument: the more rapidly and extensively the system is disturbed, the more we would expect that unexpected behaviors will emerge, and the further from expectations they will be. [Please read the rest.]
April 11, 9:47 a.m. | Updated Stefan Rahmstorf offers his response here:
There is a broad spectrum of views on extreme events in the community – you’ve sampled some of those. It is precisely this range of opinions which made us think it worthwhile to take a good dispassionate look at the evidence and stimulate some discussion. We noticed this range also in the reviews of our Perspective. One reviewer asked us to make stronger statements on the link between climate change and extremes, another just asked the opposite and the third one found we got it about right. I think overall we struck a good balance, and I’ve never gotten such an overwhelming positive feedback from colleagues after publishing a paper – lots of emails still coming in. Looks like we struck a chord.
Hoerling’s claim that we make “many unsubstantiated assertions” is itself one. First he claims we said that the last decade experienced an unprecedented number of extreme weather events – which we do not say anywhere in our paper. And then he claims that “the increase in heat waves was largely balanced by a decrease in cold waves,” which is a popular climate sceptics’ argument but demonstrably false. Already the IPCC TAR in 2001 illustrated that this is not the case, see the famous TAR graph and compare the size of the pink/red and blue areas in panels (a) or (c). We explained this again in our 2011 PNAS paper, and we demonstrate it again in the present Perspective: In a stationary climate you’d get approximately the same number of hot and cold records. We cite the global data analysis of Benestad (2004) in Fig. 2 which shows that record heat waves already have increased more than threefold as compared to a stationary climate. Now even if record cold waves would have declined to zero in number (which they have not), it is obvious that this could not balance a more than threefold increase in heat waves.
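Rahmstorf’s stationary-climate baseline is easy to check numerically: for independent draws from a fixed distribution, the expected number of record highs after n years is 1 + 1/2 + … + 1/n (about 5.2 for a century), and a warming trend pushes that count up. The sketch below is only an illustrative Monte Carlo, not the Benestad (2004) analysis he cites; the trend size of 0.02 standard deviations per year is an arbitrary assumption chosen for demonstration.

```python
import random

def count_records(series):
    """Count how many entries set a new running maximum (a 'record high')."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records, best = records + 1, x
    return records

random.seed(42)
n_years, n_trials = 100, 2000
trend_per_year = 0.02  # assumed warming trend, in units of the year-to-year std dev

stationary = trending = 0
for _ in range(n_trials):
    noise = [random.gauss(0, 1) for _ in range(n_years)]
    stationary += count_records(noise)
    # Same noise, plus a linear warming trend.
    trending += count_records([x + trend_per_year * i for i, x in enumerate(noise)])

# Theory for the stationary case: E[records] = 1 + 1/2 + ... + 1/100, roughly 5.2.
print(f"mean record highs, stationary climate: {stationary / n_trials:.2f}")
print(f"mean record highs, with warming trend: {trending / n_trials:.2f}")
```

Even this toy trend produces clearly more record highs than the stationary expectation (and, symmetrically, fewer record lows), which is the asymmetry the Perspective points to.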
Interestingly, Hoerling immediately raises the climate policy issue (stating that mitigation efforts would not prevent extremes) and even denounces our Perspective as “policy-direct”, even though we do not even mention policy. It is simply not the topic of our article; we exclusively discuss scientific questions, and we point out at the outset that societal impacts and possible policy strategies are discussed in the SREX.
We cite James Hansen’s 1988 statement on global warming at the end. Back then he got a lot of criticism for it, but in hindsight it turned out he was right. We hope that in hindsight we will find out that we were wrong, and global warming is not leading to more unprecedented extremes. But the evidence is pointing the other way, I’m afraid.
The pull of the “front-page thought” and the eagerness of climate campaigners to jog the public have sometimes created a tendency to tie mounting losses from weather-related disasters to human-driven global warming.
But finding a statistically robust link between such disasters and the building human climate influence remains a daunting task. A new analysis of nearly two dozen papers assessing trends in disaster losses in light of climate change finds no convincing link. The author concludes that, so far, the rise in disaster losses is mainly a function of more investments getting in harm’s way as communities in places vulnerable to natural hazards grow.
The paper — “Have disaster losses increased due to anthropogenic climate change?” — is in press in the Bulletin of the American Meteorological Society. It was written by Laurens M. Bouwer, a researcher at Vrije University in Amsterdam focused on climate and water resources (and a lead author of a chapter in the 2001 assessment from the Intergovernmental Panel on Climate Change). You can read more about the paper at the blog of Roger Pielke, Jr., which drew my attention to this work.
Here’s the summary and a link to the full paper:
The increasing impact of natural disasters over recent decades has been well documented, especially the direct economic losses and losses that were insured. Claims are made by some that climate change has caused more losses, but others assert that increasing exposure due to population and economic growth has been a much more important driver. Ambiguity exists today, as the causal link between climate change and disaster losses has not been addressed in a systematic manner by major scientific assessments. Here I present a review and analysis of recent quantitative studies on past increases in weather disaster losses and the role of anthropogenic climate change. Analyses show that although economic losses from weather related hazards have increased, anthropogenic climate change so far did not have a significant impact on losses from natural disasters. The observed loss increase is caused primarily by increasing exposure and value of capital at risk. This finding is of direct importance for studies on impacts from extreme weather and for disaster policy. (Read the rest.)
None of this negates the importance of moving to limit emissions of long-lived greenhouse gases; the analysis just reinforces the reality that while that effort proceeds, there’s plenty of other work to do, as well, if humanity desires a relatively smooth journey in this century (as was recently stressed by Robert Verchick here).
Through decades of work, James E. Hansen of NASA has earned his plaudits as a climate scientist. But his intensifying personal push for aggressive cuts in emissions of greenhouse gases has come with a framing of climate science that is being criticized by some respected researchers for stepping beyond what peer-reviewed studies have concluded.
“Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.”
He doesn’t define “several decades,” but a reasonable assumption is that he refers to a period from today through mid-century. I am unaware of any projection for “semi-permanent” drought in this time frame over the expansive region of the Central Great Plains. He implies the drought will be due to a lack of rain (except for the brief and ineffective downpours). I am unaware of indications, from model projections, for a material decline in mean rainfall. Indeed, that region has seen a general increase in rainfall over the long term during most seasons (certainly no material decline). Also, for the warm season when evaporative loss is especially effective, the climate of the central Great Plains has not become materially warmer (perhaps even cooled) since 1900. In other words, climate conditions in the growing season of the Central Great Plains are today not materially different from those existing 100 years ago. This observational fact belies the expectations from climate simulations and, in truth, our science lacks a good explanation for this discrepancy.
The Hansen piece is policy more than it is science, to be sure, and one can read it for the former. But facts should, and do, matter to some. The vision of a Midwest Dustbowl is a scary one, and the author appears intent to instill fear rather than reason.
The article makes these additional assertions:
“The global warming signal is now louder than the noise of random weather…”
This is patently false. Take temperature over the U.S. as an example. The variability of daily temperature over the U.S. is much larger than the anthropogenic warming signal at the time scales of local weather. Depending on season and location, the disparity is at least a factor of 5 to 10.
I think that a more scientifically justifiable statement, at least for the U.S. and extratropical land areas is that daily weather noise continues to drum out the siren call of climate change on local, weather scales.
Hansen goes on to assert that:
“Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.”
Published scientific studies on the Russian heat wave indicate this claim to be false. Our own study on the Texas heat wave and drought, submitted this week to the Journal of Climate, likewise shows that that event was not caused by human-induced climate change. These are not de novo events, but upon scientific scrutiny, one finds both the Russian and Texas extreme events to be part of the physics of what has driven variability in those regions over the past century. This is not to say that climate change didn’t contribute to those cases, but their intensity owes to natural, not human, causes.
The closing comment by Hansen is then all the more ironic, though not surprising knowing he often writes from passion and not reason:
“The science of the situation is clear — it’s time for the politics to follow. ”
“Those who continue to talk in certain terms of how local weather extremes are the result of human climate change are failing to heed all the available evidence.”
Kerry Emanuel:
I see overstatements on all sides. Extreme weather begets extreme views. On the Russian heat wave, Marty is citing a single paper that claims it had nothing to do with climate change, but there are other papers that purport to demonstrate that events of that magnitude are now three times more likely than before the industrial era.
This is a collision between the fledgling application of the science of extremes and the inexperience we all have in conveying what we do know about this to the public. A complicating factor is the human psychological need to ascribe every unusual event to a cause. Our Puritan forebears ascribed them to sin, while in the ’80s it was fashionable to blame unusual weather on El Niño. Global warming is the latest whipping boy. But even conveying our level of ignorance is hard: Marty’s quotation of Harold Brooks makes it sound as though he is saying that the recent uptick in severe weather had nothing to do with climate change. The truth is that we do not know whether it did or did not; absence of evidence is not evidence of absence.
Regular readers of my work will not be surprised that I align with Emanuel.
At roughly the same time, Hoerling sent an amplification on his arguments and Miller sent a critique of Hoerling’s initial post. You can read both below. Keep in mind that neither writer has seen the other’s piece. (I asked Hansen for his thoughts on the complaints of Hoerling and Kerry Emanuel, another climate scientist who weighed in on Dot Earth. His response is at the end of this post.)
Here’s Hoerling’s expanded critique of Hansen [if you’re having trouble reading it, click here for a downloadable version]:
I have several papers well along in the publication process that make clear your characterizations are far off the mark. The editors prefer, indeed are insistent, that I not discuss these in blogs. Some scientists may be able to spend their time blogging and e-mailing without a significant impact on their scientific productivity — I’m not one of them — but I do make an effort to make my papers understandable to a wide audience.
Between 1933 and 2010, total annual rainfall in the metropolitan region increased by 425 mm, according to data from USP
The land of drizzle has become the megalopolis of storms. In roughly 80 years, the amount of rain that falls annually on the São Paulo Metropolitan Region, where one in every 10 Brazilians lives in an area equivalent to nearly 1% of the national territory, increased by 425 millimeters (mm), half of what falls in much of Brazil’s semi-arid region. It jumped from an annual average of almost 1,200 mm in the 1930s to around 1,600 mm in the 2000s. As a linear sum, it is as if every year it had rained 5.5 mm more than in the previous 12 months. Rainfall has not only intensified but also changed its pattern of occurrence. It is not simply raining a little more each day, an effect that would be barely noticeable in practice and incapable of causing the region’s constant flooding. The number of days with heavy or moderate rain has grown, even producing storms in winter, normally a dry season. In contrast, the number of days with light rain, below 5 mm, has decreased.
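The “linear sum” framing above can be checked with simple arithmetic. This is a minimal sketch, assuming the 1933–2010 IAG record spans 77 years (the article rounds to “about 80”):

```python
# Back-of-envelope check of the figures quoted above.
increase_mm = 425           # reported rise in mean annual rainfall
years = 2010 - 1933         # length of the IAG record, ~77 years
per_year = increase_mm / years
print(round(per_year, 1))   # → 5.5 mm of extra rain per year, as stated
```

The 425 mm figure and the 5.5 mm/year rate are thus mutually consistent.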
A pendular regime of extremes has come to dominate the water cycle in the metropolitan region: when it rains, it generally rains a lot; but between very wet days there can be long dry spells. Greater São Paulo seems headed for the worst of both worlds, alternating intense periods of too much and too little rain over the course of the year. “Urbanization and the so-called heat island effect, along with air pollution, seem to play an important role in altering São Paulo’s rainfall pattern, especially in the seasons that are already normally wetter, such as spring and summer,” says Maria Assunção da Silva Dias, of the Instituto de Astronomia, Geofísica e Ciências Atmosféricas of the Universidade de São Paulo (IAG-USP), author of an as-yet-unpublished study on the subject. “In the drier months, the influence of global climate change accounts for 85% of the dynamics behind the increase in extreme rainfall.” Although less clearly, the same upward trend in the number of days with intense rain was detected in the Rio de Janeiro Metropolitan Region.
The new rainfall pattern in São Paulo is not like a passing cold front. It is here to stay, according to modeling done by the Centro de Ciência do Sistema Terrestre of the Instituto Nacional de Pesquisas Espaciais (CCST-Inpe). The projections suggest that the current situation is a kind of prologue to the future plot. They indicate that by the end of this century there should be an increase in the number of days with rainfall above 10, 20, 30 and 50 mm — that is, across practically all significant rainfall ranges. There will only be a decrease in the number of days with very light rain, and possibly an increase in the number of dry days. “The seasonality of rainfall should also change,” says José Marengo, head of the CCST and coordinator of an as-yet-unpublished study on rainfall projections for the metropolitan region. “The number of storms outside the normally wetter season should grow, the kind of situation that catches the population by surprise.” The simulations consider only the possible effects on the metropolitan region’s rainfall regime caused by so-called global climate change, above all the increase in concentrations of greenhouse gases, which warm the air. The weight that urbanization and air pollution may have on Greater São Paulo’s rainfall is not factored into the projections.
Scarce green in a metropolis of concrete and asphalt: if 25% of Greater São Paulo’s territory were covered by trees, the average temperature would fall by up to 2.5°C
One of the great difficulties of conducting large studies capable of revealing past climate fluctuations and serving as a benchmark for future projections is the absence of long, reliable historical series with daily information on rainfall. Without them, it is impossible to perform a robust statistical analysis and get a clear picture of how much it rained and how rainfall was distributed across the years and the seasons (spring, summer, autumn and winter). Specialists are unanimous in pointing to this deficiency in Brazil. The highest-quality rainfall data series for any point in the national territory is the one provided by the IAG meteorological station in the Parque do Estado, in the Água Funda neighborhood in the southern zone of the city of São Paulo. Records began in 1933, when the facility opened, and continue to this day.
Another factor gives the data from the IAG meteorological station a unique character. The records were obtained inside a large green area of the city of São Paulo whose profile has not changed radically over almost eight decades – a rarity in a megalopolis that does not have many parks and gardens. In other words, although the city underwent intense urbanization and soil sealing in the last century, natural conditions around the Parque do Estado station did not change radically. It therefore makes sense to compare present data with past data, since the local environment is roughly the same. “In São Paulo’s northern zone, at Mirante de Santana, there is a meteorological station with measurements going back to the 1950s,” says Pedro Leite da Silva Dias, researcher at IAG-USP and director of the Laboratório Nacional de Computação Científica (LNCC) in Rio de Janeiro, also an author of the study on the evolution of rainfall in the metropolitan region. “But a few decades ago there was only forest there, and today there is a building right next to the station.”
Thanks to the wealth of data from the IAG station in the Parque do Estado, Assunção and her collaborators were able to discern finer details and trends in the rainfall regime over the last eight decades. Between 1935 and 1944, rainfall above 40 mm fell, on average, on about 30 days, heavily concentrated in the summer months and, to a lesser extent, in spring and autumn. During that period there were no records of rainfall of that intensity in the winter months. The situation began to change in the mid-1940s. Since then, every decade has seen, on average, at least one rainfall of that magnitude in winter. Between 2000 and 2009, the total number of days with storms above 40 mm was around 70. A similar trend appears when the occurrence of daily rainfall above 60 and 80 mm is analyzed decade by decade.
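The decade-by-decade counting of extreme-rain days described above is straightforward to reproduce from any daily series. A minimal, hypothetical sketch (the function name and the sample values are invented for illustration and are not from the IAG record):

```python
from collections import Counter
from datetime import date

def extreme_days_by_decade(records, threshold_mm=40.0):
    """Count days per decade whose rainfall exceeds threshold_mm.

    records: iterable of (date, rainfall_mm) pairs, one per day.
    """
    counts = Counter()
    for day, mm in records:
        if mm > threshold_mm:
            counts[(day.year // 10) * 10] += 1  # bucket by decade
    return dict(counts)

# Tiny illustrative series (invented values):
sample = [
    (date(1936, 1, 15), 52.0),   # summer storm, counts for the 1930s
    (date(1936, 7, 2), 3.0),     # light winter drizzle, below threshold
    (date(2004, 7, 9), 61.0),    # winter storm above 40 mm
    (date(2004, 12, 20), 45.5),
]
print(extreme_days_by_decade(sample))  # → {1930: 1, 2000: 2}
```

Run against the full 1933–present daily record, the same loop with thresholds of 40, 60 and 80 mm would yield the decadal counts the study reports.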
Broadly speaking, two main factors may be linked to the change in the metropolitan region’s rainfall regime: global climate change, a large-scale phenomenon, and the heat island effect, which is localized and typical of megacities. The two act together. Each amplifies the effects of the other, and it is generally difficult to draw a dividing line between them. According to Marengo, most climate models indicate that rainfall will increase from the Plata basin to southeastern Brazil in the coming decades. Within that broader frame comes the specific question of the climate of large cities, especially the heat island effect, which, by making heavily urbanized areas hotter, also acts as a rain magnet.
A wetter sea breeze
The surface temperature of the Atlantic Ocean off the São Paulo coast rose about one degree between 1950 and 2010, from 21.5°C to 22.5°C. That may sound small, but one consequence of this warming is a higher rate of evaporation from the ocean, the fuel that loads the sea breeze with even more moisture. The process has repercussions on the climate above the Serra do Mar, on the plateau where the metropolitan region sits.
Why does much of the rainfall in Greater São Paulo occur in the mid to late afternoon, after 3 or 4 p.m.? That is when the warm, humid sea breeze coming up from the Baixada Santista finishes climbing the mountains and reaches the megalopolis. “The southeastern zone is usually the first part of the capital to feel the breeze’s effects,” comments Maria Assunção. The internal structure of cities, with many tall buildings, alters wind direction and can even force the sea breeze to rise at certain points in the metropolitan region, locally favoring the formation of rain clouds. Urban pollution, especially aerosols, can either favor or inhibit storms over cities, depending on its quantity.
Studies conducted in the United States in the 1990s suggest that part of the increase in rainfall in some metropolitan regions, such as Saint Louis, is due to their growing urbanization. In that area of the state of Missouri, home to about 2.9 million people, rainfall has increased between 5% and 25% in recent decades. A study from last year, conducted in large cities in India, concluded that changes in those urban areas’ rainfall regimes owe more to natural climate fluctuations than to local phenomena.
Mitigation strategies
In the case of the São Paulo Metropolitan Region, the USP study found a strong correlation between its urbanization and the changes in the rainfall regime. Episodes of extreme rainfall, above 40 mm, intensify as the population of São Paulo and its neighboring cities grows and their territories merge into practically a single continuous urban sprawl, with little green, much asphalt, and full of sources of pollution and heat. From 1940 to 2010, the metropolitan region’s population increased tenfold, from 2 million to 20 million inhabitants. The urban footprint grew twelvefold between 1930 and 2002, from 200 to 2,400 square kilometers. São Paulo’s average annual temperature rose 3°C between 1933 and 2009, according to records from the IAG station in the Parque do Estado, and total rainfall increased by a third. “We used to study this process theoretically,” says Pedro Leite da Silva Dias. “Now we have more data, including from digital sources.”
Mitigating the heat island effect may be one way to reduce episodes of extreme rainfall in urban centers. Physicist Edmilson Dias de Freitas, of IAG-USP, has been testing some measures in computer simulations to gauge their impact on the climate of the São Paulo Metropolitan Region. Painting the surfaces of houses and buildings white would not be effective. “Pollution and the weather darken the white quickly in São Paulo,” says Freitas. “There is no way to maintain it.” The most effective measure would be to increase the city’s vegetation cover. According to the simulations, if 25% of the metropolitan region’s area were covered by trees, the average temperature could drop by 1.5°C to 2.5°C. A milder climate would reduce the heat island effect and perhaps not attract as much rain to the region. Today green areas make up less than 10% of Greater São Paulo.
As a side benefit, if the largest Brazilian metropolis had more parks and fewer sealed surfaces, the most perverse effect of storms would also be minimized: intense rain would produce fewer floods. Exposed soil absorbs more of the water that falls on it. “São Paulo violates a basic principle of drainage: rainwater must infiltrate the soil where it falls,” says civil engineer Denise Duarte, a professor at USP’s Faculdade de Arquitetura e Urbanismo who collaborates with colleagues at the IAG. “Here, with much of the city sealed over, the water simply runs off.” One place’s rain is transferred to another, usually the low-lying points of the urban sprawl.
In the wettest zones, generally marked by mountain ranges, annual rainfall can reach 2,400 mm, comparable to that of the Amazon forest. That is the case for the portion of Greater São Paulo crossed by the Serra do Mar, which takes in the southern stretch of the state capital and parts of cities such as São Bernardo do Campo and Rio Grande da Serra, as well as stretches of Santana do Parnaíba and Cajamar in the west of the metropolitan region. In the least humid areas, such as much of Mogi das Cruzes, rainfall can hover around 1,300 mm per year. Between these two extremes there are several intermediate levels. The current figure of roughly 1,600 mm of annual rainfall recorded at the IAG station serves as a generic reference for the metropolitan region’s rainfall regime. Across an area that today spans 8,000 square kilometers and encompasses 39 municipalities, the rainfall actually measured each year at each weather station can vary considerably. A CCST study maps the geographic distribution of rainfall in Greater São Paulo from historical series, with daily rainfall totals, provided by 94 weather stations of the São Paulo State Departamento de Águas e Energia (DAEE) and the Agência Nacional de Águas (ANA). Data from a 25-year period, 1973 to 1997, were used in the study.
“This difference in rainfall levels holds throughout the year and in every season,” says Guillermo Obregón, of the CCST, lead author of the study on the geographic distribution of rainfall in the metropolitan region. “In the wettest places, orographic or relief rainfall predominates.” This mechanism makes masses of warm, humid air rise when they collide with topographic elevations, condense, and generate frequent precipitation. Whether because of its buildings and asphalt or because of its mountainous areas, Greater São Paulo seems to stand in the path of the rain.
Scientific articles
1. SILVA DIAS, M.A.F. et al. Changes in extreme daily rainfall for São Paulo, Brazil. Climatic Change. In press, 2012.
2. MARENGO, J. A. et al. The climate in future: projections of changes in rainfall extremes for the Metropolitan Area of São Paulo (Masp). Climate Research. In press, 2012.
Rainfall in the Rio de Janeiro Metropolitan Region, the country’s second largest with 12.5 million inhabitants, appears to show trends similar to São Paulo’s. Although Rio does not have a rainfall record as long and reliable as IAG-USP’s, two stations of the Instituto Nacional de Meteorologia (Inmet) installed in Rio de Janeiro provide data of reasonable quality covering at least four decades of rainfall.
According to records obtained between 1967 and 2007 by the station maintained in Alto da Boa Vista, the amount of water dumped on that neighborhood in the city’s northern zone on days of heavy storms rose by an average of 11.7 mm per year. The station lies in the Parque Nacional da Tijuca, one of the largest urban forests on the planet. “There was a trend of increasing total rainfall in the metropolitan region, and forest areas such as Alto da Boa Vista became wetter,” says meteorologist Claudine Dereczynski, of the Universidade Federal do Rio de Janeiro (UFRJ), lead author of the still-unpublished study.
The other Inmet station is located in Santa Cruz, a neighborhood with fewer green areas in the western zone. There, signs of intensifying rainfall were modest, according to data collected between 1964 and 2009, and were not considered statistically significant. “In Rio, the climate data of recent decades point more clearly to an increase in local temperature and more weakly to an increase in rainfall,” says Claudine. Simulations by Inpe and UFRJ researchers project, for the coming decades, an increase in the intensity and frequency of both intense-rain days and dry days. Rainfall tends to become more unevenly distributed over the year and to concentrate heavily on a few days.
The projects
1. Narrowing the Uncertainties on Aerosol and Climate Changes in São Paulo State – Nuance-SPS – no. 08/58104-8
2. Assessment of impacts and vulnerability to climate change in Brazil and strategies for adaptation option – no. 08/58161-1
Funding mechanism
1 and 2: FAPESP Research Program on Global Climate Change – Thematic Project
Coordinators
1. Maria de Fátima Andrade – IAG-USP
2. José Marengo – Inpe
GLOBAL warming isn’t a prediction. It is happening. That is why I was so troubled to read a recent interview with President Obama in Rolling Stone in which he said that Canada would exploit the oil in its vast tar sands reserves “regardless of what we do.”
If Canada proceeds, and we do nothing, it will be game over for the climate.
Canada’s tar sands, deposits of sand saturated with bitumen, contain twice the amount of carbon dioxide emitted by global oil use in our entire history. If we were to fully exploit this new oil source, and continue to burn our conventional oil, gas and coal supplies, concentrations of carbon dioxide in the atmosphere eventually would reach levels higher than in the Pliocene era, more than 2.5 million years ago, when sea level was at least 50 feet higher than it is now. That level of heat-trapping gases would assure that the disintegration of the ice sheets would accelerate out of control. Sea levels would rise and destroy coastal cities. Global temperatures would become intolerable. Twenty to 50 percent of the planet’s species would be driven to extinction. Civilization would be at risk.
That is the long-term outlook. But near-term, things will be bad enough. Over the next several decades, the Western United States and the semi-arid region from North Dakota to Texas will develop semi-permanent drought, with rain, when it does come, occurring in extreme events with heavy flooding. Economic losses would be incalculable. More and more of the Midwest would be a dust bowl. California’s Central Valley could no longer be irrigated. Food prices would rise to unprecedented levels.
If this sounds apocalyptic, it is. This is why we need to reduce emissions dramatically. President Obama has the power not only to deny tar sands oil additional access to Gulf Coast refining, which Canada desires in part for export markets, but also to encourage economic incentives to leave tar sands and other dirty fuels in the ground.
The global warming signal is now louder than the noise of random weather, as I predicted would happen by now in the journal Science in 1981. Extremely hot summers have increased noticeably. We can say with high confidence that the recent heat waves in Texas and Russia, and the one in Europe in 2003, which killed tens of thousands, were not natural events — they were caused by human-induced climate change.
We have known since the 1800s that carbon dioxide traps heat in the atmosphere. The right amount keeps the climate conducive to human life. But add too much, as we are doing now, and temperatures will inevitably rise too high. This is not the result of natural variability, as some argue. The earth is currently in the part of its long-term orbit cycle where temperatures would normally be cooling. But they are rising — and it’s because we are forcing them higher with fossil fuel emissions.
The concentration of carbon dioxide in the atmosphere has risen from 280 parts per million to 393 p.p.m. over the last 150 years. The tar sands contain enough carbon — 240 gigatons — to add 120 p.p.m. Tar shale, a close cousin of tar sands found mainly in the United States, contains at least an additional 300 gigatons of carbon. If we turn to these dirtiest of fuels, instead of finding ways to phase out our addiction to fossil fuels, there is no hope of keeping carbon concentrations below 500 p.p.m. — a level that would, as earth’s history shows, leave our children a climate system that is out of their control.
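The carbon accounting above can be roughly checked. A hedged back-of-envelope sketch, assuming the commonly cited conversion of about 2.13 gigatons of carbon (GtC) per ppm of atmospheric CO2 (the op-ed’s “120 p.p.m.” figure implies a rounder factor of roughly 2 GtC/ppm, and treats all the carbon as remaining airborne):

```python
GTC_PER_PPM = 2.13        # approximate GtC equivalent of 1 ppm atmospheric CO2
tar_sands_gtc = 240       # carbon in the tar sands, per the op-ed
ppm_if_airborne = tar_sands_gtc / GTC_PER_PPM
print(round(ppm_if_airborne))  # → 113 ppm if every ton stayed in the atmosphere
```

In practice only a fraction of emitted carbon remains airborne, so this is an upper bound; the point stands that fully burning the tar sands would add on the order of 100 ppm to the 393 ppm already in the atmosphere.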
We need to start reducing emissions significantly, not create new ways to increase them. We should impose a gradually rising carbon fee, collected from fossil fuel companies, then distribute 100 percent of the collections to all Americans on a per-capita basis every month. The government would not get a penny. This market-based approach would stimulate innovation, jobs and economic growth, avoid enlarging government or having it pick winners or losers. Most Americans, except the heaviest energy users, would get more back than they paid in increased prices. Not only that, the reduction in oil use resulting from the carbon price would be nearly six times as great as the oil supply from the proposed pipeline from Canada, rendering the pipeline superfluous, according to economic models driven by a slowly rising carbon price.
But instead of placing a rising fee on carbon emissions to make fossil fuels pay their true costs, leveling the energy playing field, the world’s governments are forcing the public to subsidize fossil fuels with hundreds of billions of dollars per year. This encourages a frantic stampede to extract every fossil fuel through mountaintop removal, longwall mining, hydraulic fracturing, tar sands and tar shale extraction, and deep ocean and Arctic drilling.
President Obama speaks of a “planet in peril,” but he does not provide the leadership needed to change the world’s course. Our leaders must speak candidly to the public — which yearns for open, honest discussion — explaining that our continued technological leadership and economic well-being demand a reasoned change of our energy course. History has shown that the American public can rise to the challenge, but leadership is essential.
The science of the situation is clear — it’s time for the politics to follow. This is a plan that can unify conservatives and liberals, environmentalists and business. Every major national science academy in the world has reported that global warming is real, caused mostly by humans, and requires urgent action. The cost of acting goes far higher the longer we wait — we can’t wait any longer to avoid the worst and be judged immoral by coming generations.
James Hansen directs the NASA Goddard Institute for Space Studies and is the author of “Storms of My Grandchildren.”
ScienceDaily (Feb. 22, 2011) — A new paper by George Mason University researchers shows that ‘Climategate’ — the unauthorized release in late 2009 of stolen e-mails between climate scientists in the U.S. and United Kingdom — undermined belief in global warming and possibly also trust in climate scientists among TV meteorologists in the United States, at least temporarily.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication and Center for Social Science Research asked these meteorologists early in 2010, when news stories about the climate e-mails were breaking, several questions about their awareness of the issue, attention to the story and impact of the story on their beliefs about climate change. A large majority (82 percent) of the respondents indicated they had heard of Climategate, and nearly all followed the story at least “a little.”
Among the respondents who indicated that they had followed the story, 42 percent said it made them somewhat or much more skeptical that global warming is occurring. These results stand in stark contrast to the findings of several independent investigations of the e-mails, conducted later, which concluded that no scientific misconduct had occurred and that nothing in the e-mails casts doubt on the fact that global warming is occurring.
The results, which were published in the Bulletin of the American Meteorological Society, also showed that the doubts were most pronounced among politically conservative weathercasters and those who either do not believe in global warming or are not yet sure. Neither age nor professional credentials was a factor, but men — independent of political ideology and belief in global warming — were more likely than their female counterparts to say that Climategate made them doubt that global warming was happening.
“Our study shows that TV weathercasters — like most people — are motivated consumers of information in that their beliefs influence what information they choose to see, how they evaluate information, and the conclusions they draw from it,” says Ed Maibach, one of the researchers. “Although subsequent investigations showed that the climate scientists had done nothing wrong, the allegation of wrongdoing undermined many weathercasters’ confidence in the conclusions of climate science, at least temporarily.”
The poll of weathercasters was conducted as part of a larger study funded by the National Science Foundation on American television meteorologists. Maibach and others are now working with a team of TV meteorologists to test what audience members learn when weathercasters make efforts to educate their viewers about the relationship between the changing global climate and local weather conditions.
Ultimately, the team hopes to answer key research questions about how to help television meteorologists nationwide become an effective source of informal science education about climate change.
“Most members of the public consider television weather reporters to be a trusted source of information about global warming — only scientists are viewed as more trustworthy,” says Maibach. “Our research here is based on the premise that weathercasters, if given the opportunity and resources, can become an important source of climate change education for a broad cross section of Americans.”
ScienceDaily (Mar. 29, 2010) — In a time when only a handful of TV news stations employ a dedicated science reporter, TV weathercasters may seem like the logical people to fill that role, and in many cases they do.
In the largest and most representative survey of television weathercasters to date, George Mason University’s Center for Climate Change Communication shows that two-thirds of weathercasters are interested in reporting on climate change, and many say they are already filling a role as an informal science educator.
“Our surveys of the public have shown that many Americans are looking to their local TV weathercaster for information about global warming,” says Edward Maibach, director of the Center for Climate Change Communication. “The findings of this latest survey show that TV weathercasters play — or can play — an important role as informal climate change educators.”
According to the survey, climate change is already one of the most common science topics TV weathercasters discuss — most commonly at speaking events, but also at the beginning or end of their on-air segments, on blogs and web sites, on the radio and in newspaper columns.
Weathercasters also indicated that they are interested in personalizing the story for their local viewers — reporting on local stories such as potential flooding/drought, extreme heat events, air quality and crops. About one-quarter of respondents said they have already seen evidence of climate change in their local weather patterns.
“Only about 10 percent of TV stations have a dedicated specialist to cover these topics,” says University of Texas journalism professor Kristopher Wilson, a collaborator on the survey. “By default, and in many cases by choice, science stories become the domain of the only scientifically trained person in the newsroom — weathercasters.”
Many of the weathercasters said that having access to resources such as climate scientists to interview and high-quality graphics and animations to use on-air would increase their ability to educate the public about climate change.
However, despite their interest in reporting more on this issue, the majority of weathercasters (61 percent) feel there is a lot of disagreement among scientists about the issue of global warming. Though 54 percent indicated that global warming is happening, 25 percent indicated it isn't, and 21 percent said they don't yet know.
“A recent survey showed that more than 96 percent of leading climate scientists are convinced that global warming is real and that human activity is a significant cause of the warming,” says Maibach. “Climate scientists may need to make their case directly to America’s weathercasters, because these two groups appear to have a very different understanding about the scientific consensus on climate change.”
This survey is one part of a National Science Foundation-funded research project on meteorologists. Using this data, Maibach and his research team will next conduct a field test of 30-second, broadcast-quality educational segments that TV weathercasters can use in their daily broadcasts to educate viewers about the link between predicted (or current) extreme weather events in that media market and the changing global climate.
Ultimately, the team hopes to answer key research questions supporting efforts to activate TV meteorologists nationwide as an important source of informal science education about climate change.
Commentary by Alexandre A. Costa, one of Brazil's most respected meteorologists, on the interview:
Climate Change Denial and the Organized Right (May 10, 2012 – posted on Facebook)
You have probably watched, or heard about, the interview recently aired on Jô Soares's talk show with Mr. Ricardo Felício who, despite being a professor of Geography at USP, attacked the community of climate scientists, sketched out a series of conspiracy theories, and committed absurdities that make no scientific sense whatsoever — such as the claims that "there is no sea level rise," "the greenhouse effect does not exist," "the ozone layer does not exist," and "the Amazon rainforest would regenerate within 20 years of being cleared" — reaching his peak with a senseless explanation for the high temperature of Venus, based on a completely absurd reading of the gas laws.
So what would lead someone who is, in principle, connected to the academic community to take such an absurd stance? At first I took it for mere media climbing. Since the man's CV shows no even minimally relevant scientific output, I assumed that bashing the "mainstream" was simply a way to draw attention, attract publicity, gain fame, and so on. How naive of me.
Interviewer: "Do you know of any institution that supports your thinking? How does it work? And what does it do?" Ricardo Felício: "I recommend that you look up, here in Brazil, the MSIa – the Ibero-American Solidarity Movement."
But who is this MSIa? A far-right group specializing in conspiracy theories and in attacks on Greenpeace ("a political instrument of the international oligarchies"), on the Landless Workers' Movement – MST ("an instrument of war against the Brazilian State"), on the São Paulo Forum ("it gathers revolutionary groups seeking to destabilize the Armed Forces"), on the Pastoral Land Commission, and so on. I visited the organization's website myself, and their latest effort is a campaign against the Truth Commission, in favor of the military ("Who benefits from a military crisis"). Anyone who wants to know where these people stand need only check http://www.msia.org.br/
A little more searching, and I found Ricardo Felício being quoted ("The UN has found a way to implement its global government, and the world will be run by pseudoscientific panels") — where? At http://www.midiasemmascara.org/, the website of the ultra-right-winger Olavo de Carvalho…
It seems symptomatic that, on the eve of the deadline for vetoing the ruralist Forest Code, someone with this kind of affiliation (the MSIa is associated with the UDR) should come out saying that the Amazon can be cleared because it will regenerate in twenty years… It is telling that the accusations of an "environmentalist," "communist," or "international governance" agenda — or whatever delusion the climate change deniers raise when they try to politicize and ideologize the question — only show where that politicization and ideologization actually comes from, and of what stripe.
As I like to say, CO2 molecules have no ideology: they absorb infrared radiation regardless not only of political positions, but even of the humans who hold them. Rising CO2 concentrations in the Earth's atmosphere could have no effect other than warming the global climate system. Denying an obvious scientific truth therefore makes sense only for those whose interests are at stake. And it is clear: this gentleman, academically a fraud, is in fact a right-wing militant. To paraphrase those who so admire him, he needs to appear in the media without the mask of "USP professor" or "climatologist," but with his true face.
Alexandre A. Costa, Ph.D.
Full Professor
Master's Program in Applied Physical Sciences
Universidade Estadual do Ceará
Climate Change Denial and the Organized Right – Part II: Further Revelations (May 13, 2012 – posted on Facebook)
It is not hard to keep connecting the dots after Mr. Ricardo Felício's appearance on Jô Soares's show. Why would anyone be willing to expose himself to ridicule in that way? How could someone in the position of a doctor of Geography, a professor at USP, and a "climatologist" murder not only recent scientific knowledge, but basic laws of Physics and fundamental notions of Chemistry, Ecology, and so on? What would lead someone to insult so crudely the Brazilian and international academic community — above all us, the climate scientists?
What I intend to show is that reaching that point requires motivations. And these, my friends, are not mere vanity or a craving for stardom. It is an agenda.
For those who want to keep tracing with me the motivation behind that interview, I ask that you visit — even if it gives them some audience — the video repository of this homegrown pop star of climate change denial at http://www.youtube.com/user/TvFakeClimate. There, the links point to the familiar site http://www.msia.org.br/ of the "Ibero-American Solidarity Movement," whose pompous name conceals LaRouchist neo-fascism, specializing in conspiracy theories and manipulation and a visceral enemy, as its site shows, of the MST, the feminist movement, the human rights movement, the Truth Commission, etc.; to the no less right-wing http://www.midiaamais.com.br/, whose articles I could not manage to read to the end, but which consist of right-wing attacks on Obama, ridicule of the Pinheirinho residents' movement in São José dos Campos, opposition to the Supreme Court's ruling that affirmative-action quotas are constitutional and, of course, climate change denial and attacks on the IPCC; to an anti-environmentalist site called http://ecotretas.blogspot.com/, which in turn links to neo-fascist sites such as "vermelhos não" (http://vermelhosnao.blogspot.com.br/search/label/verdismo), which incidentally is running the "Não Veta, Dilma" campaign, to conspiracy-theory specialists such as http://paraummundolivre.blogspot.com.br/, and even to exotic right-wingers advocating the restoration of the monarchy in Portugal (http://quartarepublica.wordpress.com/) or neo-Salazarists (http://nacionalismo-de-futuro.blogspot.com.br/).
As I have said on several occasions, political-ideological allegiance is not what makes someone right or wrong on the climate question. I have colleagues in my research community who sympathize with the most varied political-ideological shades (which in itself would make it rather hard for us to band together in a "conspiracy"… what was it again… ah yes, to "achieve UN world governance via climate panels," the kind of hysteria typical of the most unhinged American right). The climate question is objective. The mechanisms that control the climate are known, including the role of greenhouse gases. The measurements, the model results (attacked dishonestly by the interviewee), and the paleoclimate records all converge. And among all the possible hypotheses for the warming of the climate system, the anthropogenic contribution via greenhouse gas emissions was the only one left standing after every test. Recognizing this does not depend on ideology; you need only open your eyes. The kind of public policy to be applied in dealing with the impacts, adapting to the changes, and mitigating them — that, yes, is terrain where political choices gain degrees of freedom.
The problem is that for one particular political-ideological fringe, namely the far right, there really is an incompatibility with any environmental agenda that might mean public control over private capital. There is also a need to win support by flattering the public's hidden wishes (such as the wish that nothing need be done about climate change) and by appealing to nationalism (typical of the Mussolinis, Hitlers, Francos, Salazars, and of so many right-wing dictatorships in Latin America) — even if that occasionally means adopting a falsely anti-imperialist discourse. With such "higher" goals, which include sabotaging the campaign for a presidential veto of the monstrosity that is the Forest Code approved by the deputies, why bother with a commitment to scientific truth? Why bother with ethics and respectful treatment of one's colleagues in the academic world?
It is striking how those who accuse us of "fraud," "conspiracy," and the like are precisely the ones who practice them. As I have argued in other texts on the subject, the pseudo-arguments presented by the deniers need to be scientifically debunked (and I have done so elsewhere), but as my colleague Michael Mann rightly notes, they are like the hydra. They always have more lies up their sleeve to throw around, and no concern whatsoever with presenting a coherent whole in opposition to the views of the scientific community. What interests them is sowing confusion, gaining political ground, delaying actions to protect climate stability, and buying time for those who fund them (even if some deniers are not directly tied to the oil and allied industries, that industry's link to the coordinated worldwide campaign against climate science is by now evident). Pseudo-science and intellectual imposture are the hydra's heads. The monster's heart is the political-ideological agenda. But the sword of truth is long enough to strike it dead!
Alexandre A. Costa, Ph.D.
Full Professor
Master's Program in Applied Physical Sciences
Universidade Estadual do Ceará
In Defense of Climate Science (May 10, 2012 – posted on Facebook)
I have been deeply worried by the recent attacks on climate science, among other reasons because they form a strange amalgam uniting the Tea Party, the petrochemical industry, and people who seem to believe in a grand imperialist conspiracy to keep the periphery of capitalism from "developing" by preventing it from burning its fossil fuel reserves — which, if you will pardon the word, is in itself an utterly narrow-minded view of "development."
But this is not an ideological question — if it were, I would stand far from Al Gore. It is a scientific question, because CO2 molecules have no ideology. What they do have, like certain other molecules (CH4 and water vapor itself), is a property the majority gases in our atmosphere lack: a mode of oscillation whose frequency coincides with a region of the electromagnetic spectrum known as the infrared. Heat retention is a consequence of the presence of these gases (minor constituents though they are) in the Earth's atmosphere. Without them, Earth's mean temperature would be −18 degrees instead of a moderate 15, to say nothing of their role in keeping it within mild limits. Earth is not Mercury which, having no atmosphere, freely radiates back the solar energy absorbed on its day side, producing temperature contrasts of 430 degrees by day and −160 degrees at night. Fortunately, neither is it Venus, whose cloud cover lets less solar energy reach its surface than reaches Earth's, but whose greenhouse effect, driven by an atmosphere composed almost exclusively of CO2, raises its temperature to a nearly constant 480 degrees.
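The −18 degree figure in the paragraph above is the standard "no-greenhouse" effective temperature, which follows directly from the Stefan-Boltzmann law. A minimal sketch, using standard textbook values for the solar constant and planetary albedo (these numbers are my assumptions, not from the original text):

```python
# Effective (no-greenhouse) temperature of Earth via the Stefan-Boltzmann law:
# absorbed solar flux = emitted thermal flux  =>  S(1-a)/4 = sigma * T^4
S = 1367.0       # solar constant, W/m^2 (textbook value, assumed)
albedo = 0.3     # planetary albedo (assumed)
sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

T_eff = (S * (1 - albedo) / (4 * sigma)) ** 0.25
print(round(T_eff, 1), "K")            # ~254.9 K
print(round(T_eff - 273.15), "degC")   # ~ -18 degC, vs. the observed ~15 degC
```

The roughly 33-degree gap between this radiative equilibrium value and the observed mean surface temperature is what the text attributes to the greenhouse gases.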
To ignore these simple scientific ideas — that CO2 is a greenhouse gas (known and measured by Tyndall, Arrhenius, and others since the 19th century), with a mechanism well explained by the physics of its molecular structure; to ignore the well-known global effect CO2 has on a neighboring planet, well established in astronomy since the late Sagan — makes no sense, especially within academia, where some of the most vocal deniers are found. I would remind them of something basic about the scientific method. On the one hand, science has no dogma and no definitive truths; its truths are always, by construction, partial and provisional (fortunately so, or it would become as dull and tedious as, say, a religion). On the other hand, scientific knowledge is cumulative, and in that sense one cannot walk backwards! Only when a theory fails is a new one justified, and the new one cannot be merely the negation of the old: it must reproduce all of its predecessor's successes (as with Classical Mechanics and Relativity, which reduces to the former at low speeds).
This is not a matter of belief. "Monotony" aside, it is well-established, well-known science — as much so as Universal Gravitation (which is also "only" a theory) or the Evolution of Species.
INJUSTICE, DISRESPECT AND UNDERESTIMATION
Climate scientists have come under attack on the basis of factoids that bear no resemblance to the reality of our field. No science today is as public and open. Anyone who wishes can easily obtain, in most cases directly over the internet, observed climate data that clearly demonstrate global warming (www.cru.uea.ac.uk/cru/data/, among others), the modeling data now being generated that will certainly underpin the 5th IPCC report (http://cmip-pcmdi.llnl.gov/cmip5/data_portal.html), or paleoclimate proxy data used to analyze past climate (www.ncdc.noaa.gov). Anyone can obtain the IPCC reports at www.ipcc.ch and follow the references, the overwhelming majority of them peer-reviewed and published — above all in the case of the Working Group dealing with the Physical Science Basis of the climate system — in high-impact journals, both general (Science, Nature) and field-specific. I doubt that in our universities, full of laboratories with private partnerships, whether in materials engineering or biochemistry, there is any segment so open, so willing to sit down at the table, share data, survey the state of the art of its science, and collectively draft a synthesis report. I doubt it! I challenge anyone to show otherwise!
We scientists who take part in these panels are not "government representatives." Nothing is created or invented in these panels beyond a synthesis of the science that is produced independently and published in the peer-reviewed literature. Those in the academic community could, moreover, easily inform themselves about how these panels work by talking to colleagues in the Brazilian scientific community who have taken part in the IPCC and PBMC initiatives, before voicing an opinion — so that they do not end up, in practice, defaming what they do not know. Some people, without the slightest critical attitude toward the IPCC's detractors, simply repeat their verbiage, when they could instead be skeptical of the "skeptics."
But they are not. At no point do they question the real motivations of the two or three (fortunately they are that rare) who adopt the lamentable stance of anti-science denial, whether because they are openly corrupt servants of the petrochemical industry or simply because their vanity cannot fit the secondary role they would play if, like us, they were expending enormous energy, usually almost anonymously, laying the building of climate science brick by brick. One must learn to distinguish honest, genuine skepticism — which is healthy in science, consonant with sincere doubt and a critical attitude — from religious denial, based on faith and the blind need to defend a given viewpoint whether or not it has any real basis, and above all from plain and simple villainy, which is what some of the deniers promote. The possible "success" of these ideas with the public is, for me, territory for social psychology, but the best analogy I have is the popularity of religious ideas: comforting lies, generally preferred over unpleasant truths.
True skepticism is what carried the Berkeley physicists as far as they went (http://www.berkeleyearth.org/index.php). Initially questioning the results obtained by our community, they armed themselves with an enormous worldwide temperature database, broader than those held by the British Hadley Centre and NASA. They tested other methodologies and even excluded the weather stations used by our research centers. Richard Muller, the initiative's founder, was initially so doubtful of our results that he managed to secure funding from the notorious Koch Foundation, openly hostile to climate science. But what did Muller and his partners find? The same result we already knew. The Earth is warming, and this warming accelerated sharply in the last decades of the 20th century. It approaches one degree and thus lies far above all the natural fluctuations seen since instrumental records began. In fact, they also confirmed something else we knew: that the University of East Anglia data (the same data behind the farce mounted under the high-sounding name "Climategate," whose scientists were persecuted and whose reputations were ignominiously attacked, with repercussions for their careers and personal lives) contain an error… on the low side! The warming suggested by the CRU/UEA data is a tenth of a degree smaller than that of the other data sources — and, of course, none of us accuses them of dishonesty for it.
Another imposture — and unfortunately, harsh as the term is, I think this is a case where it applies — is the underestimation of our community's intelligence, combined with ignorance of the materials it produces. The 4th IPCC report already contains a chapter devoted exclusively to paleoclimatology, that is, to the climate of the past. I personally have devoted great effort to analyzing past-climate proxies and to modeling past climate conditions. Since the very first IPCC report there has been a permanent concern with discerning the natural signal and separating the anthropogenic signal from it. To that end, the role of variations in solar activity, volcanic emissions, and so on is assessed. We have already evaluated the possible natural influences and ruled them out as the cause of the observed warming.
In this respect there is no room for sophistry and evasion. Regarding the paleoclimate records, which can retrace the history of temperature and greenhouse gas concentrations from 800,000 years ago to the present, we all know that in the past a small warming of the planet preceded the rise in greenhouse gas concentrations. This happened before the end of every glacial era. But it is obtuse reasoning to deduce from this that CO2 plays no role or, in the deniers' words, "is consequence, not cause." There are many feedback processes in the climate system, and this is one of the best examples. The subtle variations in insolation, and in its distribution over the Earth's surface, associated with the orbital cycles are — as everyone knows — far too small to explain the large temperature differences between the glacial periods ("ice ages") and the interglacials (the briefer warm periods between them). But a subtle warming, after a few centuries, proved sufficient to raise natural emissions of CO2 and methane, which produce a greenhouse effect and amplify the process. In conditions free of human action, this feedback was reined in only when orbital conditions changed again, inducing a subtle cooling, which drove CO2 uptake into the Earth system, which in turn amplified the cooling, and so on.
But just because people die of cancer and heart attacks does not mean we cannot hold a murderer responsible! Because people die naturally of strokes, would anyone claim that "a gunshot cannot possibly kill someone"? Or that no one should ever again be tried for murder? In the past, a small warming was needed to trigger natural emissions and rising CO2 concentrations, after which the warming accelerated. Today there is an independent source of CO2, foreign to the natural cycles: the burning of fossil fuels! I should stress, moreover, that even the isotopic analysis (the composition differs between fossil fuels and other sources) is clear: the excess CO2 in the Earth's atmosphere does indeed come mostly from oil, coal, and natural gas! A minimum of genuine scientific depth makes it clear that today's rise in atmospheric CO2 concentrations is overwhelmingly anthropogenic, and that this is what is driving the observed climate changes. One can no longer hide the sun — or rather, hide the greenhouse gases — with a sieve! The paleoclimate records show that the current warming is unprecedented in the last 2,500 years. They show that the current CO2 concentration is 110 ppm above the pre-industrial level and nearly 100 ppm above anything seen in the last 800,000 years. They show that this figure exceeds the difference in CO2 concentration between the interglacials and the "ice ages" — and that it does indeed make a great difference to the climate.
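The glacial-cycle amplification described above — a small orbital warming releases CO2, which warms further, which releases more CO2 — is mathematically a convergent feedback loop. A toy sketch with purely illustrative numbers (the gain and the initial warming are my assumptions, not values from the text):

```python
# Toy positive-feedback loop: each increment of warming feeds back a damped
# further increment (via CO2 release), converging geometrically.
gain = 0.5    # fraction of each warming step fed back (illustrative, assumed)
dT0 = 0.2     # small initial orbital-forcing warming, degC (assumed)

dT, total = dT0, 0.0
for _ in range(50):   # iterate the loop; it converges because gain < 1
    total += dT
    dT *= gain        # each round of CO2 release adds a smaller increment

# Geometric-series limit: dT0 / (1 - gain)
print(round(total, 3))  # → 0.4
```

The point the text makes holds in the sketch: a modest trigger is amplified well beyond its direct effect, yet the loop stays bounded as long as the feedback gain is below one.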
WHAT THE REAL ERRORS ARE
Some people call themselves skeptical, critical, and distrustful of the majority of our community of climate scientists, yet fail to notice the fundamental error they commit: an absolute lack of skepticism, criticism, and distrust toward those who attack us. The stance of those who fight climate science with petrochemical-industry funding, or in association with the most reactionary partisan and media sectors, is self-explanatory: covering up reality is what serves them. But that is not all. They range from people who receive oil-industry money directly to blowhards who have long since done no real scientific work in the field and who — unable to stay in the spotlight by working seriously to advance our science, grappling with the genuine uncertainties, helping collect data, improving methods and models — attack the rest of the community merely to keep the spotlight on themselves. Strange and showy, like peacock feathers; prosaic, like the evolutionary mechanisms that produced those feathers. Hence the need also to combat the viewpoint of those who give this attack a false "leftist" veneer, resorting to conspiracy theories, a pathological distortion of critical reasoning. Fighting the wrong target with the wrong weapon is worse than laying down one's arms.
Is the IPCC perfect? Of course not. It has made errors. But do you want to know what they actually are? One thing must be clear to everyone: the IPCC's assessments tend to be conservative. Its temperature projections for the years after 2000 are essentially on target, but do you know what happened to its projections of sea level rise and Arctic melt? They were underestimates. That's right: the true picture is more serious than the 4th IPCC report indicates. And again, this is not a political matter, but a consequence of the limitations, at the time, of cryosphere models, which could not account for important processes driving the melt. Drawing on the papers published in the meantime, the 5th report will likely correct these limitations and show a picture closer to the real gravity of the problem when it is published in 2013-2014.
WHAT IS THE REAL IDEOLOGICAL QUESTION?
It makes no sense to "believe" or not in gravity, in evolution, or in the greenhouse effect. This is not an "ideological choice" (even though in the U.S. there is a strong correlation between ideology and views of science among the most reactionary Republican voters, who lend their ears to climate science's detractors and also want Darwin out of the schools).
The real ideological question is that climate change is a process of extreme inequality, from its roots to its impacts. Those who have benefited most from greenhouse gas emissions were, and remain, the dominant classes of the core capitalist countries. Together with the mega-conglomerates of finance capital, the petrochemical industry, the mining sector (including coal mining), the energy sector, and others have concentrated wealth while using the atmosphere as their great garbage can. Even more than the current carbon "footprint" (itself extremely unequal if we compare Americans, Europeans, and Australians on one side with Africans on the other), the "historical footprint" — that is, each country's accumulated past emissions — is more disparate still, making Europe, followed by the U.S., the great historical emitters.
Cruelly, in return, the impacts of climate change will fall on the poorest countries, on small nations, and above all on the poor of the poor countries, the most vulnerable. Loss of territory in island states; water and food security problems in semi-arid regions (so vast in the cradle of our species, the African continent); the effects of severe events (which, on very clear physical grounds, should become more frequent on a warmer planet); damage to coastal marine ecosystems and forests, hitting fishing and gathering activities; the collapse of traditional agriculture… where does all this land? On the floors below! The floors above talk about "adaptation" and hold far more instruments with which to adapt. For us, in this case, the interest lies in being conservative about the climate and halting this clumsy, disorderly "experiment" of altering the chemical composition of the Earth's atmosphere and the planetary energy balance! For most of the 7 billion inhabitants of this sphere, climate stability matters!
Some of the richest, in fact, see global warming as an "opportunity"… An "opportunity," of course, to expand agribusiness into the future arable lands of northern Canada and Siberia, and to drill for oil in the ocean that will open up as the Arctic ice retreats.
We must therefore recognize that a real imposture is roaming about, and that science needs defending. A rock is a rock; a tree is a tree; a CO2 molecule is a CO2 molecule, whatever one's ideology. But those below will only be able — we will only be able — to arm ourselves to transform society if we are well informed, and for that, the absurdities uttered by climate science's detractors must be fought.
Alexandre Costa holds a bachelor's and a master's degree in Physics from the Universidade Federal do Ceará and a Ph.D. in Atmospheric Science from Colorado State University, with postdoctoral work at Yale University and publications in several scientific journals, including Science, the Journal of the Atmospheric Sciences, and Atmospheric Research. He is a CNPq research productivity fellow and a member of the Brazilian Panel on Climate Change (Painel Brasileiro de Mudanças Climáticas).
ScienceDaily (Oct. 16, 2009) — Worried about climate change and want to learn more? You probably aren’t watching television then. A new study by George Mason University Communication Professor Xiaoquan Zhao suggests that watching television has no significant impact on viewers’ knowledge about the issue of climate change. Reading newspapers and using the web, however, seem to contribute to people’s knowledge about this issue.
The study, “Media Use and Global Warming Perceptions: A Snapshot of the Reinforcing Spirals,” looked at the relationship between media use and people’s perceptions of global warming. The study asked participants how often they watch TV, surf the Web, and read newspapers. They were also asked about their concern and knowledge of global warming and specifically its impact on the polar regions.
“Unlike many other social issues with which the public may have first-hand experience, global warming is an issue that many come to learn about through the media,” says Zhao. “The primary source of mediated information about global warming is the news.”
The results showed that people who read newspapers and use the Internet more often are more likely to be concerned about global warming and believe they are better educated about the subject. Watching more television, however, did not seem to help.
He also found that individuals concerned about global warming are more likely to seek out information on this issue from a variety of media and nonmedia sources. Other forms of media, such as the Oscar-winning documentary "An Inconvenient Truth" and the blockbuster thriller "The Day After Tomorrow," have played important roles in advancing the public's interest in this domain.
Politics also seemed to have an influence on people’s perceptions about the science of global warming. Republicans are more likely to believe that scientists are still debating the existence and human causes of global warming, whereas Democrats are more likely to believe that a scientific consensus has already been achieved on these matters.
“Some media forms have clear influence on people’s perceived knowledge of global warming, and most of it seems positive,” says Zhao. “Future research should focus on how to harness this powerful educational function.”
ScienceDaily (Nov. 21, 2011) — People who believe there is a lot of disagreement among scientists about global warming tend to be less certain that global warming is happening and less supportive of climate policy, researchers at George Mason, San Diego State, and Yale Universities report in a new study published in the journal Nature Climate Change.
A recent survey of climate scientists conducted by researchers at the University of Illinois found near unanimous agreement among climate scientists that human-caused global warming is happening.
This new George Mason University study, however, using results from a national survey of the American public, finds that many Americans believe that most climate scientists actually disagree about the subject.
In the national survey conducted in June 2010, two-thirds of respondents said either that there is a lot of disagreement among scientists about whether or not global warming is happening (45 percent), that most scientists think it is not happening (5 percent), or that they did not know enough to say (16 percent). These respondents were less likely to support climate change policies and viewed climate change as a lower priority.
By contrast, survey respondents who correctly understood that there is widespread agreement about global warming among scientists were themselves more certain that it is happening, and were more supportive of climate policies.
“Misunderstanding the extent of scientific agreement about climate change is important because it undermines people’s certainty that climate change is happening, which in turn reduces their conviction that America should find ways to deal with the problem,” says Edward Maibach, director of the Center for Climate Change Communication at George Mason University.
Maibach argues that a campaign should be mounted to correct this misperception. “It is no accident that so many Americans misunderstand the widespread scientific agreement about human-caused climate change. A well-financed disinformation campaign deliberately created a myth about there being lack of agreement. The climate science community should take all reasonable measures to put this myth to rest.”
ScienceDaily (Oct. 14, 2010) — Sixty-three percent of Americans believe that global warming is happening, but many do not understand why, according to a national study conducted by researchers at Yale University.
The report titled “Americans’ Knowledge of Climate Change” found that only 57 percent know what the greenhouse effect is, only 45 percent of Americans understand that carbon dioxide traps heat from the Earth’s surface, and just 50 percent understand that global warming is caused mostly by human activities. Large majorities incorrectly think that the hole in the ozone layer and aerosol spray cans cause global warming. Meanwhile, 75 percent of Americans have never heard of the related problems of ocean acidification or coral bleaching.
However, many Americans do understand that emissions from cars and trucks and the burning of fossil fuels contribute to global warming and that a transition to renewable energy sources is an important solution.
Americans also recognize their own limited understanding. Only 1 in 10 say that they are “very well-informed” about climate change, and 75 percent say they would like to know more about the issue. Likewise, 75 percent say that schools should teach children about climate change and 68 percent would welcome a national program to teach Americans more about the issue.
“This study demonstrates that Americans need to learn more about the causes, impacts and potential solutions to global warming,” said study director Anthony Leiserowitz of Yale University. “But it also shows that Americans want to learn more about climate change in order to make up their minds and take action.”
The online survey was conducted by Knowledge Networks from June 24 to July 22, 2010, with 2,030 American adults 18 and older. The margin of sampling error is plus or minus 2 percentage points, at 95 percent confidence.
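The reported margin fits the standard formula for a simple random sample, z·√(p(1−p)/n), evaluated at the worst case p = 0.5 with z = 1.96 for 95 percent confidence. A minimal sketch (the function name is illustrative, not from the study):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a confidence interval for a sample proportion.

    n: sample size; p: assumed proportion (0.5 is the worst case);
    z: critical value (1.96 for 95 percent confidence).
    """
    return z * math.sqrt(p * (1 - p) / n)

# For the survey's 2,030 respondents, the worst-case margin is about
# 2.2 percentage points, consistent with the reported figure.
moe = margin_of_error(2030)
print(f"{moe:.1%}")  # → 2.2%
```

Pollsters conventionally quote this worst-case value, since p = 0.5 maximizes p(1−p) and so gives the widest interval any single question could require.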
ScienceDaily (Mar. 27, 2008) — The more you know the less you care — at least that seems to be the case with global warming. A telephone survey of 1,093 Americans by two Texas A&M University political scientists and a former colleague indicates that trend, as explained in their recent article in the peer-reviewed journal Risk Analysis.
“More informed respondents both feel less personally responsible for global warming, and also show less concern for global warming,” states the article, titled “Personal Efficacy, the Information Environment, and Attitudes toward Global Warming and Climate Change in the USA.”
The study showed high levels of confidence in scientists among Americans led to a decreased sense of responsibility for global warming.
The diminished concern and sense of responsibility flies in the face of awareness campaigns about climate change, such as the movies "An Inconvenient Truth" and "Ice Age: The Meltdown" and the mainstream media's escalating emphasis on the trend.
The research was conducted by Paul M. Kellstedt, a political science associate professor at Texas A&M; Arnold Vedlitz, Bob Bullock Chair in Government and Public Policy at Texas A&M’s George Bush School of Government and Public Service; and Sammy Zahran, formerly of Texas A&M and now an assistant professor of sociology at Colorado State University.
Kellstedt says the findings were a bit unexpected. The focus of the study, he says, was not to measure how informed or how uninformed Americans are about global warming, but to understand why some individuals who are more or less informed about it showed more or less concern.
“In that sense, we didn’t really have expectations about how aware or unaware people were of global warming,” he says.
But, he adds, “The findings that the more informed respondents were less concerned about global warming, and that they felt less personally responsible for it, did surprise us. We expected just the opposite.
“The findings, while rather modest in magnitude — there are other variables we measured which had much larger effects on concern for global warming — were statistically quite robust, which is to say that they continued to appear regardless of how we modeled the data.”
Measuring knowledge about global warming is a tricky business, Kellstedt adds.
“That’s true of many other things we would like to measure in surveys, of course, especially things that might embarrass people (like ignorance) or that they might feel social pressure to avoid revealing (like prejudice),” he says.
“There are no industry standards, so to speak, for measuring knowledge about global warming. We opted for this straightforward measure and realize that other measures might produce different results.”
Now, for better or worse, scientists have to deal with the public’s abundant confidence in them. “But it cannot be comforting to the researchers in the scientific community that the more trust people have in them as scientists, the less concerned they are about their findings,” the researchers conclude in their study.
ScienceDaily (Mar. 26, 2008) — British Prime Minister Gordon Brown recently declared climate change a top international threat, and Al Gore urged politicians to get involved to fight global warming. Results from a recent survey conducted by a University of Missouri professor reveal that the U.S. public, while aware of the deteriorating global environment, is concerned predominantly with local and national environmental issues.
Potomac River near Washington DC. The top three issues that the US public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. (Credit: Michele Hogan)
“The survey’s core result is that people care about their communities and express the desire to see government action taken toward local and national issues,” said David Konisky, a policy research scholar with the Institute of Public Policy. “People are hesitant to support efforts concerning global issues even though they believe that environmental quality is poorer at the global level than at the local and national level. This is surprising given the media attention that global warming has recently received and reflects the division of opinion about the severity of climate change.”
Konisky, an assistant professor in the Truman School of Public Affairs at MU, recently surveyed 1,000 adults concerning their attitudes about the environment. The survey polled respondents about their levels of concern for the environment and preferences for government action to address a wide set of environmental issues.
A strong majority of the public expressed general concern about the environment. According to the survey, the top three issues that the public wants the government to address are protecting community drinking water, reducing pollution of U.S. rivers and lakes, and improving urban air pollution issues like smog. In the survey, global warming ranks eighth in importance.
“Americans are clearly most concerned about pollution issues that might affect their personal health, or the health of their families,” Konisky said.
Additionally, Konisky and his colleagues found that the best predictor of individuals’ environmental preferences is their political attributes. They examined the relationship between party identification and political ideology and support for action to address environmental problems.
“The survey reinforced the stark differences in people’s environmental attitudes, depending on their political leanings,” Konisky said. “Democrats and political liberals clearly express more desire for governmental action to address environmental problems. Republicans and ideological conservatives are much less enthusiastic about further government intervention.”
Results from the survey were recently presented at the annual meeting of the Western Political Science Association in San Diego.
ScienceDaily (May 8, 2012) — Americans’ support for government action on global warming remains high but has dropped during the past two years, according to a new survey by Stanford researchers in collaboration with Ipsos Public Affairs. Political rhetoric and cooler-than-average weather appear to have influenced the shift, but economics doesn’t appear to have played a role.
The survey directed by Jon Krosnick, a senior fellow at the Stanford Woods Institute for the Environment, shows that support for a range of policies intended to reduce future climate change dropped by an average of 5 percentage points per year between 2010 and 2012.
In a 2010 Stanford survey, more than three-quarters of respondents expressed support for mandating more efficient and less polluting cars, appliances, homes, offices and power plants. Nearly 90 percent of respondents favored federal tax breaks to spur companies to produce more electricity from water, wind and solar energy. On average, 72 percent of respondents supported government action on climate change in 2010. By 2012, that support had dropped to 62 percent.
The drop was concentrated among Americans who distrust climate scientists, even more so among such people who identify themselves as Republicans. Americans who do not trust climate science were especially aware of and influenced by recent shifts in world temperature, and 2011 was tied for the coolest of the last 11 years.
Krosnick pointed out that during the recent campaign, all but one Republican presidential candidate expressed doubt about global warming, and some urged no government action to address the issue. Rick Santorum described belief in climate change as a “pseudo-religion,” while Ron Paul called it a “hoax.” Mitt Romney, the apparent Republican nominee, has said, “I can tell you the right course for America with regard to energy policy is to focus on job creation and not global warming.”
The Stanford-Ipsos study found no evidence that the decline in public support for government action was concentrated among respondents who lived in states struggling the most economically.
The study found that, overall, the majority of Americans continue to support many specific government actions to mitigate global warming’s effect. However, most Americans remain opposed to consumer taxes intended to decrease public use of electricity and gasoline.
Author Uncovers DNA Links Between Members of Tribe
Montage: Kurt Hoffman
By Jon Entine
Published May 04, 2012, issue of May 11, 2012.
In his new book, “Legacy: A Genetic History of the Jewish People,” Harry Ostrer, a medical geneticist and professor at Albert Einstein College of Medicine in New York, claims that Jews are different, and the differences are not just skin deep. Jews exhibit, he writes, a distinctive genetic signature. Considering that the Nazis tried to exterminate Jews based on their supposed racial distinctiveness, such a conclusion might be a cause for concern. But Ostrer sees it as central to Jewish identity.
“Who is a Jew?” has been a poignant question for Jews throughout our history. It evokes a complex tapestry of Jewish identity made up of different strains of religious beliefs, cultural practices and blood ties to ancient Palestine and modern Israel. But the question, with its echoes of genetic determinism, also has a dark side.
Geneticists have long been aware that certain diseases, from breast cancer to Tay-Sachs, disproportionately affect Jews. Ostrer, who is also director of genetic and genomic testing at Montefiore Medical Center, goes further, maintaining that Jews are a homogeneous group with all the scientific trappings of what we used to call a “race.”
For most of the 3,000-year history of the Jewish people, the notion of what came to be known as “Jewish exceptionalism” was hardly controversial. Because of our history of inmarriage and cultural isolation, imposed or self-selected, Jews were considered by gentiles (and usually referred to themselves) as a “race.” Scholars from Josephus to Disraeli proudly proclaimed their membership in “the tribe.”
Legacy: A Genetic History of the Jewish People
By Harry Ostrer
Oxford University Press, 288 Pages, $24.95
Ostrer explains how this concept took on special meaning in the 20th century, as genetics emerged as a viable scientific enterprise. Jewish distinctiveness might actually be measurable empirically. In “Legacy,” he first introduces us to Maurice Fishberg, an upwardly mobile Russian-Jewish immigrant to New York at the fin de siècle. Fishberg fervently embraced the anthropological fashion of the era, measuring skull sizes to explain why Jews seemed to be afflicted with more diseases than other groups — what he called the “peculiarities of the comparative pathology of the Jews.” It turns out that Fishberg and his contemporary phrenologists were wrong: Skull shape provides limited information about human differences. But his studies ushered in a century of research linking Jews to genetics.
Ostrer divides his book into six chapters representing the various aspects of Jewishness: Looking Jewish, Founders, Genealogies, Tribes, Traits and Identity. Each chapter features a prominent scientist or historical figure who dramatically advanced our understanding of Jewishness. The snippets of biography lighten a dense forest of sometimes-obscure science. The narrative, which consists of a lot of potboiler history, is a slog at times. But for the specialist and anyone touched by the enduring debate over Jewish identity, this book is indispensable.
“Legacy” may cause its readers discomfort. To some Jews, the notion of a genetically related people is an embarrassing remnant of early Zionism that came into vogue at the height of the Western obsession with race, in the late 19th century. Celebrating blood ancestry is divisive, they claim: The authors of “The Bell Curve” were vilified 15 years ago for suggesting that genes play a major role in IQ differences among racial groups.
Furthermore, sociologists and cultural anthropologists, a disproportionate number of whom are Jewish, ridicule the term “race,” claiming there are no meaningful differences between ethnic groups. For Jews, the word still carries the especially odious historical association with Nazism and the Nuremberg Laws. They argue that Judaism has morphed from a tribal cult into a worldwide religion enhanced by thousands of years of cultural traditions.
Is Judaism a people or a religion? Or both? The belief that Jews may be psychologically or physically distinct remains a controversial fixture in the gentile and Jewish consciousness, and Ostrer places himself directly in the line of fire. Yes, he writes, the term “race” carries nefarious associations of inferiority and ranking of people. Anything that marks Jews as essentially different runs the risk of stirring either anti- or philo-Semitism. But that doesn’t mean we can ignore the factual reality of what he calls the “biological basis of Jewishness” and “Jewish genetics.” Acknowledging the distinctiveness of Jews is “fraught with peril,” but we must grapple with the hard evidence of “human differences” if we seek to understand the new age of genetics.
Although he readily acknowledges the formative role of culture and environment, Ostrer believes that Jewish identity has multiple threads, including DNA. He offers a cogent, scientifically based review of the evidence, which serves as a model of scientific restraint.
“On the one hand, the study of Jewish genetics might be viewed as an elitist effort, promoting a certain genetic view of Jewish superiority,” he writes. “On the other, it might provide fodder for anti-Semitism by providing evidence of a genetic basis for undesirable traits that are present among some Jews. These issues will newly challenge the liberal view that humans are created equal but with genetic liabilities.”
Jews, he notes, are one of the most distinctive population groups in the world because of our history of endogamy. Jews — Ashkenazim in particular — are relatively homogeneous despite the fact that they are spread throughout Europe and have since immigrated to the Americas and back to Israel. The Inquisition shattered Sephardi Jewry, leading to far more instances of intermarriage and to a less distinctive DNA.
In traversing this minefield of the genetics of human differences, Ostrer bolsters his analysis with volumes of genetic data, which are both the book’s greatest strength and its weakness. Two complementary books on this subject — my own “Abraham’s Children: Race, Identity, and the DNA of the Chosen People” and “Jacob’s Legacy: A Genetic View of Jewish History” by Duke University geneticist David Goldstein, who is well quoted in both “Abraham’s Children” and “Legacy” — are more narrative driven, weaving history and genetics, and are consequently much more congenial reads.
The concept of the “Jewish people” remains controversial. The Law of Return, which establishes the right of Jews to come to Israel, is a central tenet of Zionism and a founding legal principle of the State of Israel. The DNA that tightly links Ashkenazi, Sephardi and Mizrahi, three prominent culturally and geographically distinct Jewish groups, could be used to support Zionist territorial claims — except, as Ostrer points out, some of the same markers can be found in Palestinians, our distant genetic cousins, as well. Palestinians, understandably, want their own right of return.
That disagreement over the meaning of DNA also pits Jewish traditionalists against a particular strain of secular Jewish liberals that has joined with Arabs and many non-Jews to argue for an end to Israel as a Jewish nation. Their hero is Shlomo Sand, an Austrian-born Israeli historian who reignited this complex controversy with the 2008 publication of “The Invention of the Jewish People.”
Sand contends that Zionists who claim an ancestral link to ancient Palestine are manipulating history. But he has taken his thesis from novelist Arthur Koestler’s 1976 book, “The Thirteenth Tribe,” which was part of an attempt by post-World War II Jewish liberals to reconfigure Jews not as a biological group, but as a religious ideology and ethnic identity.
The majority of the Ashkenazi Jewish population, as Koestler, and now Sand, writes, are not the children of Abraham but descendants of pagan Eastern Europeans and Eurasians, concentrated mostly in the ancient Kingdom of Khazaria in what is now Ukraine and Western Russia. The Khazarian nobility converted during the early Middle Ages, when European Jewry was forming.
Although scholars challenged Koestler’s and now Sand’s selective manipulation of the facts — the conversion was almost certainly limited to the tiny ruling class and not to the vast pagan population — the historical record has been just fragmentary enough to titillate determined critics of Israel, who turned both Koestler’s and Sand’s books into roaring best-sellers.
Fortunately, re-creating history now depends not only on pottery shards, flaking manuscripts and faded coins, but on something far less ambiguous: DNA. Ostrer’s book is an impressive counterpoint to the dubious historical methodology of Sand and his admirers. And, as a co-founder of the Jewish HapMap — the study of haplotypes, or blocks of genetic markers, that are common to Jews around the world — he is well positioned to write the definitive response.
In accord with most geneticists, Ostrer firmly rejects the fashionable postmodernist dismissal of the concept of race as genetically naive, opting for a more nuanced perspective.
When the human genome was first mapped a decade ago, Francis Collins, then head of the National Human Genome Research Institute, said: "Americans, regardless of ethnic group, are 99.9% genetically identical." Added J. Craig Venter, who at the time was chief scientist at the private firm that helped sequence the genome, Celera Genomics, "Race has no genetic or scientific basis." Those declarations appeared to suggest that "race," or the notion of distinct but overlapping genetic groups, is "meaningless."
But Collins and Venter have issued clarifications of their much-misrepresented comments. Almost every minority group has faced, at one time or another, being branded as racially inferior based on a superficial understanding of how genes peculiar to its population work. The inclination by politicians, educators and even some scientists to underplay our separateness is certainly understandable. But it’s also misleading. DNA ensures that we differ not only as individuals, but also as groups.
However slight the differences (and geneticists now believe that they are significantly greater than 0.1%), they are defining. That 0.1% contains some 3 million nucleotide pairs in the human genome, and these determine such things as skin or hair color and susceptibility to certain diseases. They contain the map of our family trees back to the first modern humans.
Both the human genome project and disease research rest on the premise of finding distinguishable differences between individuals and often among populations. Scientists have ditched the term "race," with all its normative baggage, and adopted more neutral terms, such as "population" and "cline," which carry much of the same meaning. Boiled down to its essence, race equates to "region of ancestral origin."
Ostrer has devoted his career to investigating these extended family trees, which help explain the genetic basis of common and rare disorders. Today, Jews remain identifiable in large measure by the 40 or so diseases we disproportionately carry, the inescapable consequence of inbreeding. He traces the fascinating history of numerous “Jewish diseases,” such as Tay-Sachs, Gaucher, Niemann-Pick, Mucolipidosis IV, as well as breast and ovarian cancer. Indeed, 10 years ago I was diagnosed as carrying one of the three genetic mutations for breast and ovarian cancer that mark my family and me as indelibly Jewish, prompting me to write “Abraham’s Children.”
Like East Asians, the Amish, Icelanders, Aboriginals, the Basque people, African tribes and other groups, Jews have remained isolated for centuries because of geography, religion or cultural practices. It’s stamped on our DNA. As Ostrer explains in fascinating detail, threads of Jewish ancestry link the sizable Jewish communities of North America and Europe to Yemenite and other Middle Eastern Jews who have relocated to Israel, as well as to the black Lemba of southern Africa and to India’s Cochin Jews. But, in a twist, the links include neither the Bene Israel of India nor Ethiopian Jews. Genetic tests show that both groups are converts, contradicting their founding myths.
Why, then, are Jews so different looking, usually sharing the characteristics of the surrounding populations? Think of red-haired Jews, Jews with blue eyes or the black Jews of Africa. Like any cluster — a genetic term Ostrer uses in place of the more inflammatory “race” — Jews throughout history moved around and fooled around, although mixing occurred comparatively infrequently until recent decades. Although there are identifiable gene variations that are common among Jews, we are not a “pure” race. The time machine of our genes may show that most Jews have a shared ancestry that traces back to ancient Palestine but, like all of humanity, Jews are mutts.
About 80% of Jewish males and 50% of Jewish females trace their ancestry back to the Middle East. The rest entered the “Jewish gene pool” through conversion or intermarriage. Those who did intermarry often left the faith in a generation or two, in effect pruning the Jewish genetic tree. But many converts became interwoven into the Jewish genealogical line. Reflect on the iconic convert, the biblical Ruth, who married Boaz and became the great-grandmother of King David. She began as an outsider, but you don’t get much more Jewish than the bloodline of King David!
To his credit, Ostrer also addresses the third rail of discussions about Jewishness and race: the issue of intelligence. Jews were latecomers to the age of freethinking. While the Enlightenment swept through Christian Europe in the 17th century, the Haskalah did not gather strength until the early 19th century. By the beginning of the new millennium, however, Jews were thought of as among the smartest people on earth. The trend is most prominent in America, which has the largest concentration of Jews outside Israel and a history of tolerance.
Although Jews make up less than 3% of the population, they have won more than 25% of the Nobel Prizes awarded to American scientists since 1950. Jews also account for 20% of this country’s chief executives and make up 22% of Ivy League students. Psychologists and educational researchers have pegged their average IQ at 107.5 to 115, with their verbal IQ at more than 120, a stunning standard deviation above the average of 100 found in those of European ancestry. Like it or not, the IQ debate will become an increasingly important issue going forward, as medical geneticists focus on unlocking the mysteries of the brain.
Many liberal Jews maintain, at least in public, that the plethora of Jewish lawyers, doctors and comedians is the product of our cultural heritage, but the science tells a more complex story. Jewish success is a product of Jewish genes as much as of Jewish moms.
Is it “good for the Jews” to be exploring such controversial subjects? We can’t avoid engaging the most challenging questions in the age of genetics. Because of our history of endogamy, Jews are a goldmine for geneticists studying human differences in the quest to cure disease. Because of our cultural commitment to education, Jews are among the top genetic researchers in the world.
As humankind becomes more genetically sophisticated, identity becomes both more fluid and more fixed. Jews in particular can find threads of our ancestry literally anywhere, muddying traditional categories of nationhood, ethnicity, religious belief and “race.” But such discussions, ultimately, are subsumed by the reality of the common shared ancestry of humankind. Ostrer’s “Legacy” points out that — regardless of the pros and cons of being Jewish — we are all, genetically, in it together. And, in doing so, he gets it just right.
Jon Entine is the founder and director of the Genetic Literacy Project at George Mason University, where he is senior research fellow at the Center for Health and Risk Communication. His website is www.jonentine.com.
In the summer of 1816, a young British woman by the name of Mary Godwin and her boyfriend Percy Shelley went to visit Lord Byron in Lake Geneva, Switzerland. They had planned to spend much of the summer outdoors, but the eruption of Mount Tambora in Indonesia the previous year had changed the climate of Europe. The weather was so bad that they spent most of their time indoors, discussing the latest popular writings on science and the supernatural.
After reading a book of German ghost stories, somebody suggested they each write their own. Byron’s physician, John Polidori, came up with the idea for The Vampyre, published in 1819,1 which was the first of the “vampire-as-seducer” novels. Godwin’s story came to her in a dream, during which she saw “the pale student of unhallowed arts kneeling beside the thing he had put together.”2 Soon after that fateful summer, Godwin and Shelley married, and in 1818, Mary Shelley’s horror story was published under the title, Frankenstein, Or, the Modern Prometheus.3
Frankenstein lives on in the popular imagination as a cautionary tale against technology. We use the monster as an all-purpose modifier to denote technological crimes against nature. When we fear genetically modified foods we call them “frankenfoods” and “frankenfish.” It is telling that even as we warn against such hybrids, we confuse the monster with its creator. We now mostly refer to Dr. Frankenstein’s monster as Frankenstein. And just as we have forgotten that Frankenstein was the man, not the monster, we have also forgotten Frankenstein’s real sin.
Dr. Frankenstein’s crime was not that he invented a creature through some combination of hubris and high technology, but rather that he abandoned the creature to itself. When Dr. Frankenstein meets his creation on a glacier in the Alps, the monster claims that it was not born a monster, but that it became a criminal only after being left alone by his horrified creator, who fled the laboratory once the horrible thing twitched to life. “Remember, I am thy creature,” the monster protests, “I ought to be thy Adam; but I am rather the fallen angel, whom thou drivest from joy for no misdeed… I was benevolent and good; misery made me a fiend. Make me happy, and I shall again be virtuous.”
Written at the dawn of the great technological revolutions that would define the 19th and 20th centuries, Frankenstein foresees that the gigantic sins that were to be committed would hide a much greater sin. It is not the case that we have failed to care for Creation, but that we have failed to care for our technological creations. We confuse the monster for its creator and blame our sins against Nature upon our creations. But our sin is not that we created technologies but that we failed to love and care for them. It is as if we decided that we were unable to follow through with the education of our children.4
Let Dr. Frankenstein’s sin serve as a parable for political ecology. At a time when science, technology, and demography make clear that we can never separate ourselves from the nonhuman world — that we, our technologies, and nature can no more be disentangled than we can remember the distinction between Dr. Frankenstein and his monster — this is the moment chosen by millions of well-meaning souls to flagellate themselves for their earlier aspiration to dominion, to repent for their past hubris, to look for ways of diminishing the numbers of their fellow humans, and to swear to make their footprints invisible?
The goal of political ecology must not be to stop innovating, inventing, creating, and intervening. The real goal must be to have the same type of patience and commitment to our creations as God the Creator, Himself. And the comparison is not blasphemous: we have taken the whole of Creation on our shoulders and have become coextensive with the Earth.
What, then, should be the work of political ecology? It is, I believe, to modernize modernization, to borrow an expression proposed by Ulrich Beck.5 This challenge demands more of us than simply embracing technology and innovation. It requires exchanging the modernist notion of modernity for what I have called a “compositionist” one that sees the process of human development as neither liberation from Nature nor as a fall from it, but rather as a process of becoming ever-more attached to, and intimate with, a panoply of nonhuman natures.
1.
At the time of the plough we could only scratch the surface of the soil. Three centuries back, we could only dream, like Cyrano de Bergerac, of traveling to the moon. In the past, my Gallic ancestors were afraid of nothing except that the “sky will fall on their heads.”
Today we can fold ourselves into the molecular machinery of soil bacteria through our sciences and technologies. We run robots on Mars. We photograph and dream of further galaxies. And yet we fear that the climate could destroy us.
Every day in our newspapers we read about more entanglements of all those things that were once imagined to be separable — science, morality, religion, law, technology, finance, and politics. But these things are tangled up together everywhere: in the Intergovernmental Panel on Climate Change, in the space shuttle, and in the Fukushima nuclear power plant.
If you envision a future in which there will be less and less of these entanglements thanks to Science, capital S, you are a modernist. But if you brace yourself for a future in which there will always be more of these imbroglios, mixing many more heterogeneous actors, at a greater and greater scale and at an ever-tinier level of intimacy requiring even more detailed care, then you are… what? A compositionist!
The dominant, peculiar story of modernity is of humankind’s emancipation from Nature. Modernity is the thrusting-forward arrow of time — Progress — characterized by its juvenile enthusiasm, risk taking, frontier spirit, optimism, and indifference to the past. The spirit can be summarized in a single sentence: “Tomorrow, we will be able to separate more accurately what the world is really like from the subjective illusions we used to entertain about it.”
The very forward movement of the arrow of time and the frontier spirit associated with it (the modernizing front) is due to a certain conception of knowledge: “Tomorrow, we will be able to differentiate clearly what in the past was still mixed up, namely facts and values, thanks to Science.”
Science is the shibboleth that defines the right direction of the arrow of time because it, and only it, is able to cut into two well-separated parts what had, in the past, remained hopelessly confused: a morass of ideology, emotions, and values on the one hand, and, on the other, stark and naked matters of fact.
The notion of the past as an archaic and dangerous confusion arises directly from giving Science this role. A modernist, in this great narrative, is the one who expects from Science the revelation that Nature will finally be visible through the veils of subjectivity — and subjection — that hid it from our ancestors.
And here has been the great failure of political ecology. Just when all of the human and nonhuman associations are finally coming to the center of our consciousness, when science and nature and technology and politics become so confused and mixed up as to be impossible to untangle, just as these associations are beginning to be shaped in our political arenas and are triggering our most personal and deepest emotions, this is when a new apartheid is declared: leave Nature alone and let the humans retreat — as the English did on the beaches of Dunkirk in 1940.
Just at the moment when this fabulous dissonance inherent in the modernist project between what modernists say (emancipation from all attachments!) and what they do (create ever-more attachments!) is becoming apparent to all, along come those alleging to speak for Nature to say the problem lies in the violations and imbroglios — the attachments!
Instead of deciding that the great narrative of modernism (Emancipation) has always resulted in another history altogether (Attachments), the spirit of the age has interpreted the dissonance in quasi-apocalyptic terms: “We were wrong all along, let’s turn our back to progress, limit ourselves, and return to our narrow human confines, leaving the nonhumans alone in as pristine a Nature as possible, mea culpa, mea maxima culpa…”
Nature, this great shortcut of due political process, is now used to forbid humans to encroach. Instead of realizing at last that the emancipation narrative is bunk, and that modernism was always about attachments, modernist greens have suddenly shifted gears and have begun to oppose the promises of modernization.
Why do we feel so frightened at the moment that our dreams of modernization finally come true? Why do we suddenly turn pale and wish to fall back on the other side of Hercules’s columns, thinking we are being punished for having transgressed the sign: “Thou shall not transgress?” Was not our slogan until now, as Nordhaus and Shellenberger note in Break Through, “We shall overcome!”?6
In the name of indisputable facts portraying a bleak future for the human race, green politics has succeeded in leaving citizens nothing but a gloomy asceticism, a terror of trespassing Nature, and a diffidence toward industry, innovation, technology, and science. No wonder that, while political ecology claims to embody the political power of the future, it is reduced everywhere to a tiny portion of electoral strap-hangers. Even in countries where political ecology is a little more powerful, it contributes only a supporting force.
Political ecology has remained marginal because it has not grasped either its own politics or its own ecology. It thinks it is speaking of Nature, System, a hierarchical totality, a world without man, an assured Science, but it is precisely these overly ordered pronouncements that marginalize it.
Set in contrast to the modernist narrative, this idea of political ecology could not possibly succeed. There is beauty and strength in the modernist story of emancipation. Its picture of the future is so attractive, especially when put against such a repellent past, that it makes one wish to run forward to break all the shackles of ancient existence.
To succeed, an ecological politics must manage to be at least as powerful as the modernizing story of emancipation without imagining that we are emancipating ourselves from Nature. What the emancipation narrative points to as proof of increasing human mastery over and freedom from Nature — agriculture, fossil energy, technology — can be redescribed as the increasing attachments between things and people at an ever-expanding scale. If the older narratives imagined humans either fell from Nature or freed themselves from it, the compositionist narrative describes our ever-increasing degree of intimacy with the new natures we are constantly creating. Only “out of Nature” may ecological politics start again and anew.
2.
The paradox of “the environment” is that it emerged in public parlance just when it was starting to disappear. During the heyday of modernism, no one seemed to care about “the environment” because there existed a huge unknown reserve on which to discharge all bad consequences of collective modernizing actions. The environment is what appeared when unwanted consequences came back to haunt the originators of such actions.
But if the originators are true modernists, they will see the return of “the environment” as incomprehensible since they believed they were finally free of it. The return of consequences, like global warming, is taken as a contradiction, or even as a monstrosity, which it is, of course, but only according to the modernist’s narrative of emancipation. In the compositionist’s narrative of attachments, unintended consequences are quite normal — indeed, the most expected things on earth!
Environmentalists, in the American sense of the word, never managed to extract themselves from the contradiction that the environment is precisely not “what lies beyond and should be left alone” — this was the contrary, the view of their worst enemies! The environment is exactly what should be even more managed, taken up, cared for, stewarded, in brief, integrated and internalized in the very fabric of the polity.
France, for its part, has never believed in the notion of a pristine Nature that has so confused the “defense of the environment” in other countries. What we call a “national park” is a rural ecosystem complete with post offices, well-tended roads, highly subsidized cows, and handsome villages.
Those who wish to protect natural ecosystems learn, to their stupefaction, that they have to work harder and harder — that is, to intervene even more, at always greater levels of detail, with ever more subtle care — to keep them “natural enough” for Nature-intoxicated tourists to remain happy.
Like France’s parks, all of Nature needs our constant care, our undivided attention, our costly instruments, our hundreds of thousands of scientists, our huge institutions, our careful funding. But though we have Nature, and we have nurture, we don’t know what it would mean for Nature itself to be nurtured.7
The word “environmentalism” thus designates this turning point in history when the unwanted consequences are suddenly considered to be such a monstrosity that the only logical step appears to be to abstain and repent: “We should not have committed so many crimes; now we should be good and limit ourselves.” Or at least this is what people felt and thought before the breakthrough, at the time when there was still an “environment.”
But what is the breakthrough itself then? If I am right, the breakthrough involves no longer seeing a contradiction between the spirit of emancipation and its catastrophic outcomes, but accepting it as the normal duty of continuing to care for unwanted consequences, even if this means going further and further down into the imbroglios. Environmentalists say: “From now on we should limit ourselves.” Postenvironmentalists exclaim: “From now on, we should stop flagellating ourselves and take up explicitly and seriously what we have been doing all along at an ever-increasing scale, namely, intervening, acting, wanting, caring.” For environmentalists, the return of unexpected consequences appears as a scandal (which it is for the modernist myth of mastery). For postenvironmentalists, the other, unintended consequences are part and parcel of any action.
3.
One way to seize upon the breakthrough from environmentalism to postenvironmentalism is to reshape the very definition of the “precautionary principle.” This strange moral, legal, epistemological monster has appeared in European and especially French politics after many scandals due to the misplaced belief by state authority in the certainties provided by Science.8
When action is supposed to be nothing but the logical consequence of reason and facts (which the French, of all people, still believe), it is quite normal to wait for the certainty of science before administrators and politicians spring to action. The problem begins when experts fail to agree on the reasons and facts that have been taken as the necessary premises of any action. Then the machinery of decision is stuck until experts come to an agreement. It was in such a situation that the great tainted blood catastrophe of the 1980s ensued: before agreement was produced, hundreds of patients were transfused with blood contaminated by the AIDS virus.9
The precautionary principle was introduced to break this odd connection between scientific certainty and political action, stating that even in the absence of certainty, decisions could be made. But of course, as soon as it was introduced, fierce debates began on its meaning. Is it an environmentalist notion that precludes action or a postenvironmentalist notion that finally follows action through to its consequences?
Not surprisingly, the enemies of the precautionary principle — which President Chirac enshrined in the French Constitution as if the French, having indulged so much in rationalism, had to be protected against it by the highest legal pronouncements — took it as proof that no action was possible any more. As good modernists, they claimed that if you had to take so many precautions in advance, to anticipate so many risks, to include the unexpected consequences even before they arrived, and worse, to be responsible for them, then it was a plea for impotence, despondency, and despair. The only way to innovate, they claimed, is to bounce forward, blissfully ignorant of the consequences or at least unconcerned by what lies outside your range of action. Their opponents largely agreed. Modernist environmentalists argued that the principle of precaution dictated no action, no new technology, no intervention unless it could be proven with certainty that no harm would result. Modernists we were, modernists we shall be!
But for its postenvironmental supporters (of which I am one) the principle of precaution, properly understood, is exactly the change of zeitgeist needed: not a principle of abstention — as many have come to see it — but a change in the way any action is considered, a deep tidal change in the linkage modernism established between science and politics. From now on, thanks to this principle, unexpected consequences are attached to their initiators and have to be followed through all the way.
4.
The link between technology and theology hinges on the notion of mastery. Descartes exclaimed that we should be “maîtres et possesseurs de la nature.”10
But what does it mean to be a master? In the modernist narrative, mastery was supposed to require such total dominance by the master that he was emancipated entirely from any care and worry. This is the myth about mastery that was used to describe the technical, scientific, and economic dominion of Man over Nature.
But if you think about it according to the compositionist narrative, this myth is quite odd: where have we ever seen a master freed from any dependence on his dependents? The Christian God, at least, is not a master who is freed from dependents, but who, on the contrary, gets folded into, involved with, implicated with, and incarnated into His Creation. God is so attached and dependent upon His Creation that he is continually forced (convinced? willing?) to save it. Once again, the sin is not to wish to have dominion over Nature, but to believe that this dominion means emancipation and not attachment.
If God has not abandoned His Creation and has sent His Son to redeem it, why do you, a human, a creature, believe that you can invent, innovate, and proliferate — and then flee away in horror from what you have committed? Oh, you the hypocrite who confesses of one sin to hide a much graver, mortal one! Has God fled in horror after what humans made of His Creation? Then have at least the same forbearance that He has.
The dream of emancipation has not turned into a nightmare. It was simply too limited: it excluded nonhumans. It did not care about unexpected consequences; it was unable to follow through with its responsibilities; it entertained a wholly unrealistic notion of what science and technology had to offer; it relied on a rather impious definition of God, and a totally absurd notion of what creation, innovation, and mastery could provide.
Which God and which Creation should we be for, knowing that, contrary to Dr. Frankenstein, we cannot suddenly stop being involved and “go home?” Incarnated we are, incarnated we will be. In spite of a centuries-old misdirected metaphor, we should, without any blasphemy, reverse the Scripture and exclaim: “What good is it for a man to gain his soul yet forfeit the whole world?” /
1. Polidori, John, et al. 1819. The Vampyre: A Tale. Printed for Sherwood, Neely, and Jones.
2. Shelley, Mary W., 1823. Frankenstein: Or, The Modern Prometheus. Printed for G. and W.B. Whittaker.
3. Ibid.
4. This is also the theme of: Latour, Bruno. 1996. Aramis or the Love of Technology. Translated by Catherine Porter. Cambridge, Mass: Harvard University Press.
5. Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. London: Sage.
6. Nordhaus, Ted, and Michael Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston: Houghton Mifflin Harcourt.
7. Descola, Philippe. 2005. Par-delà nature et culture. Paris: Gallimard.
8. Sadeleer, Nicolas de, 2006. Implementing the Precautionary Principle: Approaches from Nordic Countries and the EU. Earthscan Publ. Ltd.
9. Hermitte, Marie-Angèle. 1996. Le sang et le droit: essai sur la transfusion sanguine. Paris: Le Seuil.
10. Descartes, René. 1637. Discourse on Method in Discourse on Method and Related Writings. Translated by Desmond M. Clarke. 1999. Part 6, 44. New York: Penguin.
It’s a national embarrassment. It has resulted in large unnecessary costs for the U.S. economy and needless endangerment of our citizens. And it shouldn’t be occurring.
What am I talking about? The third rate status of numerical weather prediction in the U.S. It is a huge story, an important story, but one the media has not touched, probably from lack of familiarity with a highly technical subject. And the truth has been buried or unavailable to those not intimately involved in the U.S. weather prediction enterprise. This is an issue I have mentioned briefly in previous blogs, and one many of you have asked to learn more about. It’s time to discuss it.
Weather forecasting today is dependent on numerical weather prediction, the numerical solution of the equations that describe the atmosphere. The technology of weather prediction has improved dramatically during the past decades as faster computers, better models, and much more data (mainly satellites) have become available.
Supercomputers are used for numerical weather prediction.
U.S. numerical weather prediction has fallen to third or fourth place worldwide, with the clear leader in global numerical weather prediction (NWP) being the European Center for Medium Range Weather Forecasting (ECMWF). And we have also fallen behind in ensembles (using many models to give probabilistic prediction) and high-resolution operational forecasting. We used to be the world leader decades ago in numerical weather prediction: NWP began and was perfected here in the U.S. Ironically, we have the largest weather research community in the world and the largest collection of universities doing cutting-edge NWP research (like the University of Washington!). Something is very, very wrong and I will talk about some of the issues here. And our nation needs to fix it.
But to understand the problem, you have to understand the competition and the players. And let me apologize upfront for the acronyms.
In the U.S., numerical weather prediction mainly takes place at the National Weather Service’s Environmental Modeling Center (EMC), a part of NCEP (National Centers for Environmental Prediction). They run a global model (GFS) and regional models (e.g., NAM).
The Europeans banded together decades ago to form the European Center for Medium-Range Weather Forecasting (ECMWF), which runs a very good global model. Several European countries run regional models as well.
The United Kingdom Met Office (UKMET) runs an excellent global model and regional models. So does the Canadian Meteorological Center (CMC).
There are other major global NWP centers such as the Japanese Meteorological Agency (JMA), the U.S. Navy (FNMOC), the Australian center, one in Beijing, among others. All of these centers collect worldwide data and do global NWP.
The problem is that both objective and subjective comparisons indicate that the U.S. global model is number 3 or number 4 in quality, resulting in our forecasts being noticeably inferior to the competition. Let me show you a rather technical graph (produced by the NWS) that illustrates this. This figure shows the quality of the 500 hPa forecast (about halfway up in the troposphere, approximately 18,000 ft) for the day 5 forecast. The top graph is a measure of forecast skill (closer to 1 is better) from 1996 to 2012 for several models (U.S. GFS: black; ECMWF: red; Canadian CMC: blue; UKMET: green; Navy FNG: orange). The bottom graph shows the difference between the U.S. and the other nations’ model skill.
You first notice that forecasts are all getting better. That’s good. But you will notice that the most skillful forecast (closest to one) is clearly the red one…the European Center. The second best is the UKMET office. The U.S. (GFS model) is third…roughly tied with the Canadians.
Here is a global model comparison done by the Canadian Meteorological Center for various global models from 2009-2012 for the 120 h forecast. This is a plot of error (RMSE, root mean square error), again for 500 hPa, and only for North America. Guess who is best again (lowest error)? The European Center (green circle). UKMET is next best, and the U.S. (NCEP, blue triangle) is back in the pack.
Let's look at short-term errors. Here is a plot from a paper by Garrett Wedam, Lynn McMurdie, and myself comparing various models at 24, 48, and 72 hr for sea level pressure along the West Coast. A bigger bar means more error. Guess who has the lowest errors by far? You guessed it: ECMWF.
I could show you a hundred of these plots, but the answers are very consistent. ECMWF is the worldwide gold standard in global prediction, with the British (UKMET) second. We are third or fourth (with the Canadians). One way to describe this is that the ECMWF model is not only better at the short range, but has about one day of additional predictability: their 8 day forecast is about as skillful as our 7 day forecast. Another way to look at it is that, given the current upward trend in skill, they are 5-7 years ahead of the U.S.
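For readers curious about what these verification scores actually measure, here is a minimal sketch in Python (with NumPy) of the two metrics used in the plots above: the anomaly correlation coefficient ("closer to 1 is better") and root mean square error. The grid size and noise magnitudes are invented purely for illustration; operational verification uses global analysis fields, not toy arrays.

```python
import numpy as np

def anomaly_correlation(forecast, analysis, climatology):
    """Anomaly correlation coefficient (ACC): correlate the forecast's
    departure from climatology with the verifying analysis's departure.
    A perfect forecast scores 1.0."""
    fa = forecast - climatology          # forecast anomaly
    va = analysis - climatology          # verifying-analysis anomaly
    return np.sum(fa * va) / np.sqrt(np.sum(fa**2) * np.sum(va**2))

def rmse(forecast, analysis):
    """Root mean square error: lower is better."""
    return np.sqrt(np.mean((forecast - analysis) ** 2))

# Synthetic 500 hPa height fields (meters) on a toy 10x10 grid.
rng = np.random.default_rng(42)
climatology = 5500.0 + rng.normal(0.0, 50.0, size=(10, 10))
analysis = climatology + rng.normal(0.0, 30.0, size=(10, 10))  # the "truth"
forecast = analysis + rng.normal(0.0, 10.0, size=(10, 10))     # small forecast error

print(f"ACC:  {anomaly_correlation(forecast, analysis, climatology):.3f}")
print(f"RMSE: {rmse(forecast, analysis):.1f} m")
```

With the error magnitudes chosen here the ACC comes out close to 1; inflating the forecast-error noise pushes it down, which is exactly the behavior of the day 5 skill curves in the figures.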
Most forecasters understand the frequent superiority of the ECMWF model. If you read the NWS forecast discussion, which is available online, you will frequently read how they often depend not on the U.S. model, but the ECMWF. And during the January western WA snowstorm, it was the ECMWF model that first indicated the correct solution. Recently, I talked to the CEO of a weather/climate related firm that was moving up to Seattle. I asked them what model they were using: the U.S. GFS? He laughed, of course not…they were using the ECMWF.
A lot of U.S. firms are using the ECMWF, and this is very costly, because the Europeans charge a lot for access to their gridded forecasts (hundreds of thousands of dollars per year). Can you imagine how many millions of dollars are being spent by U.S. companies to secure ECMWF predictions? But the cost of the inferior NWS forecasts is far greater than that, because many users cannot afford the ECMWF grids, and the NWS uses its global predictions to drive the higher-resolution regional models, which are NOT duplicated by the Europeans. All of U.S. NWP is dragged down by these second-rate forecasts, and the costs to the nation have to be huge, since so much of our economy is weather sensitive. Inferior NWP must be costing billions of dollars, perhaps many billions.
The question all of you must be wondering is why this bad situation exists. How did the most technologically advanced country in the world, with the largest atmospheric sciences community, end up with third-rate global weather forecasts? I believe I can tell you…in fact, I have been working on this issue for several decades (with little to show for it). Some reasons:
1. The U.S. has inadequate computer power available for numerical weather prediction. The ECMWF is running models with substantially higher resolution than ours because they have more resources available for NWP. This is simply ridiculous: the U.S. can afford the processors and disk space it would take. We are talking about millions or tens of millions of dollars at most to have the hardware we need. Part of the problem has been NWS procurement, which is not forward-leaning, relying on heavy-metal IBM machines at very high cost.
2. The U.S. has used inferior data assimilation. A key aspect of NWP is to assimilate the observations to create a good description of the atmosphere. The European Center, the UKMET Office, and the Canadians use 4DVAR, an advanced approach that requires lots of computer power. We use an older, inferior approach (3DVAR). The Europeans have been using 4DVAR for 20 years! Right now, the U.S. is working on another advanced approach (ensemble-based data assimilation), but it is not operational yet.
3. The NWS numerical weather prediction effort has been isolated and has not taken advantage of the research community. NCEP's Environmental Modeling Center (EMC) is well known for its isolation and "not invented here" attitude. While the European Center has lots of visitors and workshops, such things are a rarity at EMC. Interactions with the university community have been limited, and EMC has been reluctant to use the models and approaches developed by the U.S. research community. (True story: some of the advances in probabilistic weather prediction at the UW have been adopted by the Canadians, while the NWS had little interest.) The National Weather Service has invested very little in extramural research, and when their budget is under pressure, university research is the first thing they reduce. And the U.S. NWP center has been housed in a decaying building outside of D.C., one too small for its needs as well. (Good news: a new building should be available soon.)
4. The NWS approach to weather-related research has been ineffective and divided. Government weather research resides NOT in the NWS, but elsewhere in NOAA. Thus, the head of the NWS and his leadership team do not have authority over the folks doing research in support of his mission. This has been an extraordinarily ineffective and wasteful system, with the NOAA research teams doing work that often has marginal benefit for the NWS.
5. Lack of leadership. This is the key issue. The folks in NCEP, NWS, and NOAA leadership have been willing to accept third-class status, providing lots of excuses, but not making the fundamental changes in organization and priority that could deal with the problem. Lack of resources for NWP is another issue…but that is a decision made by NOAA/NWS/Dept of Commerce leadership.
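To make point 2 above concrete, here is a toy Python sketch of the variational data assimilation idea: blend a model "background" state with observations, weighting each by its error covariance. Every number, the three-point grid, and the diagonal covariances are invented for illustration; real 3DVAR systems minimize the same cost function iteratively for hundreds of millions of variables, and 4DVAR additionally fits the model trajectory to observations spread across a time window.

```python
import numpy as np

# Background (prior model state): 500 hPa heights at three grid points.
x_b = np.array([5640.0, 5560.0, 5480.0])
B = np.diag([900.0, 900.0, 900.0])   # background error covariance (30 m std dev)

# We observe only the first and third grid points, with 10 m errors.
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])       # observation operator
y = np.array([5610.0, 5500.0])        # observations
R = np.diag([100.0, 100.0])           # observation error covariance

# The minimizer of the 3DVAR cost function
#   J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - Hx)^T R^-1 (y - Hx)
# has the closed form x_a = x_b + K (y - H x_b), K = B H^T (H B H^T + R)^-1.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)
print("analysis:", x_a)  # observed points move 90% of the way toward the obs
```

The analysis trusts the (more accurate) observations 9:1 over the background at observed points, while unobserved points stay at the background value because the toy B has no cross-correlations; real background covariances spread observational information to neighboring variables.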
This note is getting long, so I will wait to talk about the other problems in the NWS weather modeling efforts, such as our very poor ensemble (probabilistic) prediction systems. One could write a paper on this…and I may.
I should stress that I am not alone in saying these things. A blue-ribbon panel did a review of NCEP in 2009 and came to similar conclusions (found here). And these issues are frequently noted at conferences, workshops, and meetings.
Let me note that the above is about the modeling aspects of the NWS, NOT the many people in the local forecast offices. This part of the NWS is first-rate. They suffer from inferior U.S. guidance and fortunately have access to the ECMWF global forecasts. And there are some very good people at NCEP who have lacked the resources and suitable organization necessary to push forward effectively.
This problem at the National Weather Service is not a weather prediction problem alone, but an example of a deeper national malaise. It is related to other U.S. issues, like our inferior K-12 education system. Our nation, having gained world leadership in almost all areas, became smug, self-satisfied, and a bit lazy. We lost the impetus to be the best. We were satisfied to coast. And this attitude must end — in weather prediction, education, and everything else — or we will see our nation sink into mediocrity.
The U.S. can reclaim leadership in weather prediction, but I am not hopeful that things will change quickly without pressure from outside of the NWS. The various weather user communities and our congressional representatives must deliver a strong message to the NWS that enough is enough, that the time for accepting mediocrity is over. And the Weather Service requires the resources to be first rate, something it does not have at this point.
* * *
Saturday, April 7, 2012
Lack of Computer Power Undermines U.S. Numerical Weather Prediction (Revised)
In my last blog on this subject, I provided objective evidence of how U.S. numerical weather prediction (NWP), and particularly our global prediction skill, lags behind major international centers, such as the European Centre for Medium Range Weather Forecasting (ECMWF), the UKMET office, and the Canadian Meteorological Center (CMC). I mentioned briefly how the problem extends to high-resolution weather prediction over the U.S. and the use of ensemble (many model runs) weather prediction, both globally and over the U.S. Our nation is clearly number one in meteorological research and we certainly have the knowledge base to lead the world in numerical weather prediction, but for a number of reasons we are not. The cost of inferior weather prediction is huge: in lives lost, injuries sustained, and economic impacts unmitigated. Truly, a national embarrassment. And one we must change.
In this blog, I will describe in some detail one major roadblock in giving the U.S. state-of-the-art weather prediction: inadequate computer resources. This situation should clearly have been addressed years ago by leadership in the National Weather Service, NOAA, and the Dept of Commerce, but has not, and I am convinced will not without outside pressure. It is time for the user community and our congressional representatives to intervene. To quote Samuel L. Jackson, enough is enough. (…)
In the U.S. we are trying to use fewer computer resources to do more tasks than the global leaders in numerical weather prediction. (Note: U.S. NWP is done by the National Centers for Environmental Prediction's (NCEP) Environmental Modeling Center (EMC).) This chart tells the story:
Courtesy of Bill Lapenta, EMC.
ECMWF does global high resolution and ensemble forecasts, and seasonal climate forecasts. UKMET office also does regional NWP (England is not a big country!) and regional air quality. NCEP does all of this plus much, much more (high resolution rapid update modeling, hurricane modeling, etc.). And NCEP has to deal with prediction over a continental-size country.
If you expected the U.S. to have a lot more computer power to balance all these responsibilities and tasks, you would be very wrong. Right now the U.S. NWS has two IBM supercomputers, each with 4992 processors (IBM Power6 processors). One computer does the operational work; the other is for backup (research and testing runs are done on the backup). That is about 70 teraflops (trillion floating-point operations per second) for each machine.
NCEP (U.S.) Computer
The European Centre has a newer IBM machine with 8192 much faster processors that reaches 182 teraflops (yes, over twice as fast, and with far fewer tasks to do).
The UKMET office, serving a far, far smaller country, has two newer IBM machines, each with 7680 processors for 175 teraflops per machine.
Here is a figure, produced at NCEP, that compares the relative computer power of NCEP's machine with the European Centre's. The shading indicates computational activity, and the x-axis for each represents a 24-h period. The relative heights allow you to compare computer resources. Not only does the ECMWF have much more computer power, but they are more efficient in using it, packing useful computations into every available minute.
Courtesy of Bill Lapenta, EMC
Recently, NCEP issued a request for proposals for a replacement computer system. You may not believe this, but the specifications called ONLY for a system at least equal to the one they have. A report in a computer magazine suggests that the new system (IBM got the contract) might even be slightly less powerful (around 150 teraflops) than one of the UKMET Office systems, but that is not known at this point.
The Canadians? They have TWO machines like the European Centre’s!
So what kind of system does NCEP require to serve the nation in a reasonable way?
To start, we need to double the resolution of our global model to bring it into line with ECMWF (they now run at 15-km global resolution). Such resolution allows the global model to capture regional features (such as our mountains). Doubling horizontal resolution requires 8 times more computer power. We need better physics (the description of processes like clouds and radiation): double again. And we need better data assimilation (better use of observations to provide an improved starting point for the model): double once more. So we need 32 times more computer power for the high-resolution global runs just to catch up with ECMWF. Furthermore, we must do the same thing for the ensembles (running many lower-resolution global simulations to get probabilistic information): 32 times more computer resources for that as well (we can use some of the gaps in the schedule of the high-resolution runs to fit some of this in, which is what ECMWF does). There are some potential ways NCEP could work more efficiently, too. Right now NCEP runs our global model out to 384 hours four times a day (every six hours). To many of us this seems excessive; perhaps the longest periods (180 hours and beyond) could be run twice a day. So let's begin with a computer 32 times faster than the current one.
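The scaling arithmetic above can be checked in a few lines. The individual doubling factors are the rough estimates from this post, not NCEP benchmarks:

```python
# Rough cost factors from the argument above (informal estimates).
# Doubling horizontal resolution doubles the grid points in each of
# the two horizontal directions AND halves the allowable timestep,
# so the cost grows by 2 * 2 * 2 = 8.
resolution_factor = 2 * 2 * 2   # doubled horizontal resolution -> 8x
physics_factor = 2              # improved cloud/radiation physics -> 2x
assimilation_factor = 2         # improved data assimilation -> 2x

total = resolution_factor * physics_factor * assimilation_factor
print(total)  # 32 -> the factor needed to catch up with ECMWF
```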
Many workshops and meteorological meetings (such as one on improvements in model physics held at NCEP last summer, which I chaired) have made a very strong case that the U.S. requires an ensemble prediction system that runs at 4-km horizontal resolution. The current national ensemble system has a horizontal resolution of about 32 km, and the NWS plans to get to about 20 km in a few years; both are inadequate. Here is an example of the ensemble output (the mean of the ensemble members) for the NWS and the UW (4-km) ensemble systems: the difference is huge. The NWS system does not even come close to capturing the impacts of the mountains, and it is similarly unable to simulate large convective systems.
Current NWS (NCEP) "high resolution" ensembles (32 km)
4 km ensemble mean from UW system
Let me make one thing clear. Probabilistic prediction based on ensemble forecasts and reforecasting (running models back over past years to build statistics of performance) is the future of weather prediction. The days of giving a single number for, say, temperature at day 5 are over. We need to let people know about uncertainty and probabilities. The NWS needs a massive increase in computer power to do this. It lacks that computer power now and does not seem destined to get it soon.
A real champion within NOAA of the need for more computer power is Tom Hamill, an expert on data assimilation and model post-processing. He and colleagues have put together a compelling case for more NWS computer resources for NWP. Read it here.
Back-of-the-envelope calculations indicate that a good first step, a 4-km national ensemble, would require about 20,000 processors to run in a timely manner. But it would revolutionize weather prediction in the U.S., including forecasts of convection and of weather in mountainous areas. This high-resolution ensemble effort would meld with data assimilation over the long term.
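For intuition on why a 4-km national ensemble demands so much hardware, here is a toy scaling estimate. The timestep argument (the CFL condition) is standard, but the numbers are illustrative, not NCEP's sizing:

```python
# Toy estimate of the cost of going from 32-km to 4-km grid spacing.
# Horizontal grid points grow with the square of the refinement, and
# the timestep must shrink linearly with grid spacing (CFL condition),
# so per-member cost grows roughly with the cube of the refinement.
coarse_km, fine_km = 32, 4
refinement = coarse_km // fine_km      # 8x finer in each direction
cost_factor = refinement ** 3          # ~512x more work per member
print(cost_factor)  # 512
```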
And then there is running super-high-resolution numerical weather prediction to get the fine-scale details right. Here in the Northwest my group runs a 1.3-km horizontal resolution forecast twice a day out to 48 hours. Such capability is needed for the entire country, but it does not exist now due to inadequate computer resources.
The bottom line is that the NWS numerical modeling effort needs a huge increase of computer power to serve the needs of the country–and the potential impacts would be transformative. We could go from having a third-place effort, which is slipping back into the pack, to a world leader. Furthermore, the added computer power will finally allow NOAA to complete Observing System Simulation Experiments (OSSEs) and Observing System Experiments (OSEs) to make rational decisions about acquisitions of very expensive satellite systems. The fact that this is barely done today is really amazing and a potential waste of hundreds of millions of dollars on unnecessary satellite systems.
But to do so will require a major jump in computational power, a jump our nation can easily afford. I would suggest that the NWS's EMC begin by securing at least a 100,000-processor machine, and down the road something considerably larger. Keep in mind that my department alone has about 1000 processors in our computational clusters, so this is not as large a jump as you might think.
For a country with several billion-dollar weather disasters a year, investment in reasonable computer resources for NWP is obvious.
The cost? Well, I asked Art Mann of Silicon Mechanics (a really wonderful local vendor of computer clusters) to give me a rough quote: using fast AMD chips, you could have such a 100,000-core machine for 11 million dollars (and that is without any discount!). OK, this is the U.S. government, and they like expensive, heavy-metal machines, so let's go for 25 million dollars. The National Center for Atmospheric Research (NCAR) is getting a new machine with around 75,000 processors at a cost of around 25-35 million dollars. NCEP would want two machines, so let's budget 60 million dollars. We spend this much money on a single jet fighter, but we can't invest this amount to greatly improve forecasts and public safety in the U.S.? We have machines far larger than this for breaking codes, simulating thermonuclear explosions, and simulating climate change.
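The budget arithmetic above, spelled out (every figure is the informal quote or guess cited in this post, not official pricing):

```python
# Rough procurement arithmetic; all figures in millions of dollars
# and all taken from the informal quotes discussed above.
commodity_cluster = 11        # vendor's rough quote, ~100K AMD cores
government_machine = 25       # padded for government procurement
pair = 2 * government_machine # NCEP needs an operational + backup pair
budget = 60                   # round up to leave some margin
print(pair, budget)  # 50 60
```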
Yes, that is a lot of money, but I suspect the cost of the machine would be paid back within a few months through improved forecasts. Last year we had quite a few (over ten) billion-dollar storms; imagine the benefits of forecasting even a few of them better. Or the benefits to the wind-energy and utility industries, or to U.S. aviation, of even modestly improved forecasts. And there is no doubt such computer resources would improve weather prediction. The list of benefits is nearly endless. Recent estimates suggest that normal weather events cost the U.S. economy nearly half a trillion dollars a year. Add to that hurricanes, tornadoes, floods, and other extreme weather. The business case is there.
As someone with an insider's view of the process, it is clear to me that the current players are not going to move effectively without some external pressure. In fact, the budgetary pressure on the NWS is very intense right now, and they are cutting away muscle and bone at this point (like reducing IT staff in the forecast offices by over 120 people and cutting back on extramural research). I believe it is time for weather-sensitive industries and local governments, together with the general public, to let NOAA management and our congressional representatives know that this acute problem needs to be addressed, and addressed soon. We are acquiring huge computer resources for climate simulations, but only a small fraction of that for weather prediction, which can clearly save lives and help the economy. Enough is enough.