Media files on the Anthropocene

Confronting the ‘Anthropocene’ (Dot Earth, New York Times)

London from orbit

Donald R. Pettit, an astronaut, took this photograph of London while living aboard the International Space Station. (Photo: NASA)

LONDON — I’m participating in a one-day meeting at the Geological Society of London exploring the evidence for, and meaning of, the Anthropocene. This is the proposed epoch of Earth history that, proponents say, has begun with the rise of the human species as a globally potent biogeophysical force, capable of leaving a durable imprint in the geological record.

This recent TEDx video presentation by Will Steffen, the executive director of the Australian National University’s Climate Change Institute, lays out the basic idea:

There’s more on the basic concept in National Geographic and from the BBC. Paul Crutzen, the Nobel laureate in chemistry who, with others, proposed the term in 2000, and Christian Schwägerl, the author of “The Age of Man” (German), described the value of this new framing for current Earth history in January in Yale Environment 360:

Students in school are still taught that we are living in the Holocene, an era that began roughly 12,000 years ago at the end of the last Ice Age. But teaching students that we are living in the Anthropocene, the Age of Men, could be of great help. Rather than representing yet another sign of human hubris, this name change would stress the enormity of humanity’s responsibility as stewards of the Earth. It would highlight the immense power of our intellect and our creativity, and the opportunities they offer for shaping the future. [Read the rest.]

I’m attending because of a quirky role I played almost 20 years ago in laying the groundwork for this concept of humans as a geological force. A new paper from Steffen and three coauthors reviewing the conceptual and historic basis for the Anthropocene includes an appropriately amusing description of my role:

Biologist Eugene F. Stoermer wrote: ‘I began using the term “anthropocene” in the 1980s, but never formalized it until Paul [Crutzen] contacted me’. About this time other authors were exploring the concept of the Anthropocene, although not using the term. More curiously, a popular book about Global Warming, published in 1992 by Andrew C. Revkin, contained the following prophetic words: ‘Perhaps earth scientists of the future will name this new post-Holocene period for its causative element—for us. We are entering an age that might someday be referred to as, say, the Anthrocene [sic]. After all, it is a geological age of our own making’. Perhaps many readers ignored the minor linguistic difference and have read the new term as Anthro(po)cene!

If you’ve been tracking my work for a while, you’re aware of my focus on the extraordinary nature of this moment in both Earth and human history. As far as science can tell, there’s never, until now, been a point when a species became a planetary powerhouse and also became aware of that situation.

As I first wrote in 1992, cyanobacteria are credited with oxygenating the atmosphere some 2 billion years ago. That was clearly a more profound influence on a central component of the planetary system than humans raising the concentration of carbon dioxide 40 percent since the start of the industrial revolution. But, as far as we know, cyanobacteria (let alone any other life form from that period) were neither bemoaning nor celebrating that achievement.

It was easier to be in a teen-style resource binge before science began to delineate an edge to our petri dish.

We no longer have the luxury of ignorance.

We’re essentially in a race between our potency, our awareness of the expressed and potential ramifications of our actions, and our growing awareness of the deeply embedded perceptual and behavioral traits that shape how we do, or don’t, address certain kinds of risks. (Explore “Boombustology” and “Disasters by Design” to be reminded that this habit is not restricted to environmental risks.)

This meeting in London is two-pronged. It is in part focused on deepening basic inquiry into stratigraphy and other branches of earth science and clarifying how this human era could qualify as a formal chapter in Earth’s physical biography. As Erle C. Ellis, an ecologist at the University of Maryland, Baltimore County, put it in his talk, it’s unclear for the moment whether humanity’s impact will last long enough to represent an epoch, or will more resemble “an event.” Ellis’s presentation was a mesmerizing tour of the planet’s profoundly humanized ecosystems, which he said would be better described as “anthromes” than “biomes.”

Ellis said it was important to approach this reality not as a woeful situation, but as an opportunity to foster a new appreciation of the lack of separation between people and their planet, and a bright prospect for enriching that relationship. In this, his views resonate powerfully with those of René Dubos, someone I’ll be writing about here again soon.

Through the talks by Ellis and others, it was clear that the scientific effort to define a new geological epoch, while important, paled beside the broader significance of this juncture in human history.

In my opening comments at the meeting, I stressed the need to expand the discussion from the physical and environmental sciences into disciplines ranging from sociology to history, philosophy to the arts.

I noted that while the “great acceleration” described by Steffen and others is already well under way, it’s entirely possible for humans to design their future, at least in a soft way, boosting odds that the geological record will have two phases — perhaps a “lesser” and “greater” Anthropocene, as someone in the audience for my recent talk with Brad Allenby at Arizona State University put it.

I also noted that the term “Anthropocene,” like phrases such as “global warming,” is sufficiently vague to guarantee it will be interpreted in profoundly different ways by people with different world views. (As I explained, this is as true for Nobel laureates in physics as it is for the rest of us.)

Some will see this period as a “shame on us” moment. Others will deride this effort as a hubristic overstatement of human powers. Some will argue for the importance of living smaller and leaving no scars. Others will revel in human dominion as a normal and natural part of our journey as a species.

A useful trait will be to get comfortable with that diversity.

Before the day is done I also plan on pushing Randy Olson’s notion of moving beyond the “nerd loop” and making sure this conversation spills across all disciplinary and cultural boundaries from the get-go.

There’s much more to explore of course, and I’ll post updates as time allows. You might track the meeting hashtag, #anthrop11, on Twitter.

*   *   *

8/16/2014 @ 1:05PM – James Conca

The Anthropocene Part 1: Tracking Human-Induced Catastrophe On A Planetary Scale (Forbes)

For almost 30 years, we geologists have been having a debate about what Geologic Epoch we find ourselves in right now. It is presently called the Holocene, but some want to add another epoch and call it the Anthropocene.

Anthropocene combines the Greek words for human and recent time period, to denote the period of time since human activity went global and we became an important geologic process in our own right (In These Times; see also UN video http://vimeo.com/39048998).

In other words, what should we call this period of time when we started trashing the planet? And when did it begin?

You might know some of the big Geologic Ages, Epochs, Periods, Eras and Eons. The dinosaurs died out 66 million years ago, in the Maastrichtian Age of the Late Cretaceous Epoch, in the Cretaceous Period of the Mesozoic Era of the Phanerozoic Eon.

The sum of Earth’s history, divided up into these different sections, is called the Geologic Time Scale, and geologists have been defining and refining it for about 150 years (Geological Society of America). The latest deliberation concerns the epoch stretching from the present back about 12,000 years. The term Anthropocene was popularized fourteen years ago by atmospheric chemist Paul Crutzen and biologist Eugene Stoermer, to more decisively focus discussion on humanity’s global effects on the whole planet.

The Geologic Time Scale - the sectioning up of 4.6 billion years of Earth’s history into manageable time periods bounded by significant geological events. The present debate is: do we want to call the present epoch the Anthropocene, to reflect humanity’s global effect on the planet? Source: The Geological Society of America

To Judith Wright, a chemostratigrapher, this is not just an academic exercise for the Ivory Tower (Ocean Redox Variations Through Time). There are obvious political and social implications, not the least being the role of human activities in climate change, the wholesale extinction of species unlike any other time in history, and accelerating environmental destruction on a planetary scale that could spell our own doom.

What we call something matters. It sets the scale of importance and pushes the discussion in the direction that we need these debates to go.

The scientific decision on whether to incorporate the term Anthropocene into the geologic lexicon falls to a carefully deliberative group called the International Commission on Stratigraphy, particularly its Subcommission on Quaternary Stratigraphy which has formed an Anthropocene Working Group. There is a rumor they may arrive at a decision after only several years of debate, making this deliberation downright hasty.

This discussion concerns not just what to name it, but when it started. There is always some defining characteristic to one of these epochs or periods. Often a huge extinction event marks the end of an epoch, like the end of the Triassic Period 200 million years ago when half of all life perished. Or the appearance of something new in evolution, like the beginning of the Cambrian Period when organisms learned how to grow shells, bones, teeth and other hard parts from the increased dissolved minerals in seawater, providing huge survival advantages.

Sometimes there is a chemical marker, or layer, that is unique to that time or process, like the iridium layer that is one of the singular markers of the huge meteorite impact that finally stressed the dinosaurs’ environment to the point of extinction.

But why do stratigraphers get to decide this question? In the geologic sciences, stratigraphy is the study of rock layers on the Earth – how they relate to each other and to relative and absolute time, how they got there, and what they tell us about Earth history. Stratigraphy can be traced to the beginnings of modern geology in the 17th century with Nicolas Steno and his three stratigraphic rules:

–   the law of superposition (younger rocks lay on top of older ones)

–   the principle of original horizontality (all sedimentary and volcanic layers are originally laid down horizontally, or normal to the Earth’s gravitational field)

–   the principle of lateral continuity (sedimentary layers generally continue for a long distance as most were laid down in the ocean, and an abrupt edge indicates that something happened to break it off, like a fault or surface erosion).

These were profound observations and fundamentally changed how we understood time and geologic processes like flooding and earthquakes. In fact, the unique perspective that geology brings to humans is the understanding of time at all levels.

According to Patricia Corcoran and co-workers (GSA), material being laid down across the Earth in our present time is the most bizarre mix of chemicals and materials the Earth has ever seen, some compounds of which have never occurred in nature. Maybe compounds that could not have been produced naturally would make a good marker for the Anthropocene.

But defining a geologic time period requires a set of geologic characteristics sufficiently large, clear and distinctive to be useful as a global geologic boundary, and durable enough to survive throughout geologic time.

We presently find ourselves in the Holocene Epoch of the Quaternary Period in the Cenozoic Era of the Phanerozoic Eon, defined as starting from the end of the last glacial retreat, an obvious event that led to our present global range in climate and other characteristics we define as the Earth today. The problem is, humans have dominated this entire epoch in many ways, from the dawn of agriculture, to smelting of iron and lead, burning of forests and finally the effects of the industrial revolution.

As in all environmental issues, it’s the number of people on Earth that’s the problem. For over 100,000 years, the global human population was steady at about 10 million. Then civilization appeared, fueled by the many significant developments that humans had begun to apply en masse, including agriculture, domesticated animals, tools, serious engineering, and various uses of fire, fibers and the wheel.

Our population began to increase about 2,000 years ago, at the beginning of the Common Era, rising to 300 million during the Middle Ages and to a billion at the beginning of the Industrial Age. Then 2 billion in 1927, 3 billion in 1960, 4 billion in 1974, 5 billion in 1987, 6 billion in 1999 and 7 billion in 2011. This exponential rise is textbook behavior for a bacterial colony in a petri dish, right before it dies from outpacing its food sources and generating too much waste. It’s also eerily analogous to people on the petri dish of Earth.
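
Those milestones are easy to sanity-check. Here is a small, illustrative Python snippet that computes the gap between successive billions from the dates quoted above; the 1804 date for the first billion is a commonly cited estimate supplied here as an assumption, not a figure from the text.

```python
# Back-of-the-envelope check of the population milestones quoted above.
# The 1804 date for the first billion is a commonly cited estimate (an
# assumption here); the later dates are the years given in the text.

milestones = [  # (year, population in billions)
    (1804, 1), (1927, 2), (1960, 3), (1974, 4),
    (1987, 5), (1999, 6), (2011, 7),
]

for (y0, p0), (y1, p1) in zip(milestones, milestones[1:]):
    print(f"{p0} -> {p1} billion: {y1 - y0} years")

# Output: 123, 33, 14, 13, 12, 12 years -- each additional billion
# arrives faster, the signature of roughly exponential growth.
```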

Humans now comprise the largest mass of vertebrate matter on land on the entire Earth. The rest is almost all our food and friends, mainly the animals we domesticated over the last 50,000 years, plus a bunch of xenobiotics we’ve transported far from their habitats (Cornell University). Only a small percentage of all vertebrate mass on land is wild or natural (In These Times).

Let that sink in for a minute. Most of what people see in National Geographic or on the Discovery Channel or in movies about animals IS ALMOST ALL GONE. Humans have dammed a third of the world’s rivers, and have covered, destroyed or altered almost half of the world’s land surface. We use up fresh water faster than it can be replenished. And we drive roughly 30,000 species to extinction every year.

Through deforestation, agriculture, urbanization, roads, mining, aquatic farming, the movement of xenobiotic species around the world that destroy native species, the dumping of huge amounts of waste on land, in the ocean and in the atmosphere, and all our other activities, we have decimated the natural environment without thinking about the effect on global ecosystems, or about what it takes for our own species to survive.

There is a point beyond which humans, our pets, our food animals and our food crops cannot survive without some aspects of a wild nature. Many of our crops need pollinators. The oxygen on this planet comes mainly from organisms in the top 300 feet of the ocean. Biodiversity is not just an environmental catch-phrase; it’s a necessity for survival.

So when did the Anthropocene begin?

Was it when agriculture began, when we started burning forests to clear land? Was it during the Iron Age when we clear-cut the northern forests to smelt iron ore?

Was it the advent of civilization, particularly the rise in agriculture and mining activities around the Mediterranean by the Phoenicians, Greeks and Romans, signified by a rise in environmental Pb levels (Shotyk et al, 1998) that continued until just recently?

Was it the 19th century, when our population passed a billion and we began burning fossil fuels, making carbon itself a global marker? Was it the 20th century, when humans passed plate tectonics as the primary mechanism for moving rock and dirt on this planet?

Access to huge amounts of chemical energy trapped in fossil fuels allowed human populations to explode and allowed human effects to really go global. This is why most researchers point to the mid-19th century as the obvious time to start the Anthropocene.

One popular idea for the start of the Anthropocene is the beginning of the atomic age. Above-ground nuclear tests spread unique radionuclides like Pu around the world, elements that have not been seen in our solar nebula for six billion years, but now show up in surface sediments and ice cores, albeit in minute concentrations.

Crutzen recently gave his support for this point in time. However, the atomic age is just another aspect of the modern age of humans not associated with a particular change in a global characteristic. It cannot be seen in the field and marks no special geologic event, and would be more of a political or sociological marker than a geological one.

Perhaps we should wait until the end of this century when the worst effects will be upon us and the world will barely be recognizable to anyone living today. We might be able to just point to the time “when there used to be forests.”

In the end, this debate might be sheer hubris, since when we are gone, future geologists, of whatever species, will decide for themselves where they want to place the beginning of this particular catastrophe.

So…what’s your vote for when we start the Anthropocene?

*   *   *

2/29/2012 @ 11:55PM – Jayne Jung

On The Anthropocene Age (Forbes)


The term “Anthropocene Age” is now common. It represents the time since the early 1800s, when man began to have an impact on the Earth’s climate. There is still some debate about its use, though. Scientists have traditionally called the current period the Holocene Age, meaning “entirely new.” The Holocene Age started around 10,000 BC, after the last glacial period, when there was significant glacier movement.

From the German news site Spiegel Online, an interview with British geologist Jan Zalasiewicz:

SPIEGEL ONLINE: The idea of mankind as the owner of Earth — is that not disconcerting?

Zalasiewicz: While we now may be said to “own” the planet, that is not the same as controlling it. Our global experiment with CO2 is something that virtually every government would like to see brought under control, and yet collectively we are, at present, unable to do so. That seems to call for feelings of something other than hubris.

SPIEGEL ONLINE: Do you see more to Anthropocene than just a geological term? Is it a new way of thinking about mankind’s role on Earth?

Zalasiewicz: That is almost certainly part of the attraction of the term for the wider public. The term does encapsulate and integrate a wide range of phenomena that are often considered separately. It also provides a sense of the scale and significance of anthropogenic global change. It emphasizes the importance of the Earth’s long geological history as a context within which to better understand what is happening today.

Interview conducted by Christian Schwägerl

From the New York Times‘ editorial “The Anthropocene”:

Other species are embedded in the fossil record of the epochs they belong to. Some species, like ammonites and brachiopods, even serve as guides — or index fossils — to the age of the rocks they’re embedded in. But we are the only species to have defined a geological period by our activity — something usually performed by major glaciations, mass extinction and the colossal impact of objects from outer space, like the one that defines the upper boundary of the Cretaceous.

Humans were inevitably going to be part of the fossil record. But the true meaning of the Anthropocene is that we have affected nearly every aspect of our environment — from a warming atmosphere to the bottom of an acidifying ocean.

From Yale Environment 360’s “The Anthropocene Debate: Marking Humanity’s Impact” by Elizabeth Kolbert:

One argument against the idea that a new human-dominated epoch has recently begun is that humans have been changing the planet for a long time already, indeed practically since the start of the Holocene. People have been farming for 8,000 or 9,000 years, and some scientists — most notably William Ruddiman, of the University of Virginia — have proposed that this development already represents an impact on a geological scale. Alternatively, it could be argued that the Anthropocene has not yet arrived because human impacts on the planet are destined to be even greater 50 or a hundred years from now.

“We’re still now debating whether we’ve actually got to the event horizon, because potentially what’s going to happen in the 21st century could be even more significant,” observed Mark Williams, a member of the Anthropocene Working Group who is also a geologist at the University of Leicester.

I personally do not want to know what that “event horizon” is.

*   *   *

The Dawning of the Age of Anthropocene (In These Times)

By altering the earth, have humans ushered in a new epoch?

BY JESSICA STITES

Scientists have been clanging the alarm about human-caused climate change, trying to bring around the 33 percent of Americans who don’t believe the earth is warming and the 18 percent of believers in global warming who think the process is “natural.” But as climate scientists race against time to convince the resisters, another branch of science, geology, is taking the tortoise approach. For more than a decade, geologists have been debating whether to officially declare the existence of a new epoch, the “Anthropocene,” to acknowledge that humans are radically reshaping the earth’s surface. Some geologists see the Anthropocene model as a way to widen the lens on human impact on the earth and cut through both the alarmism and the resistance to spur concrete policy solutions.

Geologists divide the earth’s 4.5 billion-year history into eons, which are subdivided into eras, periods and epochs. Each division is marked by a discernible change in the earth’s strata, such as a new type of fossil representing a major evolutionary shift. On this global scale, humans are relative infants, just 200,000 years old. Civilization has sprung up only during the most recent epoch, the relatively warm interglacial Holocene, spanning the past 11,700 years.

But the Holocene’s days may be numbered. In 2000, the late biologist Eugene F. Stoermer and Nobel Prize-winning atmospheric chemist Paul Crutzen made a radical proposal in the International Geosphere-Biosphere Programme newsletter: that the Industrial Revolution’s explosion of carbon emissions had ushered in a new epoch, the Anthropocene, marked by human impact. In 2009, the International Commission on Stratigraphy’s Subcommission on Quaternary Stratigraphy assembled an Anthropocene Working Group, which in 2016 will issue a recommendation on whether to formally adopt the term.

For stratigraphers, this is “breakneck speed,” says University of Leicester paleobiologist Jan Zalasiewicz, the head of the working group. The geological time scale is as fundamental to the discipline as the periodic table is to chemistry. The last official change, when the Ediacaran Period supplanted the Vendian in 2004, came after 20 years of debate and shook up a scale that had been static for 120 years.

Whether or not the term is formalized, Zalasiewicz believes in the Anthropocene’s potential to alter perception, for geologists and non-geologists alike. “ ‘Anthropocene’ places our environmental situation in a different perspective,” he says. “You look at it sideways, as it were.”

Certainly, you look at it from a broader perspective. Zooming out to an aeonic scale is a sober way to assess the significance of climate change. Overall, human emissions have driven up global temperatures about 0.85 degrees Celsius since the 19th century, according to the UN’s Intergovernmental Panel on Climate Change. Deniers are fond of noting that Earth’s current temperatures are not the highest they’ve been, and that the earth cyclically warms and cools. That’s true. There have been several “hyperthermal” events when carbon dioxide has been released by the ton from the ground or the ocean, and the earth’s temperature has spiked. However, as Zalasiewicz and colleague Mark Williams note in a recent paper in the Italian journal Rendiconti Lincei, carbon is being released today more than 10 times faster than during any of these events. And when the other hyperthermal events began, the earth was already relatively warm, without today’s ice-capped poles that threaten to melt and pump up sea levels. Already, warming has eroded Arctic sea ice and altered the territory of climate-sensitive species—effects potentially stark enough by themselves to declare an Anthropocene.

But even if we set aside climate change, there are still plenty of arguments for an Anthropocene, says Zalasiewicz. Take mineral traces: He calculates that humans have purified a half-billion tons of aluminum, enough to cover the United States in tinfoil. We’ve also made never-before-seen alloys, like the boron carbide used in bullet-proof vests and the tungsten carbide that forms the balls in ballpoint pens. And then there are plastics—more than 250 million tons are now produced annually. Zalasiewicz and his colleagues term these various human-made objects “technofossils.”

Technofossils aren’t the only things that humans scatter; we’ve also spread germs, seaweed, strawberries, sparrows, rats, jicama, cholera … the list goes on. You might see us as bees pollinating the earth or, less flatteringly, dung beetles scuttling our shit around. Today, according to a 1999 Cornell study, 98 percent of U.S. food—both animal- and plant-derived—is non-native.

Then there’s the population explosion. Humans now make up nearly one-fourth of total land vertebrate biomass, and most of the rest is our food animals, with wild animals—the lions, tigers and bears—making up a mere 3 percent. Those proliferating people suck up resources: According to calculations by Stanford geology PhD student Miles Traer, humans use the equivalent of 13 barrels of oil per person every year, and the United States consumes 345 billion gallons of fresh water a day, enough to drain all of Lake Michigan every 10 years.
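
That Lake Michigan comparison can be checked with back-of-the-envelope arithmetic. The sketch below uses the 345-billion-gallon daily figure quoted above; Lake Michigan’s volume of roughly 1,180 cubic miles is a commonly cited estimate supplied here as an assumption.

```python
# Rough sanity check of the fresh-water comparison quoted above.

GAL_PER_CUBIC_MILE = 1.101e12      # approx. US gallons in one cubic mile
lake_michigan_gal = 1180 * GAL_PER_CUBIC_MILE  # assumed volume estimate

use_per_day = 345e9                # gallons/day, as quoted in the text
ten_years_of_use = use_per_day * 365 * 10

print(f"10 years of U.S. use: {ten_years_of_use:.2e} gal")
print(f"Lake Michigan volume: {lake_michigan_gal:.2e} gal")

# ~1.26e15 vs ~1.30e15 gallons: the "drain all of Lake Michigan every
# 10 years" comparison holds to within a few percent.
```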

In the face of all this, few geologists deny that stratigraphers a billion years from now (either human or alien, depending on your level of optimism) will see our current period as a major shift in the geological record. So the main question facing the Anthropocene Working Group is when the Anthropocene began, or in geological parlance, where to plant “the golden spike.” Four possibilities are under consideration.

One camp favors the 1800s, when megacities began to emerge and form unique strata of compressed rubble and trash heaps. London was the first, with a population of 3 million in 1850. With megacities came industrialization and a sudden escalation in CO2 emissions that began to push up global temperatures.

A second group, including many archeologists, argues for an earlier date. University of Virginia paleoclimatology professor William F. Ruddiman calculates that the first Agricultural Revolution, which began some 12,000 years ago, caused greater climate effects than the Industrial Revolution has yet produced. Due to the widespread preindustrial use of fire to clear forests, Ruddiman believes that emissions of greenhouse gases, coupled with the loss of forests’ cooling effect, made the earth’s temperature 1.3 to 1.4 degrees Celsius higher than it would have been if humans had never existed, warding off an overdue Ice Age.

However, where to plant the golden spike in this model isn’t entirely clear, since the changes happened over thousands of years. One option would be the mass extinction of large animals in the Americas some 12,500 years ago, believed to be caused by humans. In that case, the Anthropocene would supplant the Holocene.

A third school believes the Anthropocene should be dated to the mid-20th century, when the “Great Acceleration” began and population, urbanization, globalization and CO2 emissions took off. The past 60-odd years have seen a doubling of the human population and the release of three-quarters of all human-caused CO2 emissions. Zalasiewicz favors this hypothesis because of the sharpness of the acceleration and the synchronicity of the changes. The automobile, for instance, proliferated worldwide in less than a century, the blink of a geologic eye. This time period contains a dramatic option for the “golden spike”: the first A-bomb tests of the 1940s, which left radionuclidic traces across the earth that are readily detectable in ice core samples pulled from the poles. Astrobiologist David Grinspoon, a proponent of the nuclear-testing golden spike, writes, “The symbolism is so potent—the moment we grasped that terrible Promethean fire that, uncontrolled, could consume the world.”

The fourth and final camp is made up of the “not yets,” who point out that everything is still accelerating—population, technofossil production, CO2 emissions—with no reason to believe things will stabilize and some reason to expect dramatic upheavals. Many geologists predict that the next two centuries will bring a mass extinction event of a magnitude seen only five times before (e.g., the dinosaur die-off). A number of drastic changes triggered by climate change may lie in store, according to an IPCC report released March 31—not only extinctions, but also food and water shortages, irreversible loss of coral reefs and Arctic ice, and “abrupt or drastic changes” like Greenland melting or the Amazon rainforest drying.

It’s here that the conversation veers from the scientific into the panicked, which does not sit well with Mike Osborne, a Stanford PhD student in earth sciences and the creator of the “Generation Anthropocene” podcast. More comfortable with data and description than prediction, Osborne eschews apocalyptic thinking, and he doesn’t like the vitriol and politicization of the climate-change debate.

But of course, the issue is political, because it’s inseparable from human choices. A forthcoming peer-reviewed study funded by NASA’s Goddard Space Flight Center warns of the potential for “irreversible” collapse of civilization in the next few decades and stresses that the key factors are both environmental and social. The study says that the enormous consumption of resources is dangerous specifically because it is paired with “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor]”—noting that these two features were common to the toppling of numerous empires, from Han to Roman, over the last 5,000 years.

The issue is also emotional. One side calls the other “alarmist” and “hysterical,” which may be code for “I can’t handle hearing this.” Even for believers, there seems to be a gap between apprehending the problem and believing in a solution. Pew found that 62 percent of Americans understand that climate change is happening and 44 percent believe it’s human-caused, but when Pew asked Americans in another survey what our government’s top priority should be, climate change ranked 19th of 20 issues, just above global trade.

Osborne says he “hesitate[s] to be overly optimistic or pessimistic,” even though, with his first child just born, the earth’s future weighs on his mind. “The Anthropocene’s great utility for me in terms of imagination,” he says, “is that at its best, it comes with a sense of awe: Holy cow, the world is freaking huge and amazing and beautiful and scary.”

By inviting awe rather than—or along with—terror, the Anthropocene may offer a way to grapple with climate change rather than deny it. “The pace of change seems to be ever accelerating, but so does the response. I am loath to underestimate human ingenuity,” says Osborne. Zalasiewicz and Osborne both think that certain effects of the Anthropocene, like warming and biodiversity loss, warrant environmentalist solutions, such as measures to curb CO2 emissions and wildlife preserves on both land and sea.

Whatever the working group proposes in 2016—which must then be affirmed by three higher bodies—Zalasiewicz believes the concept of the Anthropocene is here to stay. “If it wasn’t useful, if it was a catchphrase, it would have quite quickly fallen by the wayside,” he says. “It packages up a whole range of different isolated phenomena—ocean acidification, landscape change, biodiversity loss—and integrates them into a common signal or trend or pattern.”

Perhaps the data-driven, methodical approach of geology is what this debate needs. To Osborne, the Anthropocene is exciting because it forces us to look closely at what’s really happening and challenges our tendency to think in a nature/human divide. In the Anthropocene model, the earth’s workings and fate are intertwined with our own. Indeed, science theorist Bruno Latour goes so far as to say that the Anthropocene could mark “the final rejection of the separation between Nature and Human that has paralyzed science and politics since the dawn of modernism.”

*   *   *

Nasa-funded study: industrial civilisation headed for ‘irreversible collapse’? (The Guardian)

Natural and social scientists develop new model of how ‘perfect storm’ of crises could unravel global system

This Nasa Earth Observatory image shows a storm system circling around an area of extreme low pressure in 2010, which many scientists attribute to climate change. Photograph: AFP/Getty Images

A new study partly sponsored by Nasa’s Goddard Space Flight Center has highlighted the prospect that global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution.

Noting that warnings of ‘collapse’ are often seen to be fringe or controversial, the study attempts to make sense of compelling historical data showing that “the process of rise-and-collapse is actually a recurrent cycle found throughout history.” Cases of severe civilisational disruption due to “precipitous collapse – often lasting centuries – have been quite common.”

The independent research project is based on a new cross-disciplinary ‘Human And Nature DYnamical’ (HANDY) model, led by applied mathematician Safa Motesharrei of the US National Science Foundation-supported National Socio-Environmental Synthesis Center, in association with a team of natural and social scientists. The model was created using a minor Nasa grant, but the study based on it was conducted independently and has been accepted for publication in the peer-reviewed Elsevier journal, Ecological Economics.

It finds that, according to the historical record, even advanced, complex civilisations are susceptible to collapse, raising questions about the sustainability of modern civilisation:

“The fall of the Roman Empire, and the equally (if not more) advanced Han, Mauryan, and Gupta Empires, as well as so many advanced Mesopotamian Empires, are all testimony to the fact that advanced, sophisticated, complex, and creative civilizations can be both fragile and impermanent.”

By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor]”. These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”
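
For readers curious how stratification and carrying capacity can be coupled in a single model, here is a minimal, illustrative Python sketch of HANDY-style dynamics. It follows the general structure the study describes (two populations, Nature, and accumulated Wealth), but the parameter values and variable names are assumptions chosen for illustration, not the study’s calibrated scenarios.

```python
# A minimal, illustrative sketch of HANDY-style dynamics. The structure
# (Commoners, Elites, Nature, Wealth) follows the study's description;
# all parameter values below are assumptions, not the paper's calibration.

beta_C = beta_E = 0.03         # birth rates of Commoners and Elites
alpha_m, alpha_M = 0.01, 0.07  # normal and famine death rates
gamma, lam = 0.01, 100.0       # Nature's regrowth rate and carrying capacity
delta = 1.0e-4                 # extraction (production) per Commoner
s = 5.0e-4                     # subsistence consumption per person
rho = 5.0e-3                   # wealth threshold per person
kappa = 10.0                   # inequality factor: Elites consume kappa times more

def step(xC, xE, y, w, dt=1.0):
    """One Euler step of the four coupled equations."""
    w_th = rho * xC + kappa * rho * xE            # wealth needed to pay everyone fully
    omega = min(1.0, w / w_th) if w_th > 0 else 0.0
    C_C = omega * s * xC                          # Commoner consumption
    C_E = omega * kappa * s * xE                  # Elite consumption
    # Death rates climb toward the famine rate as consumption drops below subsistence.
    famine_C = max(0.0, 1.0 - C_C / (s * xC)) if xC > 0 else 0.0
    famine_E = max(0.0, 1.0 - C_E / (s * xE)) if xE > 0 else 0.0
    alpha_C = alpha_m + famine_C * (alpha_M - alpha_m)
    alpha_E = alpha_m + famine_E * (alpha_M - alpha_m)
    dxC = (beta_C - alpha_C) * xC                 # population change, Commoners
    dxE = (beta_E - alpha_E) * xE                 # population change, Elites
    dy = gamma * y * (lam - y) - delta * xC * y   # logistic regrowth minus extraction
    dw = delta * xC * y - C_C - C_E               # production minus total consumption
    return (xC + dxC * dt, xE + dxE * dt,
            max(0.0, y + dy * dt), max(0.0, w + dw * dt))

# Start with a small Elite, healthy Nature, and no accumulated Wealth.
xC, xE, y, w = 100.0, 1.0, lam, 0.0
for _ in range(1000):
    xC, xE, y, w = step(xC, xE, y, w)
print(f"Commoners={xC:.0f}  Elites={xE:.0f}  Nature={y:.1f}  Wealth={w:.3f}")
```

Even in this toy form, the mechanism quoted above is visible: Elite consumption is scaled up by the inequality factor kappa, so accumulated wealth buffers Elites while Commoners feel scarcity first.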

Currently, high levels of economic stratification are linked directly to overconsumption of resources, with “Elites” based largely in industrialised countries responsible for both:

“… accumulated surplus is not evenly distributed throughout society, but rather has been controlled by an elite. The mass of the population, while producing the wealth, is only allocated a small portion of it by elites, usually at or just above subsistence levels.”

The study challenges those who argue that technology will resolve these challenges by increasing efficiency:

“Technological change can raise the efficiency of resource use, but it also tends to raise both per capita resource consumption and the scale of resource extraction, so that, absent policy effects, the increases in consumption often compensate for the increased efficiency of resource use.”

Productivity increases in agriculture and industry over the last two centuries have come from “increased (rather than decreased) resource throughput,” despite dramatic efficiency gains over the same period.

Modelling a range of different scenarios, Motesharrei and his colleagues conclude that under conditions “closely reflecting the reality of the world today… we find that collapse is difficult to avoid.” In the first of these scenarios, civilisation:

“…. appears to be on a sustainable path for quite a long time, but even using an optimal depletion rate and starting with a very small number of Elites, the Elites eventually consume too much, resulting in a famine among Commoners that eventually causes the collapse of society. It is important to note that this Type-L collapse is due to an inequality-induced famine that causes a loss of workers, rather than a collapse of Nature.”

Another scenario focuses on the role of continued resource exploitation, finding that “with a larger depletion rate, the decline of the Commoners occurs faster, while the Elites are still thriving, but eventually the Commoners collapse completely, followed by the Elites.”

In both scenarios, Elite wealth monopolies mean that they are buffered from the most “detrimental effects of the environmental collapse until much later than the Commoners”, allowing them to “continue ‘business as usual’ despite the impending catastrophe.” The same mechanism, they argue, could explain how “historical collapses were allowed to occur by elites who appear to be oblivious to the catastrophic trajectory (most clearly apparent in the Roman and Mayan cases).”

Applying this lesson to our contemporary predicament, the study warns that:

“While some members of society might raise the alarm that the system is moving towards an impending collapse and therefore advocate structural changes to society in order to avoid it, Elites and their supporters, who opposed making these changes, could point to the long sustainable trajectory ‘so far’ in support of doing nothing.”

However, the scientists point out that the worst-case scenarios are by no means inevitable, and suggest that appropriate policy and structural changes could avoid collapse, if not pave the way toward a more stable civilisation.

The two key solutions are to reduce economic inequality so as to ensure fairer distribution of resources, and to dramatically reduce resource consumption by relying on less intensive renewable resources and reducing population growth:

“Collapse can be avoided and population can reach equilibrium if the per capita rate of depletion of nature is reduced to a sustainable level, and if resources are distributed in a reasonably equitable fashion.”

The Nasa-funded HANDY model offers a highly credible wake-up call to governments, corporations and businesses – and consumers – to recognise that ‘business as usual’ cannot be sustained, and that policy and structural changes are required immediately.

Although the study based on HANDY is largely theoretical – a ‘thought-experiment’ – a number of other more empirically-focused studies – by KPMG and the UK Government Office of Science for instance – have warned that the convergence of food, water and energy crises could create a ‘perfect storm’ within about fifteen years. But these ‘business as usual’ forecasts could be very conservative.

Dr Nafeez Ahmed is executive director of the Institute for Policy Research & Development and author of A User’s Guide to the Crisis of Civilisation: And How to Save It among other books. Follow him on Twitter @nafeezahmed

  • This article was amended on 26 March 2014 to reflect the nature of the study and Nasa’s relationship to it more clearly.