Monthly archive: April 2015

Ora pois, a very Brazilian language (Pesquisa Fapesp)

Analysis of old texts and of field interviews reveals the language's distinctive traits in Brazil, the reach of the caipira R, and the places that preserve old ways of speaking

CARLOS FIORAVANTI | ISSUE 230 | APRIL 2015

Study for Partida da monção (Departure of the Monsoon), 1897, by Almeida Júnior (Collection of the Pinacoteca do Estado de São Paulo). The bandeirantes set out from Porto Feliz toward the Center-West

Being able to keep things simple, dropping grammatical elements that are theoretically essential and answering “sim, comprei” (“yes, I bought”) when someone asks “você comprou o carro?” (“did you buy the car?”), is one of the features that give Brazilian Portuguese its flexibility and identity. Analysis of old documents and of field interviews carried out over the past 30 years is showing that Brazilian Portuguese can already be considered unique, distinct from European Portuguese, just as American English is distinct from British English. Brazilian Portuguese is not yet an autonomous language, however: it may become one, specialists predict, in about 200 years, once it has accumulated enough peculiarities to keep us from fully understanding what a native speaker from Portugal says.

The expansion of Portuguese across Brazil, the regional variations and their possible explanations – which make the bird called urubu (vulture) in São Paulo go by corvo in the South of the country – and the roots of the language's innovations are emerging through the work of some 200 linguists. According to studies at the University of São Paulo (USP), one innovation of Brazilian Portuguese, so far without an equivalent in Portugal, is the caipira (country) R, at times so intense that it seems to count for two or three, as in porrrta or carrrne.

Associating the caipira R only with the interior of São Paulo state, however, is a geographic and historical imprecision, even though the unabashed R was one of the trademarks of the country-bumpkin style of actor Amácio Mazzaropi in his 32 films, produced from 1952 to 1980. Following the routes of the São Paulo bandeirantes in search of gold, linguists found the R supposedly typical of São Paulo in towns of Minas Gerais, Mato Grosso, Mato Grosso do Sul, Paraná and the west of Santa Catarina and Rio Grande do Sul, forming a way of speaking similar to eighteenth-century Portuguese. Anyone with patience and a keen ear can also find, in central Brazil – and in coastal towns – the hushed S (S chiado), today a typical trait of Rio de Janeiro speech, which arrived with the Portuguese in 1808 and was a mark of prestige because it represented the speech of the Court. Even the Portuguese were not original: specialists argue that the hushed S, which turns esquina into shquina, came from the French nobles whom the Portuguese admired.

The history of the Portuguese language in Brazil is also bringing to light features preserved from older Portuguese, such as the swap of L for R that turns planta into pranta. Camões recorded this swap in Os Lusíadas – a frautas appears there in place of flautas – and the São Paulo singer-songwriter Adoniran Barbosa registered it in several compositions, in phrases such as “frechada do teu olhar,” from the samba Tiro ao Álvaro. In field surveys, USP researchers observed that residents of the interior of both Brazil and Portugal, especially those with less schooling, still speak this way. Another sign of preservation, identified by specialists from Rio de Janeiro and São Paulo, this time in old documents, was a gente (or as gentes) as a synonym for “we,” today one of the distinctive marks of Brazilian Portuguese.

Célia Lopes, of the Federal University of Rio de Janeiro (UFRJ), found records of a gente in sixteenth-century documents and, with greater frequency, from the nineteenth century onward. It was a way of indicating the first person plural, in the sense of everybody, necessarily including the speaker. According to her, using a gente can convey noncommitment and vagueness: someone who says a gente generally does not make clear whether they intend to commit to what they are saying or merely see themselves as part of the group, as in “a gente precisa fazer” (“we need to do it”). The pronoun nós, as in “nós precisamos fazer,” expresses responsibility and commitment. Over the past 30 years, she has noted, a gente has settled into the spaces once occupied by nós and become a resource widely used by all ages and social classes across the country, even though it remains marginal in grammar books.

Linguists from several Brazilian states are digging up the roots of Brazilian Portuguese by examining personal and administrative letters, wills, travel accounts, court records, readers' letters and newspaper advertisements going back to the sixteenth century, gathered at institutions such as the Biblioteca Nacional and the Arquivo Público do Estado de São Paulo. Célia Lopes's team has also found old letters and other linguistic treasures, not always valued, at the Saturday antiques fair in Praça XV de Novembro, in downtown Rio. “A student brought me wonderful letters found in the trash,” she said.

Untitled, from the series Estudo para bandeirantes, undated, by Henrique Bernardelli (Collection of the Pinacoteca do Estado de São Paulo). Paulistas spread the Portuguese language as they conquered other regions

From vossa mercê to cê
Old documents show that the Portuguese spoken in Brazil began to diverge from the European variety at least four centuries ago. One indication of this separation is Memórias para a história da capitania de São Vicente, of 1793, written by Friar Gaspar da Madre de Deus, born in São Vicente, and later rewritten by the Portuguese Marcelino Pereira Cleto, a judge in Santos. Comparing the two versions, José Simões, of USP, found 30 differences between Brazilian and European Portuguese. One of them persists to this day: as users of Brazilian Portuguese, we prefer to make the subjects of our sentences explicit, as in “o rapaz me vendeu o carro, depois ele saiu correndo e ao atravessar a rua ele foi atropelado” (“the young man sold me the car, then he ran off and, as he crossed the street, he was run over”). In European Portuguese it would be more natural to omit the subject, already defined by the verb form – “o rapaz vendeu-me o carro, depois saiu a correr…” – producing a grammatically impeccable construction that nonetheless sounds a little odd to Brazilian ears.

A resident of Portugal, asked whether he has bought a car, will naturally answer “sim, comprei-o,” making the verb's object explicit, “even among speakers with little schooling,” Simões observes. He notes that the Portuguese use mesoclisis – “dar-lhe-ei um carro, com certeza!” (“I shall give you a car, certainly!”) – which would sound pedantic in Brazil. Another difference is the gap between spoken and written language in Brazil. Nobody says muito; people say muinto. The pronoun você, itself already a reduction of vossa mercê and vosmecê, has shrunk even further, to cê, and attached itself to the verb: cê vai?

“The language we speak is not the one we write,” says Simões, on the strength of examples like these. “Written and spoken Portuguese in Portugal are closer to each other, although regional differences exist there too.” Simões supplements his textual analyses with his travels around Portugal. “Ten years ago my relatives in Portugal said they couldn't understand what I was saying,” he observes. “Today, probably because of the influence of Brazilian telenovelas on television, they say I'm finally speaking more correct Portuguese.”

“We have kept the older rhythm of speech, while the Europeans began to speak faster from the eighteenth century onward,” observes Ataliba Castilho, professor emeritus at USP, who over the past 40 years has planned and coordinated several research projects on spoken Portuguese and on the history of Portuguese in Brazil. “Until the sixteenth century,” he says, “Brazilian and European Portuguese were like Spanish, with a hard syllabic cut. The spoken word was very close to the written one.” Célia Lopes adds another difference: Brazilian Portuguese preserves most vowels, whereas the Europeans generally omit them, stressing the consonants, and would say something like tulfón when referring to the telefone (telephone).

There are also many words with different meanings on either side of the Atlantic. Students at private universities in Portugal do not pay a mensalidade (monthly tuition fee) but a propina; a bolsista (scholarship holder) is a bolseiro. And because the Europeans did not adopt some words used in Brazil, such as bunda (buttocks), of African origin, embarrassing situations can arise. Vanderci Aguilera, a senior professor at the State University of Londrina (UEL) and one of the linguists working to recover the history of Brazilian Portuguese, took a Portuguese friend to a clothing store. To check whether a dress she had just tried on fit well in the back, the friend asked her: “O que achas do meu rabo?” – in Portugal an innocent question about how the dress sat on her backside, but in Brazil a far cruder one.

The soldier and the farmer's daughter
In the collection of documents on the evolution of São Paulo Portuguese is a letter from 1807, written by the soldier Manoel Coelho, who had allegedly seduced a farmer's daughter. When he found out, the girl's enraged father forced the young man to marry her. The soldier, however, dug in his heels: he would not marry, he wrote, “nem por bem nem por mar.” Simões was puzzled by the reference to the sea (mar), since the quarrel took place in the then-village of São Paulo, but then it dawned on him: “There's the caipira R! He meant ‘nem por bem nem por mal’ (‘neither willingly nor by force’)!” The soldier wrote as he spoke. Whether he married the farmer's daughter is not known, but he left valuable evidence of how people spoke at the beginning of the nineteenth century.

“The caipira R was one of the features of the language spoken in the village of São Paulo which, little by little, with growing urbanization and the arrival of European immigrants, was pushed out to the periphery or to other towns,” says Simões. “It was the language of the bandeirantes.” Specialists believe that the first inhabitants of the village of São Paulo, besides saying porrta, also skipped consonants in the middle of words, saying muié instead of mulher (woman), for example. To capture Indians and, later, to find gold, the bandeirantes first conquered the interior of São Paulo, taking their vocabulary and their way of speaking with them. The exaggerated R can still be heard in the towns of the so-called Middle Tietê, such as Santana de Parnaíba, Pirapora do Bom Jesus, Sorocaba, Itu, Tietê, Porto Feliz and Piracicaba, whose residents, especially country people, were portrayed by the Itu-born painter José Ferraz de Almeida Júnior, until he was murdered by his lover's husband in Piracicaba. The bandeirantes then moved on to other forests of the immense Captaincy of São Paulo, established in 1709 with the territories of the present-day states of São Paulo, Mato Grosso do Sul, Mato Grosso, Rondônia, Tocantins, Minas Gerais, Paraná and Santa Catarina (see map).

Manoel Mourivaldo Almeida, also of USP, found traces of old São Paulo Portuguese in Cuiabá, the capital of Mato Grosso, which has had relatively little linguistic and cultural interaction with other cities since the end of the gold-mining boom two centuries ago. “Cultivated Portuguese of the sixteenth and seventeenth centuries had a hushed S,” Almeida concludes. “When the paulistas went to the Center-West, they spoke the way cariocas do today!” The Cuiabá actor and theater director Justino Astrevo de Aguiar acknowledges the São Paulo and Rio heritage, but considers a more evident trait of local speech the habit of adding a J or a T before or in the middle of words, as in djeito, cadju or tchuva, a pronunciation typical of the seventeenth century that Almeida also identified among residents of Goiás, Minas Gerais, Maranhão and the region of Galicia, in Spain.

Almeida sharpened his ear for the variations of Portuguese in Brazil because of his own history. The son of Portuguese parents, he was born in Piritiba, in the interior of Bahia, left at age 7, lived in Jaciara, in the interior of Mato Grosso, then spent 25 years in Cuiabá, where he taught at the federal university, and moved to São Paulo in 2003. He admits that he speaks like a paulista on more formal occasions – although he prefers to say éxtra rather than êxtra, as paulistas do – but when he relaxes he slips into the Bahian rhythm of speech and the Mato Grosso vocabulary. He has been studying Cuiabá speech since 1991, at the suggestion of a fellow professor, Leônidas Querubim Avelino, a Camões specialist who had detected signs of archaic Portuguese there. Avelino told him that a blind farmhand from Livramento, 30 kilometers from Cuiabá, once remarked that he was “andando pusilo,” meaning weak. Avelino recognized a reduced form of pusilânime (pusillanimous), no longer used in Portugal.

“The residents of Cuiabá and of a few other cities, such as Cáceres and Barão de Melgaço, in Mato Grosso, and Corumbá, in Mato Grosso do Sul, preserve the São Paulo Portuguese of the eighteenth century more than the paulistas themselves. Paulistas, in the interior and in the capital, now say dia with a dry d, whereas in most of Brazil people say djia,” Almeida observed. “The way one speaks can change depending on access to culture, on motivation, and on the ability to perceive and articulate sounds differently. Whoever looks in the places farthest from the big urban centers will find signs of preservation of old Portuguese.”

Rua 25 de março, 1894, by Antonio Ferrigno (Collection of the Pinacoteca do Estado de São Paulo). The city of São Paulo had an accent of its own

From 1998 to 2003, a team coordinated by Heitor Megale, of USP, followed the route of the sixteenth-century bandeiras in search of traces of old Portuguese that might have persisted over four centuries. Interviews with residents aged 60 to 90 in nearly 40 towns and villages in Minas Gerais, Goiás and Mato Grosso brought to light forgotten terms such as mamparra (pretense) and mensonha (lie), a word from a fifteenth-century poem by Francisco de Sá de Miranda; treição, used in the interior of Goiás in the sense of surprise; and popular terms still used in Portugal, such as despois, percisão and tristura, common in southern Minas. What had seemed an anachronism gained value. Saying sancristia instead of sacristia was not a mistake, “but an influence preserved from the past, when that was how it was pronounced,” reported the Jornal da Manhã, of Paracatu, Minas Gerais, on December 20, 2001.

To the north, the Portuguese language spread into the interior from the city of Salvador, which was the capital of colonial Brazil for three centuries. Salvador was also a center of linguistic ferment, receiving multitudes of African slaves who learned Portuguese as a foreign language but also contributed their own vocabulary, to which indigenous words had already been added.

To keep the language of Camões from being disfigured as it crossed with native dialects, Sebastião José de Carvalho e Melo, the Marquis of Pombal, the kingdom's secretary of state, decided to act. In 1757 Pombal expelled the Jesuits – among other, political reasons, because they were teaching Christian doctrine in indigenous languages – and by decree made Portuguese the official language of Brazil. Portuguese imposed itself over the native languages and is still the official language today, although linguists caution that it cannot be called a national language because of the 180 indigenous languages spoken in the country (there were an estimated 1,200 when the Portuguese arrived). This linguistic mixing, which reflects the blend of peoples that formed the country, explains much of the regional variation in vocabulary and rhythm, summarized in a map of ways of speaking at the Museu da Língua Portuguesa, in São Paulo. Variation is easy to find within a single state: residents of northern Minas Gerais speak like Bahians, those of the central region keep the authentic mineirês, in the south the São Paulo influence is strong, and in the east the way of speaking resembles the Rio accent.

The pandorga and the bigato
For 10 years a group of linguists has been studying one of the results of this linguistic mixing: the different names by which the same object can be called, recorded through interviews with 1,100 people in 250 localities. Across Brazil, the toy made of paper and sticks that is flown on the wind at the end of a line is called papagaio, pipa, raia or pandorga – or even coruja in Natal and João Pessoa – according to the first volume of the Atlas linguístico do Brasil, published in October 2014 with the results of the interviews in the state capitals (Editora UEL). The device with red, yellow and green lights used at street intersections to regulate traffic is called simply sinal in Rio de Janeiro and Belo Horizonte, and also semáforo in the capitals of the North and Northeast. Goiânia recorded four names for the same object: sinal, semáforo, sinaleiro and farol.

The search for explanations for these differences is only beginning. “Where I was born, in Sertanópolis, 42 kilometers from Londrina,” said Vanderci Aguilera, one of the Atlas coordinators, “we call the guava worm bigato, an influence of the settlers, Italian immigrants who came from the interior of São Paulo.” According to her, residents of the three southern states call the urubu (vulture) corvo (crow), a European influence, while those of the Southeast kept the Tupi name, urubu.

Cena de família de Adolfo Augusto Pinto (Family Scene of Adolfo Augusto Pinto), 1891, by Almeida Júnior (Collection of the Pinacoteca do Estado de São Paulo). By the end of the nineteenth century the pronoun você was already more formal than tu

Each state – or region – has its own linguistic heritage, which should be respected, the specialists emphasize. Portuguese teachers, Vanderci warns, should not scold students for calling the hummingbird cuitelo, as is common in the interior of Paraná, nor reprimand those who say caro, churasco or baranco (for carro, churrasco, barranco), as is common among descendants of Poles and Germans in the South, but should teach other ways of speaking and let the kids express themselves as they wish when they are with family or friends. “Nobody speaks wrongly,” she stresses. “Everybody speaks according to their life history, to what was passed on by their parents and later modified by school. Our speech is our identity; we have no reason to be ashamed.”

The diversity of Brazilian Portuguese is so great that, despite the efforts of anchors on national TV news programs to craft a neutral language stripped of local accents, “there is no national standard,” Castilho maintains. “There are differences of vocabulary, grammar, syntax and pronunciation even among people who follow the cultivated norm,” he says. Dissatisfied with imported theories, Castilho created a multisystemic approach to language, according to which any linguistic expression simultaneously mobilizes four planes (lexicon, semantics, discourse and grammar), which should be viewed in an integrated way rather than separately. Together with Verena Kewitz, of USP, he has been debating this approach with graduate students and with other specialists in Brazil and abroad.

It is also clear that Brazilian Portuguese continually remakes itself. Words can die or take on new meanings. Almeida recounted that Celciane Vasconcelos, one of the students in his group, found that only the oldest residents of the Paraná coast knew the word sumaca, a type of boat once common there and no longer built, which stripped the word of its old usefulness; today it names a beach in Paraty (RJ). Old ways of speaking can also resurface. The caipira R, the linguists assure us, is coming back, even in São Paulo, and regaining status on the heels of sertanejo singers. “Being caipira is chic nowadays,” says Vanderci. Or it is at least acceptable, part of one's personal style, like that of TV host Sabrina Sato.

Love notes
Linguists have also noted the spread of informal address. “I'm 78 and should be addressed as senhor, but my younger students address me as você,” says Castilho, apparently untroubled by an informality that would have been inconceivable in his student days. Você, however, will not reign alone. Célia Lopes and her UFRJ team found that tu predominates in Porto Alegre and coexists with você in Rio de Janeiro and Recife, while você is the predominant form of address in São Paulo, Curitiba, Belo Horizonte and Salvador. Tu was already closer and less formal than você in the nearly 500 letters of UFRJ's online collection, almost all written by poets, politicians and other public figures of the late nineteenth and early twentieth centuries.

Since the speech of ordinary people was still missing, Célia and her team were thrilled to find 13 notes written in 1908 by Robertina de Souza to her lover and to her husband. The material was part of a criminal case against the husband, who threw a friend and his own wife out of his house upon learning that the two had had an extramarital affair, and later killed the former friend. In one of the 11 notes to her lover, Álvaro Mattos, Robertina, who signed as Chininha, wrote: “Eu te adoro te amo até a morte sou tua só tu é meu só o meu coracao e teu e o teu coracao é meu. Chininha e todinha tua ate a morte” (“I adore you I love you until death I am yours alone you are mine alone my heart is yours and your heart is mine. Chininha is all yours until death”). Her husband, Arthur Noronha, who received only two notes, she addressed more formally: “Eu rezo pedindo a Deus para você me perdoar, mas creio que voce não tem coragem de ver morrer um filho o filha” (“I pray asking God for you to forgive me, but I believe you do not have the heart to watch a son or daughter die”). And further on: “Não posso me separar de voce e do meu filho a não ser com a morte” (“I cannot be separated from you and my son except by death”). Whether she returned home is not known, but the husband was acquitted after arguing that he had killed the other man in defense of his honor.

Another sign of the evolution of Brazilian Portuguese is its hybrid constructions, with a verb that no longer agrees with the pronoun, as in tu não sabe? (“don't you know?”), and the mixing of the address pronouns você and tu, as in “se você precisar, vou te ajudar” (“if you need it, I'll help you”). European Portuguese speakers might claim this as one more proof of our ability to disfigure the Lusitanian language, but perhaps they have little reason to complain. Célia Lopes found this mixing of address pronouns, which she and other linguists no longer consider an error, in letters of the Marquis of Lavradio, viceroy of Brazil from 1769 to 1796, and, more than two centuries later, in an interview with former president Fernando Henrique Cardoso.

Project
History of São Paulo Portuguese Project (PHPP – Projeto Caipira) (no. 11/51787-5); Type: Thematic Project; Principal investigator: Manoel Mourivaldo Santiago Almeida (USP); Investment: R$ 87,372.10 (FAPESP).

Media files on the Anthropocene

Confronting the ‘Anthropocene’ (Dot Earth, New York Times)

London from orbit

NASA astronaut Donald R. Pettit took this photograph of London while living aboard the International Space Station. (NASA)

LONDON — I’m participating in a one-day meeting at the Geological Society of London exploring the evidence for, and meaning of, the Anthropocene. This is the proposed epoch of Earth history that, proponents say, has begun with the rise of the human species as a globally potent biogeophysical force, capable of leaving a durable imprint in the geological record.

This recent TEDx video presentation by Will Steffen, the executive director of the Australian National University’s Climate Change Institute, lays out the basic idea:

There’s more on the basic concept in National Geographic and from the BBC. Paul Crutzen, the Nobel laureate in chemistry who, with others, proposed the term in 2000, and Christian Schwägerl, the author of “The Age of Man” (German), described the value of this new framing for current Earth history in January in Yale Environment 360:

Students in school are still taught that we are living in the Holocene, an era that began roughly 12,000 years ago at the end of the last Ice Age. But teaching students that we are living in the Anthropocene, the Age of Men, could be of great help. Rather than representing yet another sign of human hubris, this name change would stress the enormity of humanity’s responsibility as stewards of the Earth. It would highlight the immense power of our intellect and our creativity, and the opportunities they offer for shaping the future. [Read the rest.]

I’m attending because of a quirky role I played almost 20 years ago in laying the groundwork for this concept of humans as a geological force. A new paper from Steffen and three coauthors reviewing the conceptual and historic basis for the Anthropocene includes an appropriately amusing description of my role:

Biologist Eugene F. Stoermer wrote: ‘I began using the term “anthropocene” in the 1980s, but never formalized it until Paul [Crutzen] contacted me’. About this time other authors were exploring the concept of the Anthropocene, although not using the term. More curiously, a popular book about Global Warming, published in 1992 by Andrew C. Revkin, contained the following prophetic words: ‘Perhaps earth scientists of the future will name this new post-Holocene period for its causative element—for us. We are entering an age that might someday be referred to as, say, the Anthrocene [sic]. After all, it is a geological age of our own making’. Perhaps many readers ignored the minor linguistic difference and have read the new term as Anthro(po)cene!

If you’ve been tracking my work for a while, you’re aware of my focus on the extraordinary nature of this moment in both Earth and human history. As far as science can tell, there’s never, until now, been a point when a species became a planetary powerhouse and also became aware of that situation.

As I first wrote in 1992, cyanobacteria are credited with oxygenating the atmosphere some 2 billion years ago. That was clearly a more profound influence on a central component of the planetary system than humans raising the concentration of carbon dioxide 40 percent since the start of the industrial revolution. But, as far as we know, cyanobacteria (let alone any other life form from that period) were neither bemoaning nor celebrating that achievement.
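As a rough check on that 40 percent figure, here is a minimal sketch comparing it against the commonly cited pre-industrial CO2 level of about 280 parts per million; the 280 ppm baseline is my assumption, not a number stated in the post.

```python
# Back-of-envelope check of the "40 percent" CO2 increase mentioned above.
# Assumption (not from the post): pre-industrial CO2 was roughly 280 ppm.
preindustrial_ppm = 280.0
increase = 0.40

modern_ppm = preindustrial_ppm * (1 + increase)
print(f"A 40% rise over {preindustrial_ppm:.0f} ppm gives about {modern_ppm:.0f} ppm")
# ~392 ppm, consistent with atmospheric measurements from around 2011,
# when this post was written.
```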

It was easier to be in a teen-style resource binge before science began to delineate an edge to our petri dish.

We no longer have the luxury of ignorance.

We’re essentially in a race between our potency, our awareness of the expressed and potential ramifications of our actions and our growing awareness of the deeply embedded perceptual and behavioral traits that shape how we do, or don’t, address certain kinds of risks. (Explore “Boombustology” and “Disasters by Design” to be reminded how this habit is not restricted to environmental risks.)

This meeting in London is two-pronged. It is in part focused on deepening basic inquiry into stratigraphy and other branches of earth science and clarifying how this human era could qualify as a formal chapter in Earth’s physical biography. As Erle C. Ellis, an ecologist at the University of Maryland, Baltimore County, put it in his talk, it’s unclear for the moment whether humanity’s impact will be long enough to represent an epoch, or will more resemble “an event.” Ellis’s presentation was a mesmerizing tour of the planet’s profoundly humanized ecosystems, which he said would be better described as “anthromes” than “biomes.”

Ellis said it was important to approach this reality not as a woeful situation, but an opportunity to foster a new appreciation of the lack of separation of people and their planet and a bright prospect for enriching that relationship. In this his views resonate powerfully with those of Rene Dubos, someone I’ll be writing about here again soon.

Through the talks by Ellis and others, it was clear that the scientific effort to define a new geological epoch, while important, paled beside the broader significance of this juncture in human history.

In my opening comments at the meeting, I stressed the need to expand the discussion from the physical and environmental sciences into disciplines ranging from sociology to history, philosophy to the arts.

I noted that while the “great acceleration” described by Steffen and others is already well under way, it’s entirely possible for humans to design their future, at least in a soft way, boosting odds that the geological record will have two phases — perhaps a “lesser” and “greater” Anthropocene, as someone in the audience for my recent talk with Brad Allenby at Arizona State University put it.

I also noted that the term “Anthropocene,” like phrases such as “global warming,” is sufficiently vague to guarantee it will be interpreted in profoundly different ways by people with different world views. (As I explained, this is as true for Nobel laureates in physics as it is for the rest of us.)

Some will see this period as a “shame on us” moment. Others will deride this effort as a hubristic overstatement of human powers. Some will argue for the importance of living smaller and leaving no scars. Others will revel in human dominion as a normal and natural part of our journey as a species.

A useful trait will be to get comfortable with that diversity.

Before the day is done I also plan on pushing Randy Olson’s notion of moving beyond the “nerd loop” and making sure this conversation spills across all disciplinary and cultural boundaries from the get-go.

There’s much more to explore of course, and I’ll post updates as time allows. You might track the meeting hash tag, #anthrop11, on Twitter.

*   *   *

8/16/2014 @ 1:05PM – James Conca

The Anthropocene Part 1: Tracking Human-Induced Catastrophe On A Planetary Scale (Forbes)

For almost 30 years, we geologists have been having a debate about what Geologic Epoch we find ourselves in right now. It is presently called the Holocene, but some want to add another epoch and call it the Anthropocene.

Anthropocene combines the Greek words for human and recent time period, to denote the period of time since human activity went global and we became an important geologic process in our own right (In These Times; see also UN video http://vimeo.com/39048998).

In other words, what should we call this period of time when we started trashing the planet? And when did it begin?

You might know some of the big Geologic Ages, Epochs, Periods, Eras and Eons. The dinosaurs died out 66 million years ago in the Maastrichtian Age in the Late Epoch of the Cretaceous Period in the Mesozoic Era of the Phanerozoic Eon.

The sum of Earth’s history, divided up into these different sections, is called the Geologic Time Scale, and geologists have been defining and refining it for about 150 years (Geological Society of America). The latest deliberation concerns the epoch stretching from the present back about 12,000 years. The term Anthropocene was popularized fourteen years ago by atmospheric chemist Paul Crutzen and biologist Eugene Stoermer, to more decisively focus discussion on human global effects on the whole planet.

The Geologic Time Scale – the sectioning up of 4.6 billion years of Earth’s history into manageable time periods bounded by significant geological events. The present debate: do we want to call the present epoch the Anthropocene, to reflect humanity’s global effect on the planet? Source: The Geological Society of America

To Judith Wright, a chemostratigrapher, this is not just an academic exercise for the Ivory Tower (Ocean Redox Variations Through Time). There are obvious political and social implications, not the least being the role of human activities in climate change, wholesale extinctions of species unlike anytime in history, and the accelerating environmental destruction on a planetary scale that could spell our own doom.

What we call something matters. It sets the scale of importance and pushes the discussion in the direction that we need these debates to go.

The scientific decision on whether to incorporate the term Anthropocene into the geologic lexicon falls to a carefully deliberative group called the International Commission on Stratigraphy, particularly its Subcommission on Quaternary Stratigraphy which has formed an Anthropocene Working Group. There is a rumor they may arrive at a decision after only several years of debate, making this deliberation downright hasty.

This discussion concerns not just what to name it, but when it started. There is always some defining characteristic to one of these epochs or periods. Often a huge extinction event marks the end of an epoch, like the end of the Triassic Period 200 million years ago when half of all life perished. Or the appearance of something new in evolution, like the beginning of the Cambrian Period when organisms learned how to grow shells, bones, teeth and other hard parts from the increased dissolved minerals in seawater, providing huge survival advantages.

Sometimes there is a chemical marker, or layer, that is unique to that time or process, like the iridium layer that is one of the singular markers of the huge meteorite impact that finally stressed the dinosaurs’ environment to the point of extinction.

But why do stratigraphers get to decide this question? In the geologic sciences, stratigraphy is the study of rock layers on the Earth – how they relate to each other and to relative and absolute time, how they got there, and what they tell us about Earth history. Stratigraphy can be traced to the beginnings of modern geology in the 17th century with Nicholas Steno and his three stratigraphic rules:

–   the law of superposition (younger rocks lay on top of older ones)

–   the principle of original horizontality (all sedimentary and volcanic layers are originally laid down horizontally, or normal to the Earth’s gravitational field)

–   the principle of lateral continuity (sedimentary layers generally continue for a long distance as most were laid down in the ocean, and an abrupt edge indicates that something happened to break it off, like a fault or surface erosion).

These were profound observations and fundamentally changed how we understood time and geologic processes like flooding and earthquakes. In fact, the unique perspective that geology brings to humans is the understanding of time at all levels.

According to Patricia Corcoran and co-workers (GSA), material being laid down across the Earth in our present time is the most bizarre mix of chemicals and materials the Earth has ever seen, some compounds of which have never occurred in nature. Maybe compounds that could not have been produced naturally would make a good marker for the Anthropocene.

But defining a geologic time period requires a sufficiently large, clear and distinctive set of geologic characteristics to be useful as a global geologic boundary and that will also survive throughout geologic time.

We presently find ourselves in the Holocene Epoch of the Quaternary Period in the Cenozoic Era of the Phanerozoic Eon, defined as starting from the end of the last glacial retreat, an obvious event that led to our present global range in climate and other characteristics we define as the Earth today. The problem is, humans have dominated this entire epoch in many ways, from the dawn of agriculture, to smelting of iron and lead, burning of forests and finally the effects of the industrial revolution.

As in all environmental issues, it’s the number of people on Earth that’s the problem. For over 100,000 years, the global human population was steady at about 10 million. Then civilization appeared, fueled by the many significant developments that humans had begun to apply en masse including agriculture, domesticated animals, tools, serious engineering, and various uses of fire, fibers and the wheel.

Our population began to increase about 2000 years ago, at the beginning of the Common Era, rising to 300 million during the Middle Ages and to a billion at the beginning of the Industrial Age. Then 2 billion in 1927, 3 billion in 1960, 4 billion in 1974, 5 billion in 1987, 6 billion in 1999 and 7 billion in 2011. This exponential rise is textbook for a bacterial colony in a petri dish, right before it dies from outpacing its food sources and generating too much waste. It’s also eerily analogous for people on the petri dish of Earth.
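A quick way to see what “exponential” means here is to compute the average annual growth rate implied by each pair of milestones in the paragraph above. This is only an illustrative sketch: the 1800 date attached to “a billion at the beginning of the Industrial Age” is my assumption, while the other years come from the text.

```python
# Average annual growth rates implied by the population milestones cited above.
# The rate r over an interval solves P2 = P1 * (1 + r) ** years.
milestones = [
    (1800, 1e9),  # "a billion at the beginning of the Industrial Age" (assumed year)
    (1927, 2e9),
    (1960, 3e9),
    (1974, 4e9),
    (1987, 5e9),
    (1999, 6e9),
    (2011, 7e9),
]

for (y1, p1), (y2, p2) in zip(milestones, milestones[1:]):
    years = y2 - y1
    rate = (p2 / p1) ** (1 / years) - 1
    print(f"{y1}-{y2}: about {rate:.2%} per year over {years} years")
# The implied rate climbs from roughly 0.5% a year in the 19th century to
# about 2% a year around the 1960s-70s, and each additional billion arrives
# in fewer years -- the pattern the petri-dish comparison is pointing at.
```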

Humans now comprise the largest mass of vertebrate matter on land on the entire Earth. The rest is almost all our food and friends, mainly the animals we domesticated over the last 50,000 years, plus a bunch of xenobiotics we’ve transported far from their habitats (Cornell University). Only a small percentage of all vertebrate mass on land is wild or natural (In These Times).

Let that sink in for a minute. Most of what people see in National Geographic or on the Discovery Channel or in movies about animals IS ALMOST ALL GONE. Humans have dammed a third of the world’s rivers and have covered, destroyed or altered almost half of the world’s land surface. We use up most of our fresh water faster than it can be replenished. And we drive some 30,000 species to extinction every year.

Whether from deforestation, agriculture, urbanization, roads, mining activities, aquatic farming, moving xenobiotic species around the world that destroy native species, dumping huge amounts of waste on land, in the ocean and in the atmosphere, and all other human activities, we have decimated the natural environment without thinking about what effect it has on global ecosystems and what it takes for our own species to survive.

There is a point where humans, our pets, our food animals and our food crops cannot survive without some aspects of a wild nature. Much of our crops need pollinators. The oxygen on this planet comes mainly from organisms in the top 300 feet of the ocean. Biodiversity is not just an environmental catch-phrase, it’s a necessity for survival.

So when did the Anthropocene begin?

Was it when agriculture began, when we started burning forests to clear land? Was it during the Iron Age when we clear-cut the northern forests to smelt iron ore?

Was it the advent of civilization, particularly the rise in agriculture and mining activities around the Mediterranean by the Phoenicians, Greeks and Romans, signified by a rise in environmental Pb levels (Shotyk et al, 1998) that continued until just recently?

Was it the 19th century, when our population passed a billion and we began burning fossil fuels, which has made carbon itself a global marker? Was it the 20th century, when humans surpassed plate tectonics as the primary mechanism for moving rock and dirt on this planet?

Access to huge amounts of chemical energy trapped in fossil fuels allowed human populations to explode and allowed human effects to really go global. This is why most researchers point to the mid-19th century as the obvious time to start the Anthropocene.

One popular idea for the start of the Anthropocene is the beginning of the atomic age. Above-ground nuclear tests spread unique radionuclides like Pu around the world, elements that have not been seen in our solar nebula for six billion years, but now show up in surface sediments and ice cores, albeit in minute concentrations.

Crutzen recently gave his support for this point in time. However, the atomic age is just another aspect of the modern age of humans not associated with a particular change in a global characteristic. It cannot be seen in the field and marks no special geologic event, and would be more of a political or sociological marker than a geological one.

Perhaps we should wait until the end of this century when the worst effects will be upon us and the world will barely be recognizable to anyone living today. We might be able to just point to the time “when there used to be forests.”

In the end, this debate might be sheer hubris, since when we are gone, future geologists, of whatever species, will decide for themselves where they want to place the beginning of this particular catastrophe.

So…what’s your vote for when we start the Anthropocene?

*   *   *

2/29/2012 @ 11:55PM – Jayne Jung

On The Anthropocene Age (Forbes)


The term “Anthropocene Age” is now common. It represents the time since the early 1800s when man began to have an impact on the Earth’s climate. There is still some debate about its use, though. Scientists have traditionally called the current period the Holocene Age, meaning “entirely new.” The Holocene Age started around 10,000 BC, after the last glacial period, when there was significant glacier movement.

From the German newspaper Spiegel Online, interview with British geologist Jan Zalasiewicz:

SPIEGEL ONLINE: The idea of mankind as the owner of Earth — is that not disconcerting?

Zalasiewicz: While we now may be said to “own” the planet, that is not the same as controlling it. Our global experiment with CO2 is something that virtually every government would like to see brought under control, and yet collectively we are, at present, unable to do so. That seems to call for feelings of something other than hubris.

SPIEGEL ONLINE: Do you see more to Anthropocene than just a geological term? Is it a new way of thinking about mankind’s role on Earth?

Zalasiewicz: That is almost certainly part of the attraction of the term for the wider public. The term does encapsulate and integrate a wide range of phenomena that are often considered separately. It also provides a sense of the scale and significance of anthropogenic global change. It emphasizes the importance of the Earth’s long geological history as a context within which to better understand what is happening today.

Interview conducted by Christian Schwägerl

From the New York Times‘ editorial “The Anthropocene”:

Other species are embedded in the fossil record of the epochs they belong to. Some species, like ammonites and brachiopods, even serve as guides — or index fossils — to the age of the rocks they’re embedded in. But we are the only species to have defined a geological period by our activity — something usually performed by major glaciations, mass extinction and the colossal impact of objects from outer space, like the one that defines the upper boundary of the Cretaceous.

Humans were inevitably going to be part of the fossil record. But the true meaning of the Anthropocene is that we have affected nearly every aspect of our environment — from a warming atmosphere to the bottom of an acidifying ocean.

From Yale’s website “The Anthropocene Debate: Marking Humanity’s Impact” by Elizabeth Kolbert:

One argument against the idea that a new human-dominated epoch has recently begun is that humans have been changing the planet for a long time already, indeed practically since the start of the Holocene. People have been farming for 8,000 or 9,000 years, and some scientists — most notably William Ruddiman, of the University of Virginia — have proposed that this development already represents an impact on a geological scale. Alternatively, it could be argued that the Anthropocene has not yet arrived because human impacts on the planet are destined to be even greater 50 or a hundred years from now.

“We’re still now debating whether we’ve actually got to the event horizon, because potentially what’s going to happen in the 21st century could be even more significant,” observed Mark Williams, a member of the Anthropocene Working Group who is also a geologist at the University of Leicester.

I personally do not want to know what that “event horizon” is.

*   *   *

The Dawning of the Age of Anthropocene (In These Times)

By altering the earth, have humans ushered in a new epoch?

BY JESSICA STITES

By inviting awe rather than—or along with—terror, the Anthropocene may offer a way to grapple with climate change rather than deny it.

Scientists have been clanging the alarm about human-caused climate change, trying to bring around the 33 percent of Americans who don’t believe the earth is warming and the 18 percent of believers in global warming who think the process is “natural.” But as climate scientists race against time to convince the resisters, another branch of science, geology, is taking the tortoise approach. For more than a decade, geologists have been debating whether to officially declare the existence of a new epoch, the “Anthropocene,” to acknowledge that humans are radically reshaping the earth’s surface. Some geologists see the Anthropocene model as a way to widen the lens on human impact on the earth and cut through both the alarmism and the resistance to spur concrete policy solutions.

Geologists divide the earth’s 4.5 billion-year history into eons, which are subdivided into eras, periods and epochs. Each division is marked by a discernable change in the earth’s strata, such as a new type of fossil representing a major evolutionary shift. On this global scale, humans are relative infants, just 200,000 years old. Civilization has sprung up only during the most recent epoch, the relatively warm interglacial Holocene, spanning the past 11,700 years.

But the Holocene’s days may be numbered. In 2000, the late biologist Eugene F. Stoermer and Nobel Prize-winning atmospheric chemist Paul Crutzen made a radical proposal in the International Geosphere-Biosphere Programme newsletter: that the Industrial Revolution’s explosion of carbon emissions had ushered in a new epoch, the Anthropocene, marked by human impact. In 2009, the International Commission on Stratigraphy’s Subcommission on Quaternary Stratigraphy assembled an Anthropocene Working Group, which in 2016 will issue a recommendation on whether to formally adopt the term.

For stratigraphers, this is “breakneck speed,” says University of Leicester paleobiologist Jan Zalasiewicz, the head of the working group. The geological time scale is as fundamental to the discipline as the periodic table is to chemistry. The last official change, when the Ediacaran Period supplanted the Vendian in 2004, came after 20 years of debate and shook up a scale that had been static for 120 years.

Whether or not the term is formalized, Zalasiewicz believes in the Anthropocene’s potential to alter perception, for geologists and non-geologists alike. “ ‘Anthropocene’ places our environmental situation in a different perspective,” he says. “You look at it sideways, as it were.”

Certainly, you look at it from a broader perspective. Zooming out to an aeonic scale is a sober way to assess the significance of climate change. Overall, human emissions have driven up global temperatures about 0.85 degrees Celsius since the 19th century, according to the UN’s Intergovernmental Panel on Climate Change. Deniers are fond of noting that Earth’s current temperatures are not the highest they’ve been, and that the earth cyclically warms and cools. That’s true. There have been several “hyperthermal” events when carbon dioxide has been released by the ton from the ground or the ocean, and the earth’s temperature has spiked. However, as Zalasiewicz and colleague Mark Williams note in a recent paper in the Italian journal Rendiconti Lincei, carbon is being released today more than 10 times faster than during any of these events. And when the other hyperthermal events began, the earth was already relatively warm, without today’s ice-capped poles that threaten to melt and pump up sea levels. Already, warming has eroded Arctic sea ice and altered the territory of climate-sensitive species—effects potentially stark enough by themselves to declare an Anthropocene.

But even if we set aside climate change, there are still plenty of arguments for an Anthropocene, says Zalasiewicz. Take mineral traces: He calculates that humans have purified a half-billion tons of aluminum, enough to cover the United States in tinfoil. We’ve also made never-before-seen alloys, like the boron carbide used in bullet-proof vests and the tungsten carbide that forms the balls in ballpoint pens. And then there are plastics—more than 250 million tons are now produced annually. Zalasiewicz and his colleagues term these various human-made objects “technofossils.”

Technofossils aren’t the only things that humans scatter; we’ve also spread germs, seaweed, strawberries, sparrows, rats, jicama, cholera … the list goes on. You might see us as bees pollinating the earth or, less flatteringly, dung beetles scuttling our shit around. Today, according to a 1999 Cornell study, 98 percent of U.S. food—both animal- and plant-derived—is non-native.

Then there’s the population explosion. Humans now make up nearly one-fourth of total land vertebrate biomass, and most of the rest is our food animals, with wild animals—the lions, tigers and bears—making up a mere 3 percent. Those proliferating people suck up resources: According to calculations by Stanford geology PhD student Miles Traer, humans use the equivalent of 13 barrels of oil per person every year, and the United States consumes 345 billion gallons of fresh water a day, enough to drain all of Lake Michigan every 10 years.
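The Lake Michigan comparison is easy to sanity-check. The sketch below uses a rough figure of 1,180 cubic miles for the lake's volume, which is my assumption rather than a number from the article, together with the article's 345 billion gallons per day.

```python
# Sanity check of "enough to drain all of Lake Michigan every 10 years."
# Assumption (not from the article): Lake Michigan holds ~1,180 cubic miles of water.
GALLONS_PER_CUBIC_MILE = 1.101e12      # about 1.1 trillion US gallons per cubic mile
lake_michigan_gal = 1180 * GALLONS_PER_CUBIC_MILE

us_daily_use_gal = 345e9               # 345 billion gallons per day, per the article
ten_years_gal = us_daily_use_gal * 365 * 10

print(f"Lake Michigan volume: ~{lake_michigan_gal:.2e} gallons")
print(f"US freshwater use over 10 years: ~{ten_years_gal:.2e} gallons")
# Both figures land near 1.3e15 gallons, so the ten-year comparison holds
# to within rounding.
```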

In the face of all this, few geologists deny that stratigraphers a billion years from now (either human or alien, depending on your level of optimism) will see our current period as a major shift in the geological record. So the main question facing the Anthropocene Working Group is when the Anthropocene began, or in geological parlance, where to plant “the golden spike.” Four possibilities are under consideration.

One camp favors the 1800s, when megacities began to emerge and form unique strata of compressed rubble and trash heaps. London was the first, with a population of 3 million in 1850. With megacities came industrialization and a sudden escalation in CO2 emissions that began to push up global temperatures.

A second group, including many archeologists, argues for an earlier date. University of Virginia paleoclimatology professor William F. Ruddiman calculates that the first Agricultural Revolution, which began some 12,000 years ago, caused greater climate effects than have yet been seen from the Industrial Revolution. Due to the widespread preindustrial use of fire to clear forests, Ruddiman believes that emissions of greenhouse gases, coupled with the loss of forests’ cooling effect, caused the earth’s temperature to be 1.3 to 1.4 degrees Celsius higher than if humans had never existed, warding off an overdue Ice Age.

However, where to plant the golden spike in this model isn’t entirely clear, since the changes happened over thousands of years. One option would be the mass extinction of large animals in the Americas some 12,500 years ago, believed to be caused by humans. In that case, the Anthropocene would supplant the Holocene.

A third school believes the Anthropocene should be dated to the mid-20th century, when the “Great Acceleration” began and population, urbanization, globalization and CO2 emissions took off. The past 60-odd years have seen a doubling of the human population and the release of three-quarters of all human-caused CO2 emissions. Zalasiewicz favors this hypothesis because of the sharpness of the acceleration and the synchronicity of the changes. The automobile, for instance, proliferated worldwide in less than a century, the blink of a geologic eye. This time period contains a dramatic option for the “golden spike”: the first A-bomb tests of the 1940s, which left radionuclidic traces across the earth that are readily detectable in ice core samples pulled from the poles. Astrobiologist David Grinspoon, a proponent of the nuclear-testing golden spike, writes, “The symbolism is so potent—the moment we grasped that terrible Promethean fire that, uncontrolled, could consume the world.”

The fourth and final camp is made up of the “not yets,” who point out that everything is still accelerating—population, technofossil production, CO2 emissions—with no reason to believe things will stabilize and some reason to expect dramatic upheavals. Many geologists predict that the next two centuries will bring a mass extinction event of a magnitude seen only five times before (e.g., the dinosaur die-off). A number of drastic changes triggered by climate change may lie in store, according to an IPCC report released March 31—not only extinctions, but also food and water shortages, irreversible loss of coral reefs and Arctic ice, and “abrupt or drastic changes” like Greenland melting or the Amazon rainforest drying.

It’s here that the conversation veers from the scientific into the panicked, which does not sit well with Mike Osborne, a Stanford Ph.D. student in earth sciences and the creator of the “Generation Anthropocene” podcast. More comfortable with data and description than prediction, Osborne eschews apocalyptic thinking, and he doesn’t like the vitriol and politicization of the climate-change debate.

But of course, the issue is political, because it’s inseparable from human choices. A forthcoming peer-reviewed study funded by NASA’s Goddard Space Flight Center warns of the potential for “irreversible” collapse of civilization in the next few decades and stresses that the key factors are both environmental and social. The study says that the enormous consumption of resources is dangerous specifically because it is paired with “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor]”—noting that these two features were common to the toppling of numerous empires, from Han to Roman, over the last 5,000 years.

The issue is also emotional. One side calls the other “alarmist” and “hysterical,” which may be code for “I can’t handle hearing this.” Even for believers, there seems to be a gap between apprehending the problem and believing in a solution. Pew found that 62 percent of Americans understand that climate change is happening and 44 percent believe it’s human-caused, but when Pew asked Americans in another survey what our government’s top priority should be, climate change ranked 19th of 20 issues, just above global trade.

Osborne says he “hesitate[s] to be overly optimistic or pessimistic,” even though, with his first child just born, the earth’s future weighs on his mind. “The Anthropocene’s great utility for me in terms of imagination,” he says, “is that at its best, it comes with a sense of awe: Holy cow, the world is freaking huge and amazing and beautiful and scary.”

“The pace of change seems to be ever accelerating, but so does the response. I am loath to underestimate human ingenuity,” says Osborne. Zalasiewicz and Osborne both think that certain effects of the Anthropocene, like warming and biodiversity loss, warrant environmentalist solutions, such as measures to curb CO2 emissions, and wildlife preserves on both land and sea.

Whatever the working group proposes in 2016—which must then be affirmed by three higher bodies—Zalasiewicz believes the concept of the Anthropocene is here to stay. “If it wasn’t useful, if it was a catchphrase, it would have quite quickly fallen by the wayside,” he says. “It packages up a whole range of different isolated phenomena—ocean acidification, landscape change, biodiversity loss—and integrates them into a common signal or trend or pattern.”

Perhaps the data-driven, methodical approach of geology is what this debate needs. To Osborne, the Anthropocene is exciting because it forces us to look closely at what’s really happening and challenges our tendency to think in a nature/human divide. In the Anthropocene model, the earth’s workings and fate are intertwined with our own. Indeed, science theorist Bruno Latour goes so far as to say that the Anthropocene could mark “the final rejection of the separation between Nature and Human that has paralyzed science and politics since the dawn of modernism.”

*   *   *

Nasa-funded study: industrial civilisation headed for ‘irreversible collapse’? (The Guardian)

Natural and social scientists develop new model of how ‘perfect storm’ of crises could unravel global system

This Nasa Earth Observatory image shows a storm system circling around an area of extreme low pressure in 2010, which many scientists attribute to climate change. Photograph: AFP/Getty Images

A new study partly sponsored by Nasa’s Goddard Space Flight Center has highlighted the prospect that global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution.

Noting that warnings of ‘collapse’ are often seen to be fringe or controversial, the study attempts to make sense of compelling historical data showing that “the process of rise-and-collapse is actually a recurrent cycle found throughout history.” Cases of severe civilisational disruption due to “precipitous collapse – often lasting centuries – have been quite common.”

The independent research project is based on a new cross-disciplinary ‘Human And Nature DYnamical’ (HANDY) model, led by applied mathematician Safa Motesharrei of the US National Science Foundation-supported National Socio-Environmental Synthesis Center, in association with a team of natural and social scientists. The HANDY model was created using a minor Nasa grant, but the study based on it was conducted independently and has been accepted for publication in the peer-reviewed Elsevier journal Ecological Economics.

It finds that according to the historical record even advanced, complex civilisations are susceptible to collapse, raising questions about the sustainability of modern civilisation:

“The fall of the Roman Empire, and the equally (if not more) advanced Han, Mauryan, and Gupta Empires, as well as so many advanced Mesopotamian Empires, are all testimony to the fact that advanced, sophisticated, complex, and creative civilizations can be both fragile and impermanent.”

By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor]”. These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”

Currently, high levels of economic stratification are linked directly to overconsumption of resources, with “Elites” based largely in industrialised countries responsible for both:

“… accumulated surplus is not evenly distributed throughout society, but rather has been controlled by an elite. The mass of the population, while producing the wealth, is only allocated a small portion of it by elites, usually at or just above subsistence levels.”

The study challenges those who argue that technology will resolve these challenges by increasing efficiency:

“Technological change can raise the efficiency of resource use, but it also tends to raise both per capita resource consumption and the scale of resource extraction, so that, absent policy effects, the increases in consumption often compensate for the increased efficiency of resource use.”

Productivity increases in agriculture and industry over the last two centuries have come from “increased (rather than decreased) resource throughput,” despite dramatic efficiency gains over the same period.

Modelling a range of different scenarios, Motesharrei and his colleagues conclude that under conditions “closely reflecting the reality of the world today… we find that collapse is difficult to avoid.” In the first of these scenarios, civilisation:

“…. appears to be on a sustainable path for quite a long time, but even using an optimal depletion rate and starting with a very small number of Elites, the Elites eventually consume too much, resulting in a famine among Commoners that eventually causes the collapse of society. It is important to note that this Type-L collapse is due to an inequality-induced famine that causes a loss of workers, rather than a collapse of Nature.”

Another scenario focuses on the role of continued resource exploitation, finding that “with a larger depletion rate, the decline of the Commoners occurs faster, while the Elites are still thriving, but eventually the Commoners collapse completely, followed by the Elites.”

In both scenarios, Elite wealth monopolies mean that they are buffered from the most “detrimental effects of the environmental collapse until much later than the Commoners”, allowing them to “continue ‘business as usual’ despite the impending catastrophe.” The same mechanism, they argue, could explain how “historical collapses were allowed to occur by elites who appear to be oblivious to the catastrophic trajectory (most clearly apparent in the Roman and Mayan cases).”
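The article does not reproduce the model’s equations, but the qualitative behaviour it describes can be illustrated with a stripped-down, HANDY-style simulation. The sketch below is a toy written for this summary: the four stocks (Commoners, Elites, Nature, Wealth), the functional forms and every parameter value are illustrative assumptions, not the equations published in Ecological Economics.

```python
# A minimal, HANDY-style toy model: four stocks (Commoners, Elites, Nature,
# Wealth) coupled by extraction, consumption and class-dependent famine.
# Functional forms and every parameter value here are illustrative assumptions
# made for this sketch, not the equations published by Motesharrei et al.

def simulate(steps=20_000, dt=0.1,
             commoners=100.0, elites=1.0, nature=100.0, wealth=0.0,
             birth=0.03,          # birth rate for both classes
             death_min=0.01,      # death rate when consumption is adequate
             death_max=0.07,      # death rate under full famine
             regrowth=0.01,       # nature regeneration rate (logistic)
             capacity=100.0,      # nature carrying capacity
             depletion=0.0005,    # extraction per commoner per unit of nature
             subsistence=0.0005,  # per-capita subsistence consumption
             inequality=10.0):    # elites consume this multiple per capita
    """Integrate the toy model with forward Euler; return the trajectory."""

    def death_rate(consumed, needed):
        # Death rate climbs toward death_max as consumption falls below need.
        shortfall = 0.0 if needed <= 0 else max(0.0, 1.0 - consumed / needed)
        return death_min + shortfall * (death_max - death_min)

    history = []
    for _ in range(steps):
        # Wealth needed to feed everyone at their accustomed level.
        threshold = subsistence * (commoners + inequality * elites)
        scarcity = min(1.0, wealth / threshold) if threshold > 0 else 1.0

        # Consumption flows; elites draw a larger per-capita share of wealth.
        c_commoners = scarcity * subsistence * commoners
        c_elites = scarcity * subsistence * inequality * elites

        # Commoners starve as soon as scarcity bites; elites are judged against
        # bare subsistence only, a crude stand-in for the wealth "buffer" the
        # study says delays the effects of collapse for elites.
        d_commoners = death_rate(c_commoners, subsistence * commoners)
        d_elites = death_rate(c_elites, subsistence * elites)

        production = depletion * commoners * nature
        commoners += dt * (birth - d_commoners) * commoners
        elites += dt * (birth - d_elites) * elites
        nature += dt * (regrowth * nature * (capacity - nature) / capacity
                        - production)
        wealth += dt * (production - c_commoners - c_elites)
        nature, wealth = max(nature, 0.0), max(wealth, 0.0)
        history.append((commoners, elites, nature, wealth))
    return history

if __name__ == "__main__":
    final = simulate()[-1]
    print("final (commoners, elites, nature, wealth):", final)
```

Raising `depletion` or `inequality` produces trajectories in the spirit of the scenarios quoted above; the sketch is meant only to show the structure of such a model, not to reproduce the paper’s results.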

Applying this lesson to our contemporary predicament, the study warns that:

“While some members of society might raise the alarm that the system is moving towards an impending collapse and therefore advocate structural changes to society in order to avoid it, Elites and their supporters, who opposed making these changes, could point to the long sustainable trajectory ‘so far’ in support of doing nothing.”

However, the scientists point out that the worst-case scenarios are by no means inevitable, and suggest that appropriate policy and structural changes could avoid collapse, if not pave the way toward a more stable civilisation.

The two key solutions are to reduce economic inequality so as to ensure fairer distribution of resources, and to dramatically reduce resource consumption by relying on less intensive renewable resources and reducing population growth:

“Collapse can be avoided and population can reach equilibrium if the per capita rate of depletion of nature is reduced to a sustainable level, and if resources are distributed in a reasonably equitable fashion.”

The NASA-funded HANDY model offers a highly credible wake-up call to governments, corporations and business – and consumers – to recognise that ‘business as usual’ cannot be sustained, and that policy and structural changes are required immediately.

Although the study based on HANDY is largely theoretical – a ‘thought-experiment’ – a number of other more empirically focused studies – by KPMG and the UK Government Office of Science for instance – have warned that the convergence of food, water and energy crises could create a ‘perfect storm’ within about fifteen years. But these ‘business as usual’ forecasts could be very conservative.

Dr Nafeez Ahmed is executive director of the Institute for Policy Research & Development and author of A User’s Guide to the Crisis of Civilisation: And How to Save It among other books. Follow him on Twitter @nafeezahmed

  • This article was amended on 26 March 2014 to reflect the nature of the study and Nasa’s relationship to it more clearly.

California Imposes First Mandatory Water Restrictions to Deal With Drought (New York Times)

PHILLIPS, Calif. — Gov. Jerry Brown on Wednesday ordered mandatory water use reductions for the first time in California’s history, saying the state’s four-year drought had reached near-crisis proportions after a winter of record-low snowfalls.

Mr. Brown, in an executive order, directed the State Water Resources Control Board to impose a 25 percent reduction on the state’s 400 local water supply agencies, which serve 90 percent of California residents, over the coming year. The agencies will be responsible for coming up with restrictions to cut back on water use and for monitoring compliance. State officials said the order would impose varying degrees of cutbacks on water use across the board — affecting homeowners, farms and other businesses, as well as the maintenance of cemeteries and golf courses.

While the specifics of how this will be accomplished are being left to the water agencies, it is certain that Californians across the state will have to cut back on watering gardens and lawns — which soak up a vast amount of the water this state uses every day — as well as washing cars and even taking showers.

“People should realize we are in a new era,” Mr. Brown said at a news conference here on Wednesday, standing on a patch of brown and green grass that would normally be thick with snow at this time of year. “The idea of your nice little green lawn getting watered every day, those days are past.”

Owners of large farms, who obtain their water from sources outside the local water agencies, will not fall under the 25 percent guideline. State officials noted that many farms had already seen a cutback in their water allocations because of the drought. In addition, the owners of large farms will be required, under the governor’s executive order, to offer detailed reports to state regulators about water use, ideally as a way to highlight incidents of water diversion or waste.

Because of this system, state officials said, they did not expect the executive order to result — at least in the immediate future — in an increase in farm or food prices.

Weeds grow in what used to be the bottom of Lake McClure. Credit: Justin Sullivan/Getty Images

State officials said that they were prepared to enforce punitive measures, including fines, to ensure compliance, but that they were hopeful it would not be necessary to do so. That said, the state had trouble reaching the 20 percent reduction target that Mr. Brown set in January 2014 when he issued a voluntary reduction order as part of declaring a drought emergency. The state water board has the power to impose fines on local water suppliers that fail to meet the reduction targets set by the board over the coming weeks.

The governor announced what amounts to a dramatic new chapter in the state’s response to the drought while attending the annual April 1 measuring of the snowpack here in the Sierra Nevada. Snowpacks are critical to the state’s water system: They store water that falls during the wet season, and release it through the summer.

In a typical year, the measure in Phillips is around five or six feet, as Frank Gehrke, chief of the California Cooperative Snow Survey Program, indicated by displaying the measuring stick brought out annually. But on Wednesday, Mr. Brown was standing on an utterly dry field after he and Mr. Gehrke went through the motions of measuring a snowpack. State officials said they now expected the statewide snowpack measure to be about 6 percent of normal.

“We are standing on dry grass, and we should be standing on five feet of snow,” Mr. Brown said. “We are in an historic drought.”

Water has long been a precious resource in California, the subject of battles pitting farmer against city-dweller and northern communities against southern ones; books and movies have been made about its scarcity and plunder. Water is central to the state’s identity and economy, and a symbol of how wealth and ingenuity have tamed nature: There are golf courses in the deserts of Palm Springs, lush gardens and lawns in Los Angeles, and vast expanses of irrigated fields of farmland throughout the Central Valley.

Given that backdrop, any effort to force reductions in water use could be politically contentious, as Mr. Brown himself acknowledged. “This will be somewhat of a burden — it’s going to be very difficult,” he said. “People will say, ‘What about the farmers?’ Farmers will say, ‘What about the people who water their lawns?’ ”

Within hours of Mr. Brown’s announcement, Representative Kevin McCarthy, the California Republican who is the House majority leader, announced plans to renew efforts in Congress to pass legislation requiring the building of two huge water facilities in the state. The efforts had been blocked by Democrats concerned that the water projects would harm the environment and damage endangered species of fish.

“The current drought in California is devastating,” Mr. McCarthy said. “Today’s order from the governor should not only alarm Californians, but the entire nation should take notice that the most productive agriculture state in the country has entered uncharted territory.”

“I’m from the Central Valley, and we know that we cannot conserve or ration our way out of this drought,” he said.

Skiers avoid dry patches at Squaw Valley Ski Resort. Credit: Max Whittaker/Getty Images

The newly mandated 25 percent cut is in relation to total water use in the state in 2013. Cuts will vary from community to community, reflecting the fact that some areas have already reduced per capita water use more than others. In addition, the state and local governments will offer temporary rebate programs for homeowners who replace dishwashers and washing machines with water-efficient models.

Mr. Brown said the state would impose water-use restrictions on golf courses and cemeteries and require that nonpotable water be used on median dividers.

Lawns consume much of the water used every year in California, and the executive order calls for the state, working with local governments, to replace 50 million square feet of ornamental turf with plantings that consume less water.

A hotel worker hosed a sidewalk in San Francisco last year. Such water use can now bring fines. Credit: Justin Sullivan/Getty Images

The order also instructs water authorities to raise rates on heavy water users. Those pricing systems, intended to reward conservers and punish wasters, are used in some parts of this state and have proved effective, state officials said.

Felicia Marcus, the chairwoman of the State Water Resources Control Board, said that California would leave it to local water providers to decide how to enforce the reductions on homeowners and businesses. She said she anticipated that most of the restrictions would be aimed at the outdoor use of water; many communities have already imposed water restrictions on lawns and gardens, but Ms. Marcus suggested they had not gone far enough.

“We are in a drought unlike one we’ve seen before, and we have to take actions that we haven’t taken before,” she said. “We are not getting the level of effort that the situation clearly warrants.”

Mark W. Cowin, the director of the California Department of Water Resources, said the state would tightly monitor compliance, in the hope that would be enough to accomplish the 25 percent reduction. If it is not, the order authorizes water suppliers to penalize offenders.

“We are looking for success, not to be punitive,” Mr. Cowin said. “In the end, if people and communities don’t comply, there will be repercussions, including fines.”

Beyond the plunder (Pesquisa Fapesp)

01.04.2015 

Piracy gained strength in the 16th century, according to historian Jean Marcel Carvalho França, a professor at the Universidade Estadual Paulista (Unesp) in Franca. The fleets of Spain and Portugal were attacked frequently, resulting in immense losses of gold, brazilwood and ivory. Even though they failed to establish themselves in Brazil, the French and the English formed colonies in Central and North America. More than a simple adventure, this kind of invasion represented a challenge by European governments to the division of the lands of the New World between Spain and Portugal, formalized by the Treaty of Tordesillas in 1494. Watch, in the video produced by the Pesquisa FAPESP team, how European kingdoms backed corsair attacks on the Brazilian coast.

Preternatural machines (AEON)

by E R Truitt

Robots came to Europe before the dawn of the mechanical age. To a medieval world, they were indistinguishable from magic

E R Truitt is a medieval historian at Bryn Mawr College in Pennsylvania. Her book, Medieval Robots: Mechanism, Magic, Nature, and Art, is out in June.

Edited by Ed Lake

In 807 the Abbasid caliph in Baghdad, Harun al-Rashid, sent Charlemagne a gift the like of which had never been seen in the Christian empire: a brass water clock. It chimed the hours by dropping small metal balls into a bowl. Instead of a numbered dial, the clock displayed the time with 12 mechanical horsemen that popped out of small windows, rather like an Advent calendar. It was a thing of beauty and ingenuity, and the Frankish chronicler who recorded the gift marvelled how it had been ‘wondrously wrought by mechanical art’. But given the earliness of the date, what’s not clear is quite what he might have meant by that.

Certain technologies are so characteristic of their historical milieux that they serve as a kind of shorthand. The arresting title credit sequence to the TV series Game of Thrones (2011-) proclaims the show’s medieval setting with an assortment of clockpunk gears, waterwheels, winches and pulleys. In fact, despite the existence of working models such as Harun al-Rashid’s gift, it was another 500 years before similar contraptions started to emerge in Europe. That was at the turn of the 14th century, towards the end of the medieval period – the very era, in fact, whose political machinations reportedly inspired the plot of Game of Thrones.

When mechanical clockwork finally took off, it spread fast. In the first decades of the 14th century, it became so ubiquitous that, in 1324, the treasurer of Lincoln Cathedral offered a substantial donation to build a new clock, to address the embarrassing problem that ‘the cathedral was destitute of what other cathedrals, churches, and convents almost everywhere in the world are generally known to possess’. It’s tempting, then, to see the rise of the mechanical clock as a kind of overnight success.

But technological ages rarely have neat boundaries. Throughout the Latin Middle Ages we find references to many apparent anachronisms, many confounding examples of mechanical art. Musical fountains. Robotic servants. Mechanical beasts and artificial songbirds. Most were designed and built beyond the boundaries of Latin Christendom, in the cosmopolitan courts of Baghdad, Damascus, Constantinople and Karakorum. Such automata came to medieval Europe as gifts from foreign rulers, or were reported in texts by travellers to these faraway places.

In the mid-10th century, for instance, the Italian diplomat Liudprand of Cremona described the ceremonial throne room in the Byzantine emperor’s palace in Constantinople. In a building adjacent to the Great Palace complex, Emperor Constantine VII received foreign guests while seated on a throne flanked by golden lions that ‘gave a dreadful roar with open mouth and quivering tongue’ and switched their tails back and forth. Next to the throne stood a life-sized golden tree, on whose branches perched dozens of gilt birds, each singing the song of its particular species. When Liudprand performed the customary prostration before the emperor, the throne rose up to the ceiling, potentate still perched on top. At length, the emperor returned to earth in a different robe, having effected a costume change during his journey into the rafters.

The throne and its automata disappeared long ago, but Liudprand’s account echoes a description of the same marvel that appears in a Byzantine manual of courtly etiquette, written – by the Byzantine emperor himself, no less – at around the same time. The contrast between the two accounts is telling. The Byzantine one is preoccupied with how the special effects slotted into certain rigid courtly rituals. It was during the formal introduction of an ambassador, the manual explains, that ‘the lions begin to roar, and the birds on the throne and likewise those in the trees begin to sing harmoniously, and the animals on the throne stand upright on their bases’. A nice refinement of royal protocol. Liudprand, however, marvelled at the spectacle. He hazarded a guess that a machine similar to a winepress might account for the rising throne; as for the birds and lions, he admitted: ‘I could not imagine how it was done.’

Other Latin Christians, confronted with similarly exotic wonders, were more forthcoming with theories. Engineers in the West might have lacked the knowledge to copy these complex machines or invent new ones, but thanks to gifts such as Harun al-Rashid’s clock and travel accounts such as Liudprand’s, different kinds of automata became known throughout the Christian world. In time, scholars and philosophers used their own scientific ideas to account for them. Their framework did not rely on a thorough understanding of mechanics. How could it? The kind of mechanical knowledge that had flourished since antiquity in the East had been lost to Europe following the decline of the western Roman Empire.

Instead, they talked about what they knew: the hidden powers of Nature, the fundamental sympathies between celestial bodies and earthly things, and the certainty that demons existed and intervened in human affairs. Arthur C Clarke’s dictum that any sufficiently advanced technology is indistinguishable from magic was rarely more apposite. Yet the very blurriness of that boundary made it fertile territory for the medieval Christian mind. In time, the mechanical age might have disenchanted the world – but its eventual victory was much slower than the clock craze might suggest. And in the meantime, there were centuries of magical machines.

In the medieval Latin world, Nature could – and often did – act predictably. But some phenomena were sufficiently weird and rare that they could not be considered of a piece with the rest of the natural world. They therefore were classified as preternatural: literally, praeter naturalis or ‘beyond nature’.

What might fall into this category? Pretty much any freak occurrence or deviation from the ordinary course of things: a two-headed sheep, for example. Then again, some phenomena qualified as preternatural because their causes were not readily apparent and were thus difficult to know. Take certain hidden – but essential – characteristics of objects, such as the supposedly fire-retardant skin of the salamander, or the way that certain gems were thought to detect or counteract poison. Magnets were, of course, a clear case of the preternatural at work.

If the manifestations of the preternatural were various, so were its causes. Nature herself might be responsible – just because she often behaved predictably did not mean that she was required to do so – but so, equally, might demons and angels. People of great ability and learning could use their knowledge, acquired from ancient texts, to predict preternatural events such as eclipses. Or they might harness the secret properties of plants or natural laws to bring about certain desired outcomes. Magic was largely a matter of manipulating this preternatural domain: summoning demons, interpreting the stars, and preparing a physic could all fall under the same capacious heading.

All of which is to say, there were several possible explanations for the technological marvels that were arriving from the east and south. Robert of Clari, a French knight during the disastrous Fourth Crusade of 1204, described copper statues on the Hippodrome that moved ‘by enchantment’. Several decades later, Franciscan missionaries to the Mongol Empire reported on the lifelike artificial birds at the Khan’s palace and speculated that demons might be the cause (though they didn’t rule out superior engineering as an alternative theory).

Does a talking statue owe its powers to celestial influence or demonic intervention?

Moving, speaking statues might also be the result of a particular alignment of planets. While he taught at the cathedral school in Reims, Gerbert of Aurillac, later Pope Sylvester II (999-1003), introduced tools for celestial observation (the armillary sphere and the star sphere) and calculation (the abacus and Arabic numerals) to the educated elites of northern Europe. His reputation for learning was so great that, more than 100 years after his death, he was also credited with making a talking head that foretold the future. According to some accounts, he accomplished this through demonic magic, which he had learnt alongside the legitimate subjects of science and mathematics; according to others, he used his superior knowledge of planetary motion to cast the head at the precise moment of celestial conjunction so that it would reveal the future. (No doubt he did his calculations with an armillary sphere.)

Because the category of the preternatural encompassed so many objects and phenomena, and because there were competing, rationalised explanations for preternatural things, it could be difficult to discern the correct cause. Does a talking statue owe its powers to celestial influence or demonic intervention? According to one legend, Albert the Great – a 13th-century German theologian, university professor, bishop, and saint – used his knowledge to make a prophetic robot. One of Albert’s brothers in the Dominican Order went to visit him in his cell, knocked on the door, and was told to enter. When the friar went inside he saw that it was not Brother Albert who had answered his knock, but a strange, life-like android. Thinking that the creature must be some kind of demon, the monk promptly destroyed it, only to be scolded for his rashness by a weary and frustrated Albert, who explained that he had been able to create his robot because of a very rare planetary conjunction that happened only once every 30,000 years.

In legend, fiction and philosophy, writers offered explanations for the moving statues, artificial animals and musical figures that they knew were part of the world beyond Latin Christendom. Like us, they used technology to evoke particular places or cultures. The golden tree with artificial singing birds that confounded Liudprand on his visit to Constantinople appears to have been a fairly common type of automaton: it appears in the palaces of Samarra and Baghdad and, later, in the courts of central India. In the early 13th century, the sultan of Damascus sent a metal tree with mechanical songbirds as a gift to the Holy Roman Emperor Frederick II. But this same object also took root in the Western imagination: we find writers of fiction in medieval Europe including golden trees with eerily lifelike artificial birds in many descriptions of courts in Babylon and India.

In one romance from the early 13th century, sorcerers use gemstones with hidden powers combined with necromancy to make the birds hop and chirp. In another, from the late 12th century, the king harnesses the winds to make the golden branches sway and the gilt birds sing. There were several different species of birds represented on the king’s fabulous tree, each with its own birdsong, so exact that real birds flocked to the tree in hopes of finding a mate. ‘Thus the blackbirds, skylarks, jaybirds, starlings, nightingales, finches, orioles and others which flocked to the park in high spirits on hearing the beautiful birdsong, were quite unhappy if they did not find their partner!’

Of course, the Latin West did not retain its innocence of mechanical explanations forever. Three centuries after Gerbert taught his students how to understand the heavens with an armillary sphere, the enthusiasm for mechanical clocks began to sweep northern Europe. These giant timepieces could model the cosmos, chime the hour, predict eclipses and represent the totality of human history, from the fall of humankind in the Garden of Eden to the birth and death of Jesus, and his promised return.

Astronomical instruments, like astrolabes and armillary spheres, oriented the viewer in the cosmos by showing the phases of the moon, the signs of the zodiac and the movements of the planets. Carillons, programmed with melodies, audibly marked the passage of time. Large moving figures of people, weighted with Christian symbolism, appeared as monks, Jesus, the Virgin Mary. They offered a master narrative that fused past, present and future (including salvation). The monumental clocks of the late medieval period employed cutting-edge technology to represent secular and sacred chronology in one single timeline.

Secular powers were no slower to embrace the new technologies. Like their counterparts in distant capitals, European rulers incorporated mechanical marvels into their courtly pageantry. The day before his official coronation in Westminster Abbey in 1377, Richard II of England was ‘crowned’ by a golden mechanical angel – made by the goldsmiths’ guild – during his coronation pageant in Cheapside.

And yet, although medieval Europeans had figured out how to build the same kinds of complex automata that people in other places had been designing and constructing for centuries, they did not stop believing in preternatural causes. They merely added ‘mechanical’ to the list of possible explanations. Just as one person’s ecstatic visions might equally be attributed to divine inspiration or diabolical trickery, a talking or moving statue could be ascribed to artisanal or engineering know-how, the science of the stars, or demonic art. Certainly the London goldsmiths in 1377 were in no doubt about how the marvellous angel worked. But because a range of possible causes could animate automata, reactions to them in this late medieval period tended to depend heavily on the perspective of the individual.

At a coronation feast for the queen at the court of Ferdinand I of Aragon in 1414, theatrical machinery – of the kind used in religious Mystery Plays – was used for part of the entertainment. A mechanical device called a cloud, used for the arrival of any celestial being (gods, angels and the like), swept down from the ceiling. The figure of Death, probably also mechanical, appeared above the audience and claimed a courtier and jester named Borra for his own. Other guests at the feast had been forewarned, but nobody told Borra. A chronicler reported on this marvel with dry exactitude:
Death threw down a rope, they [fellow guests] tied it around Borra, and Death hanged him. You would not believe the racket that he made, weeping and expressing his terror, and he urinated into his underclothes, and urine fell on the heads of the people below. He was quite convinced he was being carried off to Hell. The king marvelled at this and was greatly amused.

Such theatrical tricks sound a little gimcrack to us, but if the very stage machinery might partake of uncanny forces, no wonder Borra was afraid.

Nevertheless, as mechanical technology spread throughout Europe, mechanical explanations of automata (and machines in general) gradually prevailed over magical alternatives. By the end of the 17th century, the realm of the preternatural had largely vanished. Technological marvels were understood to operate within the boundaries of natural laws rather than at the margins of them. Nature went from being a powerful, even capricious entity to an abstract noun denoted with a lower-case ‘n’: predictable, regular, and subject to unvarying law, like the movements of a mechanical clock.

This new mechanistic world-view prevailed for centuries. But the preternatural lingered, in hidden and surprising ways. In the 19th century, scientists and artists offered a vision of the natural world that was alive with hidden powers and sympathies. Machines such as the galvanometer – to measure electricity – placed scientists in communication with invisible forces. Perhaps the very spark of life was electrical.

Even today, we find traces of belief in the preternatural, though it is found more often in conjunction with natural, rather than artificial, phenomena: the idea that one can balance an egg on end more easily at the vernal equinox, for example, or a belief in ley lines and other Earth mysteries. Yet our ongoing fascination with machines that escape our control or bridge the human-machine divide, played out countless times in books and on screen, suggests that a touch of that old medieval wonder still adheres to the mechanical realm.

30 March 2015

Imagining the Anthropocene (AEON)

The Anthropocene idea has been embraced by Earth scientists and English professors alike. But how useful is it?

Jedediah Purdy is Professor of Law at Duke University in North Carolina. His forthcoming book is After Nature: A Politics for the Anthropocene.

Edited by Ross Andersen

Officially, for the past 11,700 years we have been living in the Holocene epoch. From the Greek for ‘totally new’, the Holocene is an eyeblink in geological time. In its nearly 12,000 years, plate tectonics has driven the continents a little more than half a mile: a reasonably fit person could cover the scale of planetary change in a brisk eight-minute walk. It has been a warm time, when temperature has mattered as much as tectonics. Sea levels rose 115 feet from ice melt, and northern landscapes rose almost 600 feet, as they shrugged off the weight of their glaciers.

But the real news in the Holocene has been people. Estimates put the global human population between 1 million and 10 million at the start of the Holocene, and keep it in that range until after the agricultural revolution, some 5,000 years ago. Since then, we have made the world our anthill: the geological layers we are now laying down on the Earth’s surface are marked by our chemicals and industrial waste, the pollens of our crops, and the absence of the many species we have driven to extinction. Rising sea levels are now our doing. As a driver of global change, humanity has outstripped geology.

This is why, from the earth sciences to English departments, there’s a veritable academic stampede to declare that we live in a new era, the Anthropocene – the age of humans. Coined by the ecologist Eugene Stoermer in the 1980s and brought to public attention in 2000 by the Nobel Prize-winning atmospheric scientist Paul Crutzen, the term remains officially under consideration at the Stratigraphy Commission of the Geological Society of London.

The lack of an official decision has set up the Anthropocene as a Rorschach blot for discerning what commentators think is the epochal change in the human/nature relationship. The rise of agriculture in China and the Middle East? The industrial revolution and worldwide spread of farming in the Age of Empire? The Atomic bomb? From methane levels to carbon concentration, from pollen residue to fallout, each of these changes leaves its mark in the Earth’s geological record. Each is also a symbol of a new set of human powers and a new way of living on Earth.

The most radical thought identified with the Anthropocene is this: the familiar contrast between people and the natural world no longer holds. There is no more nature that stands apart from human beings. There is no place or living thing that we haven’t changed. Our mark is on the cycle of weather and seasons, the global map of bioregions, and the DNA that organises matter into life. The question is no longer how to preserve a wild world from human intrusion; it is what shape we will give to a world we can’t help changing.

The discovery that nature is henceforth partly a human creation makes the Anthropocene the latest of three great revolutions: three kinds of order once thought to be given and self-sustaining have proved instead to be fragile human creations. The first to fall was politics. Long seen as part of divine design, with kings serving as the human equivalents of eagles in the sky and oaks in the forest, politics proved instead a dangerous but inescapable form of architecture – a blueprint for peaceful co‑existence, built with crooked materials. Second came economics. Once presented as a gift of providence or an outgrowth of human nature, economic life, like politics, turned out to be a deliberate and artificial achievement. (We are still debating the range of shapes it can take, from Washington to Greece to China.) Now, in the Anthropocene, nature itself has joined the list of those things that are not natural. The world we inhabit will henceforth be the world we have made.

The revolution in ideas that the Anthropocene represents is rooted in hundreds of eminently practical problems. The conversation about climate change has shifted from whether we can keep greenhouse-gas concentrations below key thresholds to how we are going to adapt when they cross those thresholds. Geo‑engineering, deliberately intervening in planetary systems, used to be the unspeakable proposal in climate policy. Now it is in the mix and almost sure to grow more prominent. As climate change shifts ecological boundaries, issues such as habitat preservation come to resemble landscape architecture. We can’t just pen in animals to save them; they need landscape-scale corridors and other help in migrating as their habitats move. There is open talk in law-and-policy circles about triage in species preservation – asking what we can save, and what we most want to save.

What work is this idea of the Anthropocene doing in culture and politics? As much as a scientific concept, the Anthropocene is a political and ethical gambit. Saying that we live in the Anthropocene is a way of saying that we cannot avoid responsibility for the world we are making. So far so good. The trouble starts when this charismatic, all-encompassing idea of the Anthropocene becomes an all-purpose projection screen and amplifier for one’s preferred version of ‘taking responsibility for the planet’.

Peter Kareiva, the controversial chief scientist of the Nature Conservancy, uses the theme ‘Conservation in the Anthropocene’ to trash environmentalism as philosophically naïve and politically backward. Kareiva urges conservationists to give up on wilderness and embrace what the writer Emma Marris calls the ‘rambunctious garden’. Specifically, Kareiva wants to rank ecosystems by the quality of ‘ecosystem services’ they provide for human beings instead of ‘pursuing the protection of biodiversity for biodiversity’s sake’. He wants a pro‑development stance that assumes that ‘nature is resilient rather than fragile’. He insists that: ‘Instead of scolding capitalism, conservationists should partner with corporations in a science-based effort to integrate the value of nature’s benefits into their operations and cultures.’ In other words, the end of nature is the signal to carry on with green-branded business as usual, and the business of business is business, as the Nature Conservancy’s partnerships with Dow, Monsanto, Coca-Cola, Pepsi, J P Morgan, Goldman Sachs and the mining giant Rio Tinto remind us.

Kareiva is a favourite of Andrew Revkin, the roving environmental maven of The New York Times Magazine, who touts him as a paragon of responsibility-taking, a leader among ‘scholars and doers who see that new models for thinking and acting are required in this time of the Anthropocene’. This pair and their friends at the Breakthrough Institute in California can be read as making a persistent effort to ‘rebrand’ environmentalism as humanitarian and development-friendly (and capture speaking and consultancy fees, which often seem to be the major ecosystem services of the Anthropocene). This is itself a branding strategy, an opportunity to slosh around old plonk in an ostentatiously shiny bottle.

Elsewhere in The New York Times Magazine, you can enjoy the other end of the Anthropocene projection screen, from business-as-usual to this-changes-everything. In his essay ‘Learning How to Die in the Anthropocene’ (2013), the Princeton scholar and former soldier Roy Scranton writes: ‘this civilisation is already dead’ (emphasis original) and insists that the only way forward is ‘to realise there’s nothing we can do to save ourselves’ and therefore ‘get down to the hard work … without attachment or fear’. He concludes: ‘If we want to learn to live in the Anthropocene, we must first learn how to die.’

Other humanists bring their own preoccupations to a sense of gathering apocalypse. In his influential essay ‘The Climate of History’ (2008), Dipesh Chakrabarty, a theory-minded historian at the University of Chicago, proposes that the Anthropocene throws into question all received accounts of human history, from Whiggish optimism to his own post-colonial postmodernism. He asks anxiously: ‘Has the period from 1750 to the present been one of freedom or that of the Anthropocene?’ and concludes that the age requires a new paradigm of thought, a ‘negative universal history’.

In their introduction to Ecocriticism (2012), a special issue of American Literature, the English scholars Monique Allewaert of the University of Wisconsin-Madison and Michael Ziser of the University of California Davis describe the Anthropocene as best captured in ‘a snapshot of the anxious affect of the modern world as it destroys itself – and denies even its own traces’.

The Anthropocene does not seem to change many minds. But it does turn them up to 11

All of these people (except for the branding opportunists) are trying, with more or less success, to ask how the Anthropocene changes the projects to which they’ve given chunks of their lives. Some far-ranging speculation and sweeping summaries are to be expected, and forgiven. Nonetheless, something in the Anthropocene idea seems to provoke heroic thinking, a mood and rhetoric of high stakes, of the human mind pressed up against the wall of apocalypse or arrived at the end of nature and history.

In this provocative defect, Anthropocene talk is a discourse of responsibility, to borrow a term from Mark Greif’s study of mid-20th-century American thought, The Age of the Crisis of Man (2015). Greif argues that a high-minded (but often middle-brow) strain of rhetoric responded to the horrors of the world wars and the global struggles thereafter with a blend of urgent language and sweeping concepts (or pseudo-concepts): responsibility, the fate of man, the urgency of now. Greif describes discourses of responsibility as attempts to turn words and thoughts, uttered in tones of utmost seriousness, into a high form of action. All of this is recognisable in Anthropocene talk. The Anthropocene does not seem to change many minds, strictly speaking, on point of their cherished convictions. But it does turn them up to 11.

On the whole, this is the inevitable and often productive messiness that accompanies a new way of seeing, one that unites many disparate events into a single pattern. As an offer to unify what might seem unrelated, ‘the Anthropocene’ is an attempt to do the same work that ‘the environment’ did in the 1960s and early ’70s: meld problems as far-flung as extinction, sprawl, litter, national parks policy, and the atom bomb into a single phenomenon called ‘the ecological crisis’. Such a classification is always somewhat arbitrary, though often only in the trivial sense that there are many ways to carve up the world. However arbitrary, it becomes real if people treat it as real, by forming movements, proposing changes, and passing laws aimed at ‘the environment’.

We know what the concept ‘the environment’ has wrought, but what will the Anthropocene be like? To put this over-dramatised idea in the least heroic garb possible, what will the weather be like in the Anthropocene? And how will we talk about the weather there?

For all the talk of crisis that swirls around the Anthropocene, it is unlikely that a changing Earth will feel catastrophic or apocalyptic. Some environmentalists still warn of apocalypse to motivate could-be, should-be activists; but geologic time remains far slower than political time, even when human powers add a wobble to the planet. Instead, the Anthropocene will be like today, only more so: many systems, from weather to soil to your local ecosystem, will be in a slow-perennial crisis. And where apocalyptic change is a rupture in time, a slow crisis feels normal. It feels, in fact, natural.

So the Anthropocene will feel natural. I say this not so much because of the controversial empirics-cum-mathematics of the climate-forecasting models as because of a basic insight of modernity that goes back to Rousseau: humanity is the adaptable species. What would have been unimaginable or seemed all but unintelligible 100 years ago, let alone 500 (a sliver of time in the evolutionary life of a species), can become ordinary in a generation. That is how long it took to produce ‘digital natives’, to accustom people to electricity and television, and so on for each revolution in our material and technological world. It takes a great deal of change to break through this kind of adaptability.

This is all the more so because rich-country humanity already lives in a constant technological wrestling match with exogenous shocks, which are going to get more frequent and more intense in the Anthropocene. Large parts of North America regularly experience droughts and heat waves that would devastate a simpler society than today’s US. Because the continent is thoroughly engineered, from the water canals of the West to the irrigation systems of the Great Plains to air conditioning nearly everywhere, these are experienced as inconvenience, as mere ‘news’. The same events, in poorer places, are catastrophes.

Planetary changes will amplify the inequalities that sort out those who get news from those who get catastrophes; but these inequalities, arising as they do from a post-natural nature, will feel as if they were built into the world itself. Indeed, nature has always served to launder the inequalities that humans produce. Are enslaved people kept illiterate and punished brutally when they are not servile? Then ignorance and servility must be in their nature, an idea that goes back in a continuous line to Aristotle. The same goes for women, with some edits to their nature: docile, nurturing, delicate, hysterical, etc. It was not until Harriet Taylor and John Stuart Mill worked together on The Subjection of Women (published under his name alone in 1869), that English-language philosophy produced a basic challenge to millennia of nature-talk about sexual difference.

The expulsion of Native Americans was ‘justified’ on several versions of nature. Maybe they were racially different. Maybe their climate made them weak and irrational, unable to cultivate the land or resist European settlement. (Colonists briefly embraced this idea, then grew uneasy when they realised that the North American climate was now theirs; by the time of American independence, they raced to reject climatic theories of racial character.) Maybe Native Americans had simply failed to fulfil the natural duty of all mankind, to clear and plant the wilderness and make it bloom like an English garden, an idea that many theorists of natural law advanced in the 17th and 18th centuries. One way or another, nature was a kind of ontological insurance policy for human injustice.

And now? Well, it’s common wisdom that rising sea levels will first affect some of the world’s poorest people, notably in Bangladesh and coastal India. But it’s much worse than that grim geographic coincidence. Wealth has always meant some protection from nature’s cruel measures. In fact, that is the first spur to technology and development of all kinds: not to be killed. Tropical diseases with changing range will find some populations well-equipped with vaccination and medicine, others struggling with bad government and derelict health systems. When seas rise fast, even the feckless but rich US will begin adapting fast, and coastal flooding will be classified in the rich-world mind as a catastrophe of the poor.

So will starvation. A legal regime of unequal Anthropocene vulnerability is well underway. Take the vast, long-term leases that Chinese companies have entered into for some of Africa’s richest farmland. When drought, soil exhaustion or crop crisis puts a pinch on global food supply, contracts and commerce will pull trillions of calories to fat-and-happy Beijing. This is, of course, only the latest chapter in centuries of imperialism and post-imperial, officially voluntary global inequality. But it is the chapter that we the living are writing.

Neoliberal environmentalism aims to bring nature fully into the market, merging ecology and economy

For the moment, Anthropocene inequality has a special affinity with neoliberalism, the global extension of a dogmatic market logic and increasingly homogenous market forms, along with an accompanying ideology insisting that, if the market is not beyond reproach, it is at least beyond reform: there is no alternative. Where previous episodes of global ecological inequality took place under direct imperial administration – witness the Indian famines of the late 19th century, suffered under British rule – ours is emerging under the sign of free contract. Anthropocene inequality is thus being doubly laundered: first as natural, second as the voluntary (and presumptively efficient) product of markets. Because human activity now shapes the ‘natural’ world at every point, it is especially convenient for that world-shaping activity to proceed in its own pseudo-natural market.

But Anthropocene problems also put pressure on the authority of economics. Much of environmental economics has been built on the concept of the externality, economist-speak for a side-effect, a harm or benefit that has no price tag, and so is ignored in market decisions. Air pollution – free to the polluter – is the classic bad side-effect, or ‘negative externality’. Wetlands – not valued on the real-estate market, but great sources of filtration, purification and fertility, which would otherwise cost a lot to replicate – are the model positive externality. So neoliberal environmentalism, which Kareiva’s Nature Conservancy has been cultivating, aims to bring nature fully into the market, finding a place in the bottom line for all former side-effects, and fully merging ecology and economy.

In a climate-changed Anthropocene, the side-effects overwhelm the ‘regular’ market in scale and consequence. And there is no ‘neutral’, purely market-based way to put a value on side-effects. Take the example of carbon emissions. It is possible to create a market for emissions, as Europe, California and other jurisdictions have done; but at the base of that market is a political decision about how to value the economic activity that emits carbon against all the (uncertain and even speculative) effects of the emissions. The same point holds for every (post-)natural system on an Anthropocene planet. Ultimately, the question is the value of life, and ways of life. There is no correct technocratic answer.

The shape of the Anthropocene is a political, ethical and aesthetic question. It will answer questions about what life is worth, what people owe one another, and what in the world is awesome or beautiful enough to preserve or (re)create. Either the answers will reproduce and amplify existing inequality or they will set in motion a different logic of power. Either the Anthropocene will be democratic or it will be horrible.

A democratic Anthropocene would start from a famous observation of the economics Nobel Prize laureate Amartya Sen: no minimally democratic society has ever suffered a famine. Natural catastrophes are the joint products of natural and human systems. Your vulnerability to disaster is often a direct expression of your standing in a political (and economic) order. The Anthropocene stands for the intensifying merger of ecology, economics and politics, and one’s standing in those systems will increasingly be a single question.

But talk of democracy here is – like much about the Anthropocene – in danger of becoming abstract and moralising. Reflecting on a democratic Anthropocene becomes an inadvertent meditation on the devastating absence of any agent – a state, or even a movement – that could act on the scale of the problem. Indeed, it reveals that there is no agent that could even define the problem. If the Anthropocene is about the relationship between humanity and the planet, well, there is no ‘humanity’ that agrees on any particular meaning and imperative of climate change, extinction, toxification, etc. To think about the Anthropocene is to think about being able to do nothing about everything. No wonder the topic inspires compensatory fantasies that the solution lies in refining the bottom line or honing personal enlightenment – always, to be sure, in the name of some fictive ‘we’.

This returns us to the basic problem that the Anthropocene drives home: as Hannah Arendt observed in The Origins of Totalitarianism (1951), the idea of human rights – such as the right to democratic standing in planetary change – is a chimera and a cruel taunt without a political community that can make it good through robust institutions and practices. The Anthropocene shows how far the world is from being such a polity, or a federation of such polities, and how much is at stake in that absence. The world is too much with us. Worse, there is no ‘we’ to be with it.

In the face of all these barriers, what could all this talk about the Anthropocene possibly accomplish? Ironically, a useful comparison lies in Arendt’s target, the mere idea of human rights. While mere ideas are in fact sorry comforts in an unmanageable situation, they can be the beginning of demands, projects, even utopias, that enable people to organise in new ways to pursue them. The idea of human rights has gained much of its force this way, as a prism through which many efforts are focused and/or refracted.

A democratic Anthropocene is just a thought for now, but it can also be a tool that activists, thinkers and leaders use to craft challenges and invitations that bring some of us a little closer to a better possible world, or a worse one. The idea that the world people get to inhabit will only be the one they make is, in fact, imperative to the development of a political and institutional programme, even if the idea itself does not tell anyone how to do that. There might not be a world to win, or even save, but there is a humanity to be shaped and reshaped, freely and always in partial and provisional ways, that can begin intending the world it shapes.

31 March 2015


Climate change: Embed the social sciences in climate policy (Nature)

David Victor

01 April 2015

David G. Victor calls for the IPCC process to be extended to include insights into controversial social and behavioural issues.

Illustration by David Parkins

The Intergovernmental Panel on Climate Change (IPCC) is becoming irrelevant to climate policy. By seeking consensus and avoiding controversy, the organization is suffering from the streetlight effect — focusing ever more attention on a well-lit pool of the brightest climate science. But the insights that matter are out in the darkness, far from the places that the natural sciences alone can illuminate.

With the ink barely dry on the IPCC’s latest reports, scientists and governments are planning reforms for the next big assessment [1, 2]. Streamlining the review and writing processes could, indeed, make the IPCC more nimble and relevant. But decisions made at February’s IPCC meeting in Nairobi showed that governments have little appetite for change.

The basic report-making process and timing will remain intact. Minor adjustments such as greater coverage of cross-cutting topics and more administration may make the IPCC slower. Similar soul searching, disagreement, indecision and trivial procedural tweaks have followed each of the five IPCC assessments over the past 25 years [3].

This time needs to be different. The IPCC must overhaul how it engages with the social sciences in particular (see go.nature.com/vp7zgm). Fields such as sociology, political science and anthropology are central to understanding how people and societies comprehend and respond to environmental changes, and are pivotal in making effective policies to cut emissions and collaborate across the globe.

The IPCC has engaged only a narrow slice of social-sciences disciplines. Just one branch — economics — has had a major voice in the assessment process. In Working Group III, which assesses climate-change mitigation and policy, nearly two-thirds of 35 coordinating lead authors hailed from the field, and from resource economics in particular. The other social sciences were mostly absent. There was one political scientist: me. Among the few bright spots in that report compared with earlier ones is greater coverage of behavioural economics and risk analysis. In Working Group II, which assesses impacts and adaptation, less than one-third of the 64 coordinating lead authors were social scientists, and about half of those were economists.

Bringing the broader social sciences into the IPCC will be difficult, but it is achievable with a strategy that reflects how the fields are organized and which policy-relevant questions these disciplines know well. It will require big reforms in the IPCC, and the panel will have to relinquish part of the assessment process to other organizations that are less prone to paralysis in the face of controversy.

Tunnel vision

The IPCC walks a wavering line between science, which requires independence, and diplomacy, which demands responsiveness to government preference. Although scientists supply and hone the material for reports, governments have a say in all stages of assessment: they adopt the outline for each chapter, review drafts and approve the final reports.

“Insights such as which policies work (or fail) in practice are skirted.”

Such tight oversight creates incentives for scientists to stick to the agreed scope and strip out controversial topics. These pressures are especially acute in the social sciences because governments want to control statements about social behaviour, which implicate policy. This domain covers questions such as which countries will bear the costs of climate change; schemes for allocating the burden of cutting emissions; the design of international agreements; how voters respond to information about climate policy; and whether countries will go to war over climate-related stress. The social sciences can help to provide answers to these questions, key for effective climate policy. In practice, few of these insights are explored much by the IPCC.

The narrowness of what governments will allow the IPCC to publish is particularly evident in the summary for policy-makers produced at the end of each assessment. Governments approve this document line-by-line with consensus. Disagreements range from those over how to phrase concepts such as a ‘global commons’ that requires collective action to those about whole graphs, which might present data in ways that some governments find inconvenient.

For example, during the approval of the summary from Working Group III last April, a small group of nations vetoed graphs that showed countries’ emissions grouped according to economic growth. Although this format is good science — economic growth is the main driver of emissions — it is politically toxic because it could imply that some countries that are developing rapidly need to do more to control emissions4.

Context dependent

The big problem with the IPCC’s output is not the widely levelled charge that it has become too policy prescriptive or has been captured by special interests5. Its main affliction is pabulum — a surfeit of bland statements that have no practical value for policy. Abstract, global numbers from stylized, replicable models get approved because they do not implicate any country or action. Insights such as which policies work (or fail) in practice are skirted. Caveats are buried or mangled.

Readers of the Working Group III summary for policy-makers might learn, for instance, that annual economic growth might decrease by just 0.06 percentage points by 2050 if governments were to adopt policies that cut emissions in line with the widely discussed goal of 2 °C above pre-industrial levels6. They would have to wade through dense tables to realize that only a fraction of the models say that the goal is achievable, and through the main report to learn that the small cost arises only under simplified assumptions that are far from messy reality.


That said, the social sciences are equally culpable. Because societies are complex and are in many ways harder to study than cells in a petri dish, the intellectual paradigms across most of the social sciences are weak. Beyond a few exceptions — such as mainstream economics — the major debates in social science are between paradigms rather than within them.

Consider the role of international law. Some social scientists treat law like a contract; others believe that it works mainly through social pressures. The first set would advise policy-makers to word climate deals precisely — to include targets and timetables for emissions cuts — and to apply mechanisms to ensure that countries honour their agreements. The second group would favour bold legal norms with clear focal points — striving for zero net emissions, for example7. Each approach could be useful in the right context.

Multiple competing paradigms make it hard to organize social-science knowledge or to determine which questions and methods are legitimate. Moreover, the incentives within the social sciences discourage focusing on particular substantive topics such as climate change — especially when they require interdisciplinary collaboration. In political science, for example, research on political mobilization, administrative control and international cooperation, among other specialities, is relevant. Yet no leading political-science department has a tenured professor who works mainly on climate change8.

The paradigm problem need not be paralysing. Social scientists should articulate why different intellectual perspectives and contexts lead to different conclusions. Leading researchers in each area can map out disagreement points and their relevance.

Climate scientists and policy-makers should talk more about how disputes are rooted in different values and assumptions — such as about whether government institutions are capable of directing mitigation. Such disputes help to explain why there are so many disagreements in climate policy, even in areas in which the facts seem clear9.

Unfortunately, the current IPCC report structure discourages that kind of candour about assumptions, values and paradigms. It focuses on known knowns and known unknowns rather than on deeper and wider uncertainties. The bias is revealed in how the organization uses official language to describe findings — half of the statements in the Working Group III summary were given a ‘high confidence’ rating (see ‘Confidence bias’).

Wider vista

Building the social sciences into the IPCC and the climate-change debate more generally is feasible over the next assessment cycle, which starts in October and runs to 2022, with efforts on the following three fronts.

First, the IPCC must ask questions that social scientists can answer. If the panel looks to the social-sciences literature on climate change, it will find little. But if it engages the fields on their own terms it will find a wealth of relevant knowledge — for example, about how societies organize, how individuals and groups perceive threats and respond to catastrophic stresses, and how collective action works best.

The solar-powered Barefoot College in Rajasthan, India, trains rural villagers to install, build and repair solar technologies. Photo: Dieter Telemans/Panos

As soon as the new IPCC leadership is chosen later this year, the team should invite major social-sciences societies such as the American Political Science Association, the American and European societies of international law, the American Sociological Association and the Society for Risk Analysis to propose relevant topics that they can assess and questions they can answer. Multidisciplinary scientific organizations in diverse countries — such as the Royal Society in London and the Third World Academy of Sciences — would round out the picture, because social-sciences societies tend to be national and heavily US-based.

These questions should guide how the IPCC scopes its next reports. The panel should also ask such societies to organize what they know about climate by discipline — how sociology examines issues related to the topic, for example — and feed that into the assessment.

Second, the IPCC must become a more attractive place for social-science and humanities scholars who are not usually involved in the climate field and might find IPCC involvement daunting. The IPCC process is dominated by insiders who move from assessment to assessment and are tolerant of the crushing rounds of review and layers of oversight that consume hundreds of hours and require travel to the corners of the globe. Practically nothing else in science service has such a high ratio of input to output. The IPCC must use volunteers’ time more efficiently.

Third, all parties must recognize that a consensus process cannot handle controversial topics such as how best to design international agreements or how to govern the use of geoengineering technologies. For these, a parallel process will be needed to address the most controversial policy-relevant questions.

This supporting process should begin with a small list of the most important questions that the IPCC cannot handle on its own. A network of science academies or foundations sympathetic to the UN’s mission could organize short reports — drawing from IPCC assessments and other literature — and manage a review process that is truly independent of government meddling. Oversight from prominent social scientists, including those drawn from the IPCC process, could give the effort credibility as well as the right links to the IPCC itself.

The list of topics to cover in this parallel mechanism includes how to group countries in international agreements — beyond the crude kettling adopted in 1992 that split the world into industrialized nations and the rest. The list also includes which kinds of policies have had the biggest impact on emissions, and how different concepts of justice and ethics could guide new international agreements that balance the burdens of mitigation and adaptation. There will also need to be a sober re-assessment of policy goals when it becomes clear that stopping warming at 2 °C is no longer feasible10.

The IPCC has proved to be important — it is the most legitimate body that assesses the climate-related sciences. But it is too narrow and must not monopolize climate assessment. Helping the organization to reform itself while moving contentious work into other forums is long overdue.

Nature 520, 27–29 (02 April 2015), doi:10.1038/520027a

References

  1. IPCC. Future Work of the IPCC: Chairman’s Vision Paper on the Future of the IPCC (IPCC, 2015).
  2. IPCC. Future Work of the IPCC: Consideration of the Recommendations by the Task Group on Future Work of the IPCC (IPCC, 2015).
  3. Committee to Review the Intergovernmental Panel on Climate Change. Climate Change Assessments: Review of the Processes and Procedures of the IPCC (InterAcademy Council, 2010).
  4. Victor, D. G., Gerlagh, R. & Baiocchi, G. Science 345, 34–36 (2014).
  5. Hulme, M. et al. Nature 463, 730–732 (2010).
  6. IPCC. Summary for Policymakers. In Climate Change 2014: Mitigation of Climate Change. Contribution of Working Group III to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (eds Edenhofer, O. et al.) (Cambridge Univ. Press, 2014).
  7. Hafner-Burton, E. M., Victor, D. G. & Lupu, Y. Am. J. Intl Law 106, 47–97 (2012).
  8. Keohane, R. O. PS: Political Sci. & Politics 48, 19–26 (2015).
  9. Hulme, M. Why We Disagree About Climate Change: Understanding Controversy, Inaction and Opportunity (Cambridge Univ. Press, 2009).
  10. Victor, D. G. & Kennel, C. F. Nature 514, 30–31 (2014).

Anthropocene: The human age (Nature)

Momentum is building to establish a new geological epoch that recognizes humanity’s impact on the planet. But there is fierce debate behind the scenes.

Richard Monastersky

11 March 2015

Illustration by Jessica Fortner

Almost all the dinosaurs have vanished from the National Museum of Natural History in Washington DC. The fossil hall is now mostly empty and painted in deep shadows as palaeobiologist Scott Wing wanders through the cavernous room.

Wing is part of a team carrying out a radical, US$45-million redesign of the exhibition space, which is part of the Smithsonian Institution. And when it opens again in 2019, the hall will do more than revisit Earth’s distant past. Alongside the typical displays of Tyrannosaurus rex and Triceratops, there will be a new section that forces visitors to consider the species that is currently dominating the planet.

“We want to help people imagine their role in the world, which is maybe more important than many of them realize,” says Wing.

This provocative exhibit will focus on the Anthropocene — the slice of Earth’s history during which people have become a major geological force. Through mining activities alone, humans move more sediment than all the world’s rivers combined. Homo sapiens has also warmed the planet, raised sea levels, eroded the ozone layer and acidified the oceans.

Given the magnitude of these changes, many researchers propose that the Anthropocene represents a new division of geological time. The concept has gained traction, especially in the past few years — and not just among geoscientists. The word has been invoked by archaeologists, historians and even gender-studies researchers; several museums around the world have exhibited art inspired by the Anthropocene; and the media have heartily adopted the idea. “Welcome to the Anthropocene,” The Economist announced in 2011.

The greeting was a tad premature. Although the term is trending, the Anthropocene is still an amorphous notion — an unofficial name that has yet to be accepted as part of the geological timescale. That may change soon. A committee of researchers is currently hashing out whether to codify the Anthropocene as a formal geological unit, and when to define its starting point.

But critics worry that important arguments against the proposal have been drowned out by popular enthusiasm, driven in part by environmentally minded researchers who want to highlight how destructive humans have become. Some supporters of the Anthropocene idea have even been likened to zealots. “There’s a similarity to certain religious groups who are extremely keen on their religion — to the extent that they think everybody who doesn’t practise their religion is some kind of barbarian,” says one geologist who asked not to be named.

The debate has shone a spotlight on the typically unnoticed process by which geologists carve up Earth’s 4.5 billion years of history. Normally, decisions about the geological timescale are made solely on the basis of stratigraphy — the evidence contained in layers of rock, ocean sediments, ice cores and other geological deposits. But the issue of the Anthropocene “is an order of magnitude more complicated than the stratigraphy”, says Jan Zalasiewicz, a geologist at the University of Leicester, UK, and the chair of the Anthropocene Working Group that is evaluating the issue for the International Commission on Stratigraphy (ICS).

Written in stone

For geoscientists, the timescale of Earth’s history rivals the periodic table in terms of scientific importance. It has taken centuries of painstaking stratigraphic work — matching up major rock units around the world and placing them in order of formation — to provide an organizing scaffold that supports all studies of the planet’s past. “The geologic timescale, in my view, is one of the great achievements of humanity,” says Michael Walker, a Quaternary scientist at the University of Wales Trinity St David in Lampeter, UK.

Walker’s work sits at the top of the timescale. He led a group that helped to define the most recent unit of geological time, the Holocene epoch, which began about 11,700 years ago.

[Graphic omitted. Sources: Dams/Water/Fertilizer, IGBP; Fallout, Ref. 5; Map, E. C. Ellis Phil. Trans. R. Soc. A 369, 1010–1035 (2011); Methane, Ref. 4]

The decision to formalize the Holocene in 2008 was one of the most recent major actions by the ICS, which oversees the timescale. The commission has segmented Earth’s history into a series of nested blocks, much like the years, months and days of a calendar. In geological time, the 66 million years since the death of the dinosaurs is known as the Cenozoic era. Within that, the Quaternary period occupies the past 2.58 million years — during which Earth has cycled in and out of a few dozen ice ages. The vast bulk of the Quaternary consists of the Pleistocene epoch, with the Holocene occupying the thin sliver of time since the end of the last ice age.

When Walker and his group defined the beginning of the Holocene, they had to pick a spot on the planet that had a signal to mark that boundary. Most geological units are identified by a specific change recorded in rocks — often the first appearance of a ubiquitous fossil. But the Holocene is so young, geologically speaking, that it permits an unusual level of precision. Walker and his colleagues selected a climatic change — the end of the last ice age’s final cold snap — and identified a chemical signature of that warming at a depth of 1,492.45 metres in a core of ice drilled near the centre of Greenland1. A similar fingerprint of warming can be seen in lake and marine sediments around the world, allowing geologists to precisely identify the start of the Holocene elsewhere.

“The geologic timescale, in my view, is one of the great achievements of humanity.”

Even as the ICS was finalizing its decision on the start of the Holocene, discussion was already building about whether it was time to end that epoch and replace it with the Anthropocene. This idea has a long history. In the mid-nineteenth century, several geologists sought to recognize the growing power of humankind by referring to the present as the ‘anthropozoic era’, and others have since made similar proposals, sometimes with different names. The idea has gained traction only in the past few years, however, in part because of rapid changes in the environment, as well as the influence of Paul Crutzen, a chemist at the Max Planck Institute for Chemistry in Mainz, Germany.

Crutzen has first-hand experience of how human actions are altering the planet. In the 1970s and 1980s, he made major discoveries about the ozone layer and how pollution from humans could damage it — work that eventually earned him a share of a Nobel prize. In 2000, he and Eugene Stoermer of the University of Michigan in Ann Arbor argued that the global population has gained so much influence over planetary processes that the current geological epoch should be called the Anthropocene2. As an atmospheric chemist, Crutzen was not part of the community that adjudicates changes to the geological timescale. But the idea inspired many geologists, particularly Zalasiewicz and other members of the Geological Society of London. In 2008, they wrote a position paper urging their community to consider the idea3.

Those authors had the power to make things happen. Zalasiewicz happened to be a member of the Quaternary subcommission of the ICS, the body that would be responsible for officially considering the suggestion. One of his co-authors, geologist Phil Gibbard of the University of Cambridge, UK, chaired the subcommission at the time.

Although sceptical of the idea, Gibbard says, “I could see it was important, something we should not be turning our backs on.” The next year, he tasked Zalasiewicz with forming the Anthropocene Working Group to look into the matter.

A new beginning

Since then, the working group has been busy. It has published two large reports (“They would each hurt you if they dropped on your toe,” says Zalasiewicz) and dozens of other papers.

The group has several issues to tackle: whether it makes sense to establish the Anthropocene as a formal part of the geological timescale; when to start it; and what status it should have in the hierarchy of geological time — if it is adopted.

When Crutzen proposed the term Anthropocene, he gave it the suffix appropriate for an epoch and argued for a starting date in the late eighteenth century, at the beginning of the Industrial Revolution. Between then and the start of the new millennium, he noted, humans had chewed a hole in the ozone layer over Antarctica, doubled the amount of methane in the atmosphere and driven up carbon dioxide concentrations by 30%, to a level not seen in 400,000 years.

When the Anthropocene Working Group started investigating, it compiled a much longer list of the changes wrought by humans. Agriculture, construction and the damming of rivers are stripping away sediment at least ten times as fast as the natural forces of erosion. Along some coastlines, the flood of nutrients from fertilizers has created oxygen-poor ‘dead zones’, and the extra CO2 from fossil-fuel burning has acidified the surface waters of the ocean by 0.1 pH units. The fingerprint of humans is clear in global temperatures, the rate of species extinctions and the loss of Arctic ice.

The group, which includes Crutzen, initially leaned towards his idea of choosing the Industrial Revolution as the beginning of the Anthropocene. But other options were on the table.

Some researchers have argued for a starting time that coincides with an expansion of agriculture and livestock cultivation more than 5,000 years ago4, or a surge in mining more than 3,000 years ago (see ‘Humans at the helm’). But neither the Industrial Revolution nor those earlier changes have left unambiguous geological signals of human activity that are synchronous around the globe (see ‘Landscape architecture’).

This week in Nature, two researchers propose that a potential marker for the start of the Anthropocene could be a noticeable drop in atmospheric CO2 concentrations between 1570 and 1620, which is recorded in ice cores (see page 171). They link this change to the deaths of some 50 million indigenous people in the Americas, triggered by the arrival of Europeans. In the aftermath, forests took over 65 million hectares of abandoned agricultural fields — a surge of regrowth that reduced global CO2.

Landscape architecture

[Graphic omitted] A model of land use, based on human-population estimates, suggests that people modified substantial parts of the continents even thousands of years ago. Panels show land used intensively by humans at 8,000 years before present (bp), 1,000 years bp and the present. Source: E. C. Ellis Phil. Trans. R. Soc. A 369, 1010–1035 (2011).

In the working group, Zalasiewicz and others have been talking increasingly about another option — using the geological marks left by the atomic age. Between 1945 and 1963, when the Limited Nuclear Test Ban Treaty took effect, nations conducted some 500 above-ground nuclear blasts. Debris from those explosions circled the globe and created an identifiable layer of radioactive elements in sediments. At the same time, humans were making geological impressions in a number of other ways — all part of what has been called the Great Acceleration of the modern world. Plastics started flooding the environment, along with aluminium, artificial fertilizers, concrete and leaded petrol, all of which have left signals in the sedimentary record.

In January, the majority of the 37-person working group offered its first tentative conclusion. Zalasiewicz and 25 other members reported5 that the geological markers available from the mid-twentieth century make this time “stratigraphically optimal” for picking the start of the Anthropocene, whether or not it is formally defined. Zalasiewicz calls it “a candidate for the least-worst boundary”.

The group even proposed a precise date: 16 July 1945, the day of the first atomic-bomb blast. Geologists thousands of years in the future would be able to identify the boundary by looking in the sediments for the signature of long-lived plutonium from mid-century bomb blasts or many of the other global markers from that time.

A many-layered debate

The push to formalize the Anthropocene upsets some stratigraphers. In 2012, a commentary published by the Geological Society of America6 asked: “Is the Anthropocene an issue of stratigraphy or pop culture?” Some complain that the working group has generated a stream of publicity in support of the concept. “I’m frustrated because any time they do anything, there are newspaper articles,” says Stan Finney, a stratigraphic palaeontologist at California State University in Long Beach and the chair of the ICS, which would eventually vote on any proposal put forward by the working group. “What you see here is, it’s become a political statement. That’s what so many people want.”

Finney laid out some of his concerns in a paper7 published in 2013. One major question is whether there really are significant records of the Anthropocene in global stratigraphy. In the deep sea, he notes, the layer of sediments representing the past 70 years would be thinner than 1 millimetre. An even larger issue, he says, is whether it is appropriate to name something that exists mainly in the present and the future as part of the geological timescale.

“It’s become a political statement. That’s what so many people want.”

Some researchers argue that it is too soon to make a decision — it will take centuries or longer to know what lasting impact humans are having on the planet. One member of the working group, Erle Ellis, a geographer at the University of Maryland, Baltimore County, says that he raised the idea of holding off with fellow members of the group. “We should set a time, perhaps 1,000 years from now, in which we would officially investigate this,” he says. “Making a decision before that would be premature.”

That does not seem likely, given that the working group plans to present initial recommendations by 2016.

Some members with different views from the majority have dropped out of the discussion. Walker and others contend that human activities have already been recognized in the geological timescale: the only difference between the current warm period, the Holocene, and all the interglacial times during the Pleistocene is the presence of human societies in the modern one. “You’ve played the human card in defining the Holocene. It’s very difficult to play the human card again,” he says.

Walker resigned from the group a year ago, when it became clear that he had little to add. He has nothing but respect for its members, he says, but he has heard concern that the Anthropocene movement is picking up speed. “There’s a sense in some quarters that this is something of a juggernaut,” he says. “Within the geologic community, particularly within the stratigraphic community, there is a sense of disquiet.”

Zalasiewicz takes pains to make it clear that the working group has not yet reached any firm conclusions. “We need to discuss the utility of the Anthropocene. If one is to formalize it, who would that help, and to whom it might be a nuisance?” he says. “There is lots of work still to do.”

Any proposal that the group did make would still need to pass a series of hurdles. First, it would need to receive a supermajority — 60% support — in a vote by members of the Quaternary subcommission. Then it would need to reach the same margin in a second vote by the leadership of the full ICS, which includes chairs from groups that study the major time blocks. Finally, the executive committee of the International Union of Geological Sciences must approve the request.

At each step, proposals are often sent back for revision, and they sometimes die altogether. It is an inherently conservative process, says Martin Head, a marine stratigrapher at Brock University in St Catharines, Canada, and the current head of the Quaternary subcommission. “You are messing around with a timescale that is used by millions of people around the world. So if you’re making changes, they have to be made on the basis of something for which there is overwhelming support.”

Some voting members of the Quaternary subcommission have told Nature that they have not been persuaded by the arguments raised so far in favour of the Anthropocene. Gibbard, a friend of Zalasiewicz’s, says that defining this new epoch will not help most Quaternary geologists, especially those working in the Holocene, because they tend not to study material from the past few decades or centuries. But, he adds: “I don’t want to be the person who ruins the party, because a lot of useful stuff is coming out as a consequence of people thinking about this in a systematic way.”

If a proposal does not pass, researchers could continue to use the name Anthropocene on an informal basis, in much the same way as archaeological terms such as the Neolithic era and the Bronze Age are used today. Regardless of the outcome, the Anthropocene has already taken on a life of its own. Three Anthropocene journals have started up in the past two years, and the number of papers on the topic is rising sharply, with more than 200 published in 2014.

By 2019, when the new fossil hall opens at the Smithsonian’s natural history museum, it will probably be clear whether the Anthropocene exhibition depicts an official time unit or not. Wing, a member of the working group, says that he does not want the stratigraphic debate to overshadow the bigger issues. “There is certainly a broader point about human effects on Earth systems, which is way more important and also more scientifically interesting.”

As he walks through the closed palaeontology hall, he points out how much work has yet to be done to refashion the exhibits and modernize the museum, which opened more than a century ago. A hundred years is a heartbeat to a geologist. But in that span, the human population has more than tripled. Wing wants museum visitors to think, however briefly, about the planetary power that people now wield, and how that fits into the context of Earth’s history. “If you look back from 10 million years in the future,” he says, “you’ll be able to see what we were doing today.”

Nature 519, 144–147 (12 March 2015), doi:10.1038/519144a

References

  1. Walker, M. et al. J. Quat. Sci. 24, 3–17 (2009).
  2. Crutzen, P. J. & Stoermer, E. F. IGBP Newsletter 41, 17–18 (2000).
  3. Zalasiewicz, J. et al. GSA Today 18(2), 4–8 (2008).
  4. Ruddiman, W. F. Annu. Rev. Earth Planet. Sci. 41, 45–68 (2013).
  5. Zalasiewicz, J. et al. Quatern. Int. http://dx.doi.org/10.1016/j.quaint.2014.11.045 (2015).
  6. Autin, W. J. & Holbrook, J. M. GSA Today 22(7), 60–61 (2012).
  7. Finney, S. C. Geol. Soc. Spec. Publ. 395, 23–28 (2013).

Age of (In)Security - Homi Bhabha & Saskia Sassen (synthetic zerø)

“Living Side by Side: On Culture and Security: Does the concept of security assume a distinctive cultural form in the midst of deafening patriotic calls for protection and precaution?
This lecture explores the role of culture and the arts in cultivating an ethic and aesthetic of “living side by side” that contributes to our contemporary understanding of “cosmopolitan right” (Kant). Homi Bhabha will reflect on the global scale of cosmopolitan affiliations to ask what kinds of neighbourliness are possible in a time of partial sovereignties and paradoxical communities that constitute our Age of (In)Security.”
