All posts by renzotaddei
Environmental Sociology in 5 minutes
Bob Carter on Climate Change and Politics – NZone Tonight
Al Gore’s 24 Hours of Reality video
Shooting the messenger (The Miami Herald)
Environment
Posted on Monday, 08.29.11
BY ANDREW DESSLER
Texas Gov. Rick Perry stirred up controversy on the campaign trail recently when he dismissed the problem of climate change and accused scientists of basically making up the problem.
As a born-and-bred Texan, it’s especially disturbing to hear this now, when our state is getting absolutely hammered by heat and drought. I’ve got to wonder how any resident of Texas – and particularly the governor who not so long ago was asking us to pray for rain – can be so cavalier about climate change.
As a climate scientist at Texas A&M University, I can also tell you from the data that the current heat wave and drought in Texas are so bad that calling it “extreme weather” does not do it justice. July was the single hottest month in the observational record, and the 12 months that ended in July were drier than any corresponding period in the record. I know that climate change does not cause any specific weather event. But I also know that humans have warmed the climate over the last century, and that this warming has almost certainly made the heat wave and drought more extreme than it would have otherwise been.
I am not alone in these views. There are dozens of atmospheric scientists at Texas institutions like Rice, the University of Texas, and Texas A&M, and none of them dispute the mainstream scientific view of climate change. This is not surprising, since there are only a handful of atmospheric scientists in the entire world who dispute the essential facts – and their ranks are not increasing, as Gov. Perry claimed.
And I can assure Gov. Perry that scientists are not just another special interest looking to line their own pockets. I left a job as an investment banker on Wall Street in 1988 to go to graduate school in chemistry. I certainly didn’t make that choice to get rich, and I didn’t do it to exert influence in the international arena either.
I went into science because I wanted to devote my life to the search for scientific knowledge, and to make the world a better place. That’s the same noble goal that motivates most scientists. The ultimate dream is to make a discovery so profound and revolutionary that it catapults one into the pantheon of the greatest scientific minds of history: Newton, Einstein, Maxwell, Planck, etc.
This is just one of the many reasons it is inconceivable for an entire scientific community to conspire en masse to mislead the public. In fact, if climate scientists truly wanted to maximize funding, we would be claiming that we had no idea why the climate is changing – a position that would certainly attract bipartisan support for increased research.
The economic costs of the Texas heat wave and drought are enormous. The cost to Texas alone will be many billions of dollars (hundreds of dollars for every resident), and these costs will ripple through the economy so that everyone will eventually pay for them. Gov. Perry needs to squarely face the choice confronting us: either we pay to reduce emissions of greenhouse gases, or we pay for the impacts of a changing climate. There is no free lunch.
Economists have looked at this problem repeatedly over the last two decades, and virtually every mainstream economist has concluded that the costs of reducing emissions are less than the costs of unchecked climate change. The only disagreement is on the optimal level of emissions reductions.
I suppose it should not be surprising when politicians like Gov. Perry choose to shoot the messenger rather than face this hard choice. He may view this as a legitimate policy on climate change, but it’s not one that the facts support.
Read more here.
A Reality Check on Clouds and Climate (N.Y. Times)
September 6, 2011, 5:44 PM
Dot Earth
By ANDREW C. REVKIN

I am often in awe of clouds, as was the case when I shot this video of a remarkable thunderhead somewhere over the Midwest. But I’m tired of the recent burst of over-interpretation of a couple of papers examining aspects of clouds in the context of a changing climate.
I’ve long pointed out that anyone trumpeting a conclusion about greenhouse-driven climate change on the basis of a single paper should be treated with skepticism or outright suspicion. I trust climate science as an enterprise because — despite its flaws — it is a self-correcting process in which trajectory matters far more than individual steps in the road.
There is always a temptation, particularly for those with an agenda and for media in search of the “front-page thought,” to overemphasize studies that fit some template, no matter how tentative or flawed.
The flood of celebratory coverage that followed publication of a recent paper by Roy Spencer and Danny Braswell — proposing a big reduction in the sensitivity of the climate to greenhouse gases — was far more about pushing an agenda than providing guidance on the state of climate science. There’s a lot more on this below.
The same goes for the stampede on clouds and climate following publication of an important, but preliminary, laboratory finding from the European Organization for Nuclear Research (better known by its acronym, CERN) about how cosmic rays can stimulate the formation of atmospheric particles (an ingredient in cloud formation). It’s a long road from that conclusion to an argument that variations in cosmic rays can explain a meaningful portion of recent climate change.
There’s a long history of assertions that clouds can be a substantial driver of climate change, distinct from their clear potential to amplify or blunt (depending on the type of cloud) a change set in motion by some other force. But there’s still scant evidence to back up such assertions.
In weighing the new results on cosmic rays and the atmosphere, I find a lot of merit in Hank Campbell’s conclusion at Science 2.0:
[I]t isn’t evidence that the Sun’s magnetic field is controlling cosmic rays and therefore our temperature far more than mankind and pollution are doing.
It is simply science at work – finally, after a decade and a half of circling the wagons, hypotheses that were dismissed as conspiratorial nonsense by zealots get a chance to live or die by the scientific method and not by aggressive posturing.
A new paper by Andrew Dessler of Texas A&M University bolsters the established view of clouds’ role as a feedback mechanism — but not driver — in climate dynamics through a decade of observation and analysis of El Niño and La Niña events (periodic warm and cool phases of the Pacific Ocean).
The paper directly challenges conclusions of Spencer and Braswell and an earlier paper positing a role of clouds in driving climate change.
Dessler, setting his findings and other work on clouds and climate in broader context, offered this observation this morning about the polarized, and distorted, public discourse:
To me, the real story here is that, every month, dozens if not hundreds of papers are published that are in agreement with the mainstream theory of climate science.
[ACR: I did a quick Google Scholar search for “CO2 climate change greenhouse” to put a rough upper bound on this and got ~9,000 papers so far in 2011.]
But, every year, one or two skeptical papers get published, and these are then trumpeted by sympathetic media outlets as if they’d discovered the wheel. It therefore appears to the general public that there’s a debate.
A separate question has emerged around the Spencer-Braswell paper. Should it have been published in the first place?
As Retraction Watch (a fascinating and worthwhile blog) chronicled last week, the editor of Remote Sensing, the journal in which the paper appeared, emphatically — if after the fact — said no, emphasizing his view by very publicly resigning.
This move was hailed by defenders of the climate status quo in a piece run in The Daily Climate and Climate Progress. Peter Gleick of the Pacific Institute, remarkably given space in Forbes, called the resignation “staggering news.”
But others, including the folks at Retraction Watch, wondered why the editor at Remote Sensing, Wolfgang Wagner, didn’t simply seek to have the paper retracted.
Roger A. Pielke, Jr., whose focus at the University of Colorado is climate in the context of political science, echoed that question, urging the new team at the journal to initiate retraction proceedings, adding:
If the charges of “error” and “false claims” are upheld the paper should certainly be retracted. If the charges are not upheld then the authors have every right to have such a judgment announced publicly.
Absent such an adjudication we are left with climate science played out as political theater in the media and on blogs — with each side claiming the righteousness of their views, while everyone else just sees the peer review process in climate science getting another black eye.
Over the weekend, I asked Kerry Emanuel at the Massachusetts Institute of Technology for his thoughts both on the Spencer-Braswell paper and the histrionic resignation by the editor. Here’s Emanuel:
About the paper: I read it when it first came out, and thought that some of their findings were significant and important. Basically, it presented evidence that feedbacks inferred from short-period and/or local climate change observations might not be relevant to long-period global change. I suppose I thought that rather obvious, but not everyone agrees. The one statement in the paper, to the effect that climate models might be overestimating positive feedback, struck me as unsubstantiated, but the authors themselves phrased it as speculative.
But the interesting and unusual thing about this is that what pundits said about the paper, and indeed what Spencer said about it in press releases, etc., in my view had very little to do with the paper itself. I have seldom seen such a degree of disconnect between the substance of a paper and what has been said about it.
Gavin Schmidt of Real Climate and NASA has posted a thorough and useful dissection of the situation, “Resignations, retractions and the process of science,” that comes to what I see as the right conclusion:
I think (rightly) that people feel that the best way to deal with these papers is within the literature itself, and in this case it is happening this week in GRL (Dessler, 2011) [the Dessler paper discussed above], and in Remote Sensing in a few months. That’s the way it should be, and neither resignations nor retractions are likely to become more dominant – despite the amount of popcorn being passed around.
There’s more useful context and analysis from Keith Kloor, who notes the role played by the Drudge Report in amping up the story (blogging at the Yale Forum on Climate Change and the Media), Mike Lemonick, Judith Curry and many others.
As always happens after such episodes, the one clear finding is that clouds remain a complicating component in efforts to project warming from the building greenhouse effect.
Joni Mitchell’s classic, with a bit of mangling, sums things up well:
They’ve looked at clouds from all sides now, as feedback and forcing, and still somehow, it’s clouds’ illusions most often recalled. More work is needed to know clouds at all.
8:52 p.m. | Postscript |
There’s more coverage of the Spencer-Braswell paper at Knight Science Journalism Tracker and the blogs of Roger Pielke, Sr. and William M. Briggs. Roy Spencer has posted a piece titled “More Thoughts on the War Being Waged Against Us.”
Philosophers Notwithstanding, Kansas School Board Redefines Science (N.Y. Times)
By DENNIS OVERBYE
Published: November 15, 2005
Once it was the left who wanted to redefine science.
In the early 1990’s, writers like the Czech playwright and former president Vaclav Havel and the French philosopher Bruno Latour proclaimed “the end of objectivity.” The laws of science were constructed rather than discovered, some academics said; science was just another way of looking at the world, a servant of corporate and military interests. Everybody had a claim on truth.
The right defended the traditional notion of science back then. Now it is the right that is trying to change it.
On Tuesday, fueled by the popular opposition to the Darwinian theory of evolution, the Kansas State Board of Education stepped into this fraught philosophical territory. In the course of revising the state’s science standards to include criticism of evolution, the board promulgated a new definition of science itself.
The changes in the official state definition are subtle and lawyerly, and involve mainly the removal of two words: “natural explanations.” But they are a red flag to scientists, who say the changes obliterate the distinction between the natural and the supernatural that goes back to Galileo and the foundations of science.
The old definition reads in part, “Science is the human activity of seeking natural explanations for what we observe in the world around us.” The new one calls science “a systematic method of continuing investigation that uses observation, hypothesis testing, measurement, experimentation, logical argument and theory building to lead to more adequate explanations of natural phenomena.”
Adrian Melott, a physics professor at the University of Kansas who has long been fighting Darwin’s opponents, said, “The only reason to take out ‘natural explanations’ is if you want to open the door to supernatural explanations.”
Gerald Holton, a professor of the history of science at Harvard, said removing those two words and the framework they set means “anything goes.”
The authors of these changes say that presuming the laws of science can explain all natural phenomena promotes materialism, secular humanism, atheism and leads to the idea that life is accidental. Indeed, they say in material online at kansasscience2005.com, it may even be unconstitutional to promulgate that attitude in a classroom because it is not ideologically “neutral.”
But many scientists say that characterization is an overstatement of the claims of science. The scientist’s job description, said Steven Weinberg, a physicist and Nobel laureate at the University of Texas, is to search for natural explanations, just as a mechanic looks for mechanical reasons why a car won’t run.
“This doesn’t mean that they commit themselves to the view that this is all there is,” Dr. Weinberg wrote in an e-mail message. “Many scientists (including me) think that this is the case, but other scientists are religious, and believe that what is observed in nature is at least in part a result of God’s will.”
The opposition to evolution, of course, is as old as the theory itself. “This is a very long story,” said Dr. Holton, who attributed its recent prominence to politics and the drive by many religious conservatives to tar science with the brush of materialism.
How long the Kansas changes will last is anyone’s guess. The state board tried to abolish the teaching of evolution and the Big Bang in schools six years ago, only to reverse course in 2001.
As it happened, the Kansas vote last week came on the same day that voters in Dover, Pa., ousted the local school board that had been sued for introducing the teaching of intelligent design.
As Dr. Weinberg noted, scientists and philosophers have been trying to define science, mostly unsuccessfully, for centuries.
When pressed for a definition of what they do, many scientists eventually fall back on the notion of falsifiability propounded by the philosopher Karl Popper. A scientific statement, he said, is one that can be proved wrong, like “the sun always rises in the east” or “light in a vacuum travels 186,000 miles a second.” By Popper’s rules, a law of science can never be proved; it can only be used to make a prediction that can be tested, with the possibility of being proved wrong.
But the rules get fuzzy in practice. For example, what is the role of intuition in analyzing a foggy set of data points? James Robert Brown, a philosopher of science at the University of Toronto, said in an e-mail message: “It’s the widespread belief that so-called scientific method is a clear, well-understood thing. Not so.” It is learned by doing, he added, and for that good examples and teachers are needed.
One thing scientists agree on, though, is that the requirement of testability excludes supernatural explanations. The supernatural, by definition, does not have to follow any rules or regularities, so it cannot be tested. “The only claim regularly made by the pro-science side is that supernatural explanations are empty,” Dr. Brown said.
The redefinition by the Kansas board will have nothing to do with how science is performed, in Kansas or anywhere else. But Dr. Holton said that if more states changed their standards, it could complicate the lives of science teachers and students around the nation.
He added that Galileo – who started it all, and paid the price – had “a wonderful way” of separating the supernatural from the natural. There are two equally worthy ways to understand the divine, Galileo said. “One was reverent contemplation of the Bible, God’s word,” Dr. Holton said. “The other was through scientific contemplation of the world, which is his creation.
“That is the view that I hope the Kansas school board would have adopted.”
In the Land of Denial (N.Y. Times)
NY Times editorial
September 6, 2011
The Republican presidential contenders regard global warming as a hoax or, at best, underplay its importance. The most vocal denier is Rick Perry, the Texas governor and longtime friend of the oil industry, who insists that climate change is an unproven theory created by “a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects.”
Never mind that nearly all the world’s scientists regard global warming as a serious threat to the planet, with human activities like the burning of fossil fuels a major cause. Never mind that multiple investigations have found no evidence of scientific manipulation. Never mind that America needs a national policy. Mr. Perry has a big soapbox, and what he says, however fallacious, reaches a bigger audience than any scientist can command.
With one exception — make that one-and-one-half — the rest of the Republican presidential field also rejects the scientific consensus. The exception is Jon Huntsman Jr., a former ambassador to China and former governor of Utah, who recently wrote on Twitter: “I believe in evolution and trust scientists on global warming. Call me crazy.” The one-half exception is Mitt Romney, who accepted the science when he was governor of Massachusetts and argued for reducing emissions. Lately, he’s retreated into mush: “Do I think the world’s getting hotter? Yeah, I don’t know that, but I think that it is.” As for the human contribution: “It could be a little. It could be a lot.”
The others flatly repudiate the science. Ron Paul of Texas calls global warming “the greatest hoax I think that has been around for many, many years.” Michele Bachmann of Minnesota once said that carbon dioxide was nothing to fear because it is a “natural byproduct of nature” and has complained of “manufactured science.” Rick Santorum, a former senator from Pennsylvania, has called climate change “a beautifully concocted scheme” that is “just an excuse for more government control of your life.”
Newt Gingrich’s full record on climate change has been a series of epic flip-flops. In 2008, he appeared on television with Nancy Pelosi, the former House speaker, to say that “our country must take action to address climate change.” He now says the appearance was a mistake.
None of the candidates endorse a mandatory limit on emissions or, for that matter, a truly robust clean energy program. This includes Mr. Huntsman. In 2007, as Utah governor, he joined with Arnold Schwarzenegger, then the governor of California, in creating the Western Climate Initiative, a market-based cap-and-trade program aimed at reducing emissions in Western states. Cap-and-trade has since acquired a toxic political reputation, especially among Republicans, and Mr. Huntsman has backed away.
The economic downturn has made addressing climate change less urgent for voters. But the issue is not going away. The nation badly needs a candidate with a coherent, disciplined national strategy. So far, there is no Republican who fits that description.
Jaguaribara, Ceará’s First Planned City, Has Its Power Cut Off for Nonpayment (O Globo)
Published September 7, 2011, at 8:24 a.m.
Globo.com/Portal Verdes Mares
SÃO PAULO – The first fully planned city in the state of Ceará, Jaguaribara has been partially in the dark for a month. There is no light in the squares, the streets, or even the cemetery. Citing nonpayment, Coelce, the power company, won in court the right to cut off the municipality’s electricity supply. Homes and sites of public interest, such as hospitals, still receive power normally. The city government admits it has not paid its electricity bills for five years. The mayor says the state government cut off its transfers of funds.
According to Mayor Edvaldo Silveira, the state government stopped transferring about R$ 96,000 a month, the result of an agreement signed in 2000, when Nova Jaguaribara, the planned city, was inaugurated. The old Jaguaribara was flooded by the waters of the Castanhão reservoir. That money, according to the mayor, was earmarked for paying for public lighting.
Also according to the mayor, the new city was designed for 70,000 inhabitants. This means that the infrastructure of the city, which previously housed 9,000 people, was expanded. The municipality gained an Olympic-style sports complex in place of its old sports court. Fourteen public squares were also built. The cemetery alone received 25 light poles. In practice, the city government’s electricity bill went up. The once-rural city now has almost 70 percent of its residents living in urban areas.
“Spending on public lighting has increased,” the mayor says.
He says the cost of public lighting is not passed on to the population.
After going dark, this week the city government also had its telephone lines cut off. The reason is the same: nonpayment.
The mayor says he is waiting for funds from the state government to pay the bill and regularize the municipality’s situation. He believes both the electricity and the telephone problems should be resolved this week.
Read more about this subject here.
Bipolar Flânerie (FSP)
Melancholy, from Romantic eccentricity to pharmaceutical pathology
Folha de S.Paulo, Ilustríssima
São Paulo, Sunday, September 4, 2011
By MARIA RITA KEHL
Described until modernity as a phenomenon of culture, a sign of eccentricity and reclusion, melancholy lost its creative character with the advent of psychoanalysis. In the 21st century, it has been converted into a “bipolar” pathology. The publication of a 17th-century classic and a film by Lars von Trier bring the melancholic back onto the scene.
THE PLANET MELANCHOLIA is not the black sun of Nerval’s poem. It is a relentless moon whose runaway orbit brings it ever closer to the defenseless Earth until it causes a devastating collision.
Lars von Trier’s film mixes science fiction with moral parable, sophisticated and somewhat naive, as befits the genre. The destruction of the world by melancholy is preceded by a long commentary on the loss of life’s meaning, at least among the inhabitants of the society Trier has been criticizing since “Dancer in the Dark” (2000), a society whose imaginary the Danish filmmaker, confident in his paranoiac-critical method, knows through the movies without ever having set foot there: the United States.
Throughout the film, Trier scatters indications of his familiarity with the history of melancholy in the West. The filmmaker, who made himself “persona non grata” at Cannes with outrageous provocations in defense of Hitler, showed that he understands the melancholic’s position as that of a subject at odds with what is considered the Good in the world he inhabits. In “Melancholia,” this is the position of Justine (Kirsten Dunst), about to marry a young man so obliging in his eagerness to please her that he presents his bride with a photograph of the apple trees in whose shade she is expected to be happy.
Happy? The prospect of a future frozen in a perpetual image also freezes Justine’s desire; she falls out of her role and ruins the extravagantly expensive party organized by her sister, a party full of rituals designed to produce the effects of “happiness” demanded of the children of the society of abundance.
SOCIAL SYMPTOM Even if it lacked the merit of exposing the stupidity of the contemporary faith in “happiness effects” as the measure of all things, Trier’s film would still be worthwhile for rehabilitating the figure of melancholy as an indicator of the social symptom.
For more than two millennia, the oscillations of the melancholic sensibility questioned Western culture about the border separating the madman from the genius. Since classical Antiquity, the melancholic, incapable of responding to the “demand of the Other,” denounced what was going wrong in the social bond.
The crisis that leads Justine to blow up her love commitment, her wedding party, and her job in a single night is conducted with didactic precision by the director. A cruel remark from her mother (a perfect representation of the mother of the Freudian melancholic), followed by her father’s indifference, triggers in Justine a genuine crisis of faith. Suddenly, the bride excludes herself from the scene in which she was supposed to be the main protagonist. She no longer believes. She falls out of the imaginary web that sustains what we usually call reality, a collective fiction capable of endowing life with meaning and value.
Justine, incapable of looking at the world through the veil of fantasy that comforts others, “the so-called sane” (as in Pessoa’s verse), sees what the scene conceals. She does not fear the arrival of Melancholia because she was never able to delude herself about the finitude of everything that exists. Justine “sees things.” Arid is the life of one who sees too much because she does not know how to fantasize.
EXCEPTION Since Antiquity, the melancholic has been understood in the West as one who occupies a place of exception in culture. The melancholic pathos was explained by Hippocrates and Galen on the basis of the theory of the four humors regulating the workings of body and soul. The oscillations of black bile would make the melancholic an inconstant being, at once sickly and brilliant, driven to create in order to appease the swings of his temperament.
At the core of his reflection “The Man of Genius and Melancholy” (Problem XXX), Aristotle had already discerned an ethical question concerning the melancholic’s emotional excesses and an aesthetic question concerning the creative genius. Hence the uncomfortable role that fell to him: to question the signifiers that sustain the imaginary of his age.
THE 19TH CENTURY The tradition inaugurated by Aristotle ends with Baudelaire in the 19th century: the last of the Romantics, the first of the moderns, according to another brilliant melancholic, Walter Benjamin. To endure the highs and lows of their temperament and give some destination to their eccentricity, some melancholics devoted themselves to trying to understand their malady.
English classicism produced the most complete compendium on melancholy ever known, the lifework of the Oxford librarian Robert Burton (1577-1640).
His “The Anatomy of Melancholy,” published in 1621 and reissued several times in the following decades, is a compendium of more than 1,400 pages containing everything that could then be known about its author’s “disease.” The press of the Universidade Federal do Paraná has just released in Brazil the first volume of “A Anatomia da Melancolia” [translated by Guilherme Gontijo Flores, 265 pages, price not yet set].
It is a pity that this first volume is limited to the author’s long introduction to his readers. We hope the Editora UFPR will soon publish a selection of the book’s chapters, which begin with the causes of melancholy (“Delirium, frenzy, madness” […] “Solitude and idleness” […] “The force of imagination”…), continue with a description of the palliatives that ease the suffering (“mirth, good company, fine objects…”), and end by addressing love melancholy and religious melancholy.
The author signed the work as Democritus Junior, affirming his identification with the philosopher who, according to Hippocrates’ description, withdrew from the company of men and, faced with the emptiness of the world, used to laugh at everything. The melancholic’s laughter is an expression of scorn at the illusions of others.
Burton’s undertaking was possible only in an era in which melancholy was understood not merely as a disease but as a phenomenon of culture. Aristotle’s seminal text already contained a reflection on the melancholic’s creative capacity, attributed to the instability that drives him to expand his soul in all directions of the universe.
FREUD Tal processo de desidentificação encontra-se também no diagnóstico freudiano, ao qual falta, entretanto, a contrapartida da mimesis. Solto da rede imaginária que o enlaça a si mesmo e ao mundo, o melancólico contemporâneo só conta de encarar o Real com a aridez do simbólico.
Algo se passou, na modernidade, para que a inconsistência imaginária do melancólico deixasse de estimulá-lo a reinventar as representações do mundo e ficasse à mercê da Coisa. A receita preparada para Justine tem gosto de cinzas; fios de lã invisíveis impedem suas pernas de andar. Diante desse horror, ela prefere a colisão com Melancholia.
A melancolia deixou de ser entendida como um desajuste referido às normas da vida pública quando Freud arrebatou o significante de seu sentido tradicional a fim de trazer para o campo da psicanálise o diagnóstico psiquiátrico da então chamada psicose maníaco-depressiva -que hoje a medicina retomou sob a designação de transtorno bipolar.
Freud não privatizou a melancolia por acaso: a própria psicanálise deve sua existência ao surgimento do sujeito neurótico gerado nas tramas da família burguesa, fechada sobre si mesma e fundada em compromissos de amor. A psicanálise freudiana é contemporânea ao acabamento da forma subjetiva do indivíduo e à privatização das tarefas de socialização das crianças.
Vem daí que o melancólico freudiano não se pareça em nada com seus colegas pré-modernos: o valente guerreiro exposto à vergonha diante de seus pares (Ajax), o anacoreta em crise de fé (santo Antônio), o pensador renascentista ocupado em restaurar a ordem de um mundo em constante transformação (como na gravura de Dürer). Nem faz lembrar, na aurora modernidade, o “flâneur” a recolher restos de um mundo em ruínas pelas ruas de uma grande cidade (Baudelaire) de modo a compor um monumento poético para fazer face à barbárie.
O melancólico freudiano é o bebê repudiado pela mãe, pobre eu transformado em dejeto sobre o qual caiu a sombra de um objeto mau. O que se perdeu na transição efetuada pela psicanálise foi o valor criativo que se atribuía ao melancólico, da Antiguidade ao romantismo. Perdeu-se o valor do polo maníaco do que hoje a medicina chama de transtorno bipolar.
Onde o melancólico pré-moderno, em seus momentos de euforia, era dado a expansões da imaginação poética, hoje a mania leva os pacientes “bipolares” a torrar dinheiro no cartão de crédito. O consumo é o ato que expressa os atuais clientes da psicofarmacologia, apartados da potência criadora que sua inadaptação ao mundo poderia lhes conferir.
DEPRESSÃO Já não existem melancólicos como os de antigamente? Os neurocientistas que o digam. A psiquiatria e a indústria farmacêutica já escolheram seu substituto no século 21: no lugar do significante melancolia, instala-se a depressão como grande sintoma do mal-estar na civilização do terceiro milênio. Quanto mais se sofistica a oferta de antidepressivos, mais a depressão se anuncia no horizonte como expressão privilegiada do mal-estar, a ameaçar sociedades que se dedicam a ignorar o saber que ela contém.
Tal produção ativa de ignorância a respeito do sentido da melancolia está no centro da parábola de Lars von Trier. John, cunhado de Justine, afirma sua fé no mundo das mercadorias. Abastece a casa com comida, combustível, geradores de energia. Confia na informação científica divulgada pela internet. Verifica no telescópio a aproximação do planeta ameaçador.
His defense is so fragile that, faced with the inevitable, he kills himself with an overdose of his wife's pills. Claire, for her part, has great faith in the staging of life. The failure of her sister's spectacular wedding does not keep her from planning another small ritual, on the house's beautiful veranda, with music and wine, to await the arrival of Melancholia. An excellent ending for a Hollywood melodrama, which Justine dismisses with contempt.
Justine has no illusions about the end. Even so, to protect her nephew from the final horror, she proves capable of creating the most omnipotent of fantasies. With him she builds a fragile "magic" tent, under which they shelter to await the burst of light brought by the collision with Melancholia.
The triangle formed by three branches joined at the top does not quite create an illusion: they are like the strokes of a writing, like a signifier marking out, "in extremis," a human territory in the face of the Real.
The Responsibility of Intellectuals, Redux (Boston Review)
Boston Review – SEPTEMBER/OCTOBER 2011
Using Privilege to Challenge the State
Noam Chomsky
A San Francisco mural depicting Archbishop Óscar Romero / Photograph: Franco Folini
Since we often cannot see what is happening before our eyes, it is perhaps not too surprising that what is at a slight distance removed is utterly invisible. We have just witnessed an instructive example: President Obama’s dispatch of 79 commandos into Pakistan on May 1 to carry out what was evidently a planned assassination of the prime suspect in the terrorist atrocities of 9/11, Osama bin Laden. Though the target of the operation, unarmed and with no protection, could easily have been apprehended, he was simply murdered, his body dumped at sea without autopsy. The action was deemed “just and necessary” in the liberal press. There will be no trial, as there was in the case of Nazi criminals—a fact not overlooked by legal authorities abroad who approve of the operation but object to the procedure. As Elaine Scarry reminds us, the prohibition of assassination in international law traces back to a forceful denunciation of the practice by Abraham Lincoln, who condemned the call for assassination as “international outlawry” in 1863, an “outrage,” which “civilized nations” view with “horror” and merits the “sternest retaliation.”
In 1967, writing about the deceit and distortion surrounding the American invasion of Vietnam, I discussed the responsibility of intellectuals, borrowing the phrase from an important essay of Dwight Macdonald’s after World War II. With the tenth anniversary of 9/11 arriving, and widespread approval in the United States of the assassination of the chief suspect, it seems a fitting time to revisit that issue. But before thinking about the responsibility of intellectuals, it is worth clarifying to whom we are referring.
The concept of intellectuals in the modern sense gained prominence with the 1898 "Manifesto of the Intellectuals" produced by the Dreyfusards who, inspired by Émile Zola's open letter of protest to France's president, condemned both the framing of French artillery officer Alfred Dreyfus on charges of treason and the subsequent military cover-up. The Dreyfusards' stance conveys the image of intellectuals as defenders of justice, confronting power with courage and integrity. But they were hardly seen that way at the time. A minority of the educated classes, the Dreyfusards were bitterly condemned in the mainstream of intellectual life, in particular by prominent figures among "the immortals of the strongly anti-Dreyfusard Académie Française," Steven Lukes writes. To the novelist, politician, and anti-Dreyfusard leader Maurice Barrès, Dreyfusards were "anarchists of the lecture-platform." To another of these immortals, Ferdinand Brunetière, the very word "intellectual" signified "one of the most ridiculous eccentricities of our time—I mean the pretension of raising writers, scientists, professors and philologists to the rank of supermen," who dare to "treat our generals as idiots, our social institutions as absurd and our traditions as unhealthy."
Who then were the intellectuals? The minority inspired by Zola (who was sentenced to jail for libel, and fled the country)? Or the immortals of the academy? The question resonates through the ages, in one or another form, and today offers a framework for determining the “responsibility of intellectuals.” The phrase is ambiguous: does it refer to intellectuals’ moral responsibility as decent human beings in a position to use their privilege and status to advance the causes of freedom, justice, mercy, peace, and other such sentimental concerns? Or does it refer to the role they are expected to play, serving, not derogating, leadership and established institutions?
• • •
One answer came during World War I, when prominent intellectuals on all sides lined up enthusiastically in support of their own states.
In their “Manifesto of 93 German Intellectuals,” leading figures in one of the world’s most enlightened states called on the West to “have faith in us! Believe, that we shall carry on this war to the end as a civilized nation, to whom the legacy of a Goethe, a Beethoven, and a Kant, is just as sacred as its own hearths and homes.” Their counterparts on the other side of the intellectual trenches matched them in enthusiasm for the noble cause, but went beyond in self-adulation. In The New Republic they proclaimed, “The effective and decisive work on behalf of the war has been accomplished by . . . a class which must be comprehensively but loosely described as the ‘intellectuals.’” These progressives believed they were ensuring that the United States entered the war “under the influence of a moral verdict reached, after the utmost deliberation by the more thoughtful members of the community.” They were, in fact, the victims of concoctions of the British Ministry of Information, which secretly sought “to direct the thought of most of the world,” but particularly the thought of American progressive intellectuals who might help to whip a pacifist country into war fever.
John Dewey was impressed by the great “psychological and educational lesson” of the war, which proved that human beings—more precisely, “the intelligent men of the community”—can “take hold of human affairs and manage them . . . deliberately and intelligently” to achieve the ends sought, admirable by definition.
Not everyone toed the line so obediently, of course. Notable figures such as Bertrand Russell, Eugene Debs, Rosa Luxemburg, and Karl Liebknecht were, like Zola, sentenced to prison. Debs was punished with particular severity—a ten-year prison term for raising questions about President Wilson's "war for democracy and human rights." Wilson refused him amnesty after the war ended, though Harding finally relented. Some, such as Thorstein Veblen, were chastised but treated less harshly; Veblen was fired from his position in the Food Administration after preparing a report showing that the shortage of farm labor could be overcome by ending Wilson's brutal persecution of labor, specifically the Industrial Workers of the World. Randolph Bourne was dropped by the progressive journals after criticizing the "league of benevolently imperialistic nations" and their exalted endeavors.
The pattern of praise and punishment is a familiar one throughout history: those who line up in the service of the state are typically praised by the general intellectual community, and those who refuse to line up in service of the state are punished. Thus in retrospect Wilson and the progressive intellectuals who offered him their services are greatly honored, but not Debs. Luxemburg and Liebknecht were murdered and have hardly been heroes of the intellectual mainstream. Russell continued to be bitterly condemned until after his death—and in current biographies still is.
Since power tends to prevail, intellectuals who serve their governments are considered the responsible ones.
In the 1970s prominent scholars distinguished the two categories of intellectuals more explicitly. A 1975 study, The Crisis of Democracy, labeled Brunetière’s ridiculous eccentrics “value-oriented intellectuals” who pose a “challenge to democratic government which is, potentially at least, as serious as those posed in the past by aristocratic cliques, fascist movements, and communist parties.” Among other misdeeds, these dangerous creatures “devote themselves to the derogation of leadership, the challenging of authority,” and they challenge the institutions responsible for “the indoctrination of the young.” Some even sink to the depths of questioning the nobility of war aims, as Bourne had. This castigation of the miscreants who question authority and the established order was delivered by the scholars of the liberal internationalist Trilateral Commission; the Carter administration was largely drawn from their ranks.
Like The New Republic progressives during World War I, the authors of The Crisis of Democracy extend the concept of the “intellectual” beyond Brunetière’s ridiculous eccentrics to include the better sort as well: the “technocratic and policy-oriented intellectuals,” responsible and serious thinkers who devote themselves to the constructive work of shaping policy within established institutions and to ensuring that indoctrination of the young proceeds on course.
It took Dewey only a few years to shift from the responsible technocratic and policy-oriented intellectual of World War I to an anarchist of the lecture-platform, as he denounced the “un-free press” and questioned “how far genuine intellectual freedom and social responsibility are possible on any large scale under the existing economic regime.”
What particularly troubled the Trilateral scholars was the “excess of democracy” during the time of troubles, the 1960s, when normally passive and apathetic parts of the population entered the political arena to advance their concerns: minorities, women, the young, the old, working people . . . in short, the population, sometimes called the “special interests.” They are to be distinguished from those whom Adam Smith called the “masters of mankind,” who are “the principal architects” of government policy and pursue their “vile maxim”: “All for ourselves and nothing for other people.” The role of the masters in the political arena is not deplored, or discussed, in the Trilateral volume, presumably because the masters represent “the national interest,” like those who applauded themselves for leading the country to war “after the utmost deliberation by the more thoughtful members of the community” had reached its “moral verdict.”
To overcome the excessive burden imposed on the state by the special interests, the Trilateralists called for more “moderation in democracy,” a return to passivity on the part of the less deserving, perhaps even a return to the happy days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers,” and democracy therefore flourished.
The Trilateralists could well have claimed to be adhering to the original intent of the Constitution, "intrinsically an aristocratic document designed to check the democratic tendencies of the period" by delivering power to a "better sort" of people and barring "those who were not rich, well born, or prominent from exercising political power," in the accurate words of the historian Gordon Wood. In Madison's defense, however, we should recognize that his mentality was pre-capitalist. In determining that power should be in the hands of "the wealth of the nation," "the more capable set of men," he envisioned those men on the model of the "enlightened Statesman" and "benevolent philosopher" of the imagined Roman world. They would be "pure and noble," "men of intelligence, patriotism, property, and independent circumstances" "whose wisdom may best discern the true interest of their country, and whose patriotism and love of justice will be least likely to sacrifice it to temporary or partial considerations." So endowed, these men would "refine and enlarge the public views," guarding the public interest against the "mischiefs" of democratic majorities. In a similar vein, the progressive Wilsonian intellectuals might have taken comfort in the discoveries of the behavioral sciences, explained in 1939 by the psychologist and education theorist Edward Thorndike:
It is the great good fortune of mankind that there is a substantial correlation between intelligence and morality including good will toward one’s fellows . . . . Consequently our superiors in ability are on the average our benefactors, and it is often safer to trust our interests to them than to ourselves.
A comforting doctrine, though some might feel that Adam Smith had the sharper eye.
• • •
Since power tends to prevail, intellectuals who serve their governments are considered responsible, and value-oriented intellectuals are dismissed or denigrated. At home that is.
With regard to enemies, the distinction between the two categories of intellectuals is retained, but with values reversed. In the old Soviet Union, the value-oriented intellectuals were the honored dissidents, while we had only contempt for the apparatchiks and commissars, the technocratic and policy-oriented intellectuals. Similarly in Iran we honor the courageous dissidents and condemn those who defend the clerical establishment. And elsewhere generally.
The honorable term “dissident” is used selectively. It does not, of course, apply, with its favorable connotations, to value-oriented intellectuals at home or to those who combat U.S.-supported tyranny abroad. Take the interesting case of Nelson Mandela, who was removed from the official terrorist list in 2008, and can now travel to the United States without special authorization.
Father Ignacio Ellacuría / Photograph: Gervasio Sánchez
Twenty years earlier, he was the criminal leader of one of the world’s “more notorious terrorist groups,” according to a Pentagon report. That is why President Reagan had to support the apartheid regime, increasing trade with South Africa in violation of congressional sanctions and supporting South Africa’s depredations in neighboring countries, which led, according to a UN study, to 1.5 million deaths. That was only one episode in the war on terrorism that Reagan declared to combat “the plague of the modern age,” or, as Secretary of State George Shultz had it, “a return to barbarism in the modern age.” We may add hundreds of thousands of corpses in Central America and tens of thousands more in the Middle East, among other achievements. Small wonder that the Great Communicator is worshipped by Hoover Institution scholars as a colossus whose “spirit seems to stride the country, watching us like a warm and friendly ghost,” recently honored further by a statue that defaces the American Embassy in London.
What particularly troubled the Trilateral scholars was the ‘excess of democracy’ in the 1960s.
The Latin American case is revealing. Those who called for freedom and justice in Latin America are not admitted to the pantheon of honored dissidents. For example, a week after the fall of the Berlin Wall, six leading Latin American intellectuals, all Jesuit priests, had their heads blown off on the direct orders of the Salvadoran high command. The perpetrators were from an elite battalion armed and trained by Washington that had already left a gruesome trail of blood and terror, and had just returned from renewed training at the John F. Kennedy Special Warfare Center and School at Fort Bragg, North Carolina. The murdered priests are not commemorated as honored dissidents, nor are others like them throughout the hemisphere. Honored dissidents are those who called for freedom in enemy domains in Eastern Europe, who certainly suffered, but not remotely like their counterparts in Latin America.
The distinction is worth examination, and tells us a lot about the two senses of the phrase “responsibility of intellectuals,” and about ourselves. It is not seriously in question, as John Coatsworth writes in the recently published Cambridge University History of the Cold War, that from 1960 to “the Soviet collapse in 1990, the numbers of political prisoners, torture victims, and executions of nonviolent political dissenters in Latin America vastly exceeded those in the Soviet Union and its East European satellites.” Among the executed were many religious martyrs, and there were mass slaughters as well, consistently supported or initiated by Washington.
Why then the distinction? It might be argued that what happened in Eastern Europe is far more momentous than the fate of the South at our hands. It would be interesting to see the argument spelled out. And also to see the argument explaining why we should disregard elementary moral principles, among them that if we are serious about suffering and atrocities, about justice and rights, we will focus our efforts on where we can do the most good—typically, where we share responsibility for what is being done. We have no difficulty demanding that our enemies follow such principles.
Few of us care, or should, what Andrei Sakharov or Shirin Ebadi say about U.S. or Israeli crimes; we admire them for what they say and do about those of their own states, and the conclusion holds far more strongly for those who live in more free and democratic societies, and therefore have far greater opportunities to act effectively. It is of some interest that in the most respected circles, practice is virtually the opposite of what elementary moral values dictate.
But let us conform and keep only to the matter of historical import.
The U.S. wars in Latin America from 1960 to 1990, quite apart from their horrors, have long-term historical significance. To consider just one important aspect, in no small measure they were wars against the Church, undertaken to crush a terrible heresy proclaimed at Vatican II in 1962, which, under the leadership of Pope John XXIII, “ushered in a new era in the history of the Catholic Church,” in the words of the distinguished theologian Hans Küng, restoring the teachings of the gospels that had been put to rest in the fourth century when the Emperor Constantine established Christianity as the religion of the Roman Empire, instituting “a revolution” that converted “the persecuted church” to a “persecuting church.” The heresy of Vatican II was taken up by Latin American bishops who adopted the “preferential option for the poor.” Priests, nuns, and laypersons then brought the radical pacifist message of the gospels to the poor, helping them organize to ameliorate their bitter fate in the domains of U.S. power.
That same year, 1962, President Kennedy made several critical decisions. One was to shift the mission of the militaries of Latin America from “hemispheric defense”—an anachronism from World War II—to “internal security,” in effect, war against the domestic population, if they raise their heads. Charles Maechling, who led U.S. counterinsurgency and internal defense planning from 1961 to 1966, describes the unsurprising consequences of the 1962 decision as a shift from toleration “of the rapacity and cruelty of the Latin American military” to “direct complicity” in their crimes to U.S. support for “the methods of Heinrich Himmler’s extermination squads.” One major initiative was a military coup in Brazil, planned in Washington and implemented shortly after Kennedy’s assassination, instituting a murderous and brutal national security state. The plague of repression then spread through the hemisphere, including the 1973 coup installing the Pinochet dictatorship, and later the most vicious of all, the Argentine dictatorship, Reagan’s favorite. Central America’s turn—not for the first time—came in the 1980s under the leadership of the “warm and friendly ghost” who is now revered for his achievements.
The murder of the Jesuit intellectuals as the Berlin wall fell was a final blow in defeating the heresy, culminating a decade of horror in El Salvador that opened with the assassination, by much the same hands, of Archbishop Óscar Romero, the “voice for the voiceless.” The victors in the war against the Church declare their responsibility with pride. The School of the Americas (since renamed), famous for its training of Latin American killers, announces as one of its “talking points” that the liberation theology that was initiated at Vatican II was “defeated with the assistance of the US army.”
Actually, the November 1989 assassinations were almost a final blow. More was needed.
A year later Haiti had its first free election, and to the surprise and shock of Washington, which like others had anticipated the easy victory of its own candidate from the privileged elite, the organized public in the slums and hills elected Jean-Bertrand Aristide, a popular priest committed to liberation theology. The United States at once moved to undermine the elected government, and after the military coup that overthrew it a few months later, lent substantial support to the vicious military junta and its elite supporters. Trade was increased in violation of international sanctions and increased further under Clinton, who also authorized the Texaco oil company to supply the murderous rulers, in defiance of his own directives.
I will skip the disgraceful aftermath, amply reviewed elsewhere, except to point out that in 2004, the two traditional torturers of Haiti, France and the United States, joined by Canada, forcefully intervened, kidnapped President Aristide (who had been elected again), and shipped him off to central Africa. He and his party were effectively barred from the farcical 2010–11 elections, the most recent episode in a horrendous history that goes back hundreds of years and is barely known among the perpetrators of the crimes, who prefer tales of dedicated efforts to save the suffering people from their grim fate.
If we are serious about justice, we will focus our efforts where we share responsibility for what is being done.
Another fateful Kennedy decision in 1962 was to send a special forces mission to Colombia, led by General William Yarborough, who advised the Colombian security forces to undertake “paramilitary, sabotage and/or terrorist activities against known communist proponents,” activities that “should be backed by the United States.” The meaning of the phrase “communist proponents” was spelled out by the respected president of the Colombian Permanent Committee for Human Rights, former Minister of Foreign Affairs Alfredo Vázquez Carrizosa, who wrote that the Kennedy administration “took great pains to transform our regular armies into counterinsurgency brigades, accepting the new strategy of the death squads,” ushering in
what is known in Latin America as the National Security Doctrine. . . . [not] defense against an external enemy, but a way to make the military establishment the masters of the game . . . [with] the right to combat the internal enemy, as set forth in the Brazilian doctrine, the Argentine doctrine, the Uruguayan doctrine, and the Colombian doctrine: it is the right to fight and to exterminate social workers, trade unionists, men and women who are not supportive of the establishment, and who are assumed to be communist extremists. And this could mean anyone, including human rights activists such as myself.
In a 1980 study, Lars Schoultz, the leading U.S. academic specialist on human rights in Latin America, found that U.S. aid “has tended to flow disproportionately to Latin American governments which torture their citizens . . . to the hemisphere’s relatively egregious violators of fundamental human rights.” That included military aid, was independent of need, and continued through the Carter years. Ever since the Reagan administration, it has been superfluous to carry out such a study. In the 1980s one of the most notorious violators was El Salvador, which accordingly became the leading recipient of U.S. military aid, to be replaced by Colombia when it took the lead as the worst violator of human rights in the hemisphere. Vázquez Carrizosa himself was living under heavy guard in his Bogotá residence when I visited him there in 2002 as part of a mission of Amnesty International, which was opening its year-long campaign to protect human rights defenders in Colombia because of the country’s horrifying record of attacks against human rights and labor activists, and mostly the usual victims of state terror: the poor and defenseless. Terror and torture in Colombia were supplemented by chemical warfare (“fumigation”), under the pretext of the war on drugs, leading to huge flight to urban slums and misery for the survivors. Colombia’s attorney general’s office now estimates that more than 140,000 people have been killed by paramilitaries, often acting in close collaboration with the U.S.-funded military.
Signs of the slaughter are everywhere. On a nearly impassible dirt road to a remote village in southern Colombia a year ago, my companions and I passed a small clearing with many simple crosses marking the graves of victims of a paramilitary attack on a local bus. Reports of the killings are graphic enough; spending a little time with the survivors, who are among the kindest and most compassionate people I have ever had the privilege of meeting, makes the picture more vivid, and only more painful.
This is the briefest sketch of terrible crimes for which Americans bear substantial culpability, and that we could easily ameliorate, at the very least.
But it is more gratifying to bask in praise for courageously protesting the abuses of official enemies, a fine activity, but not the priority of a value-oriented intellectual who takes the responsibilities of that stance seriously.
The victims within our domains, unlike those in enemy states, are not merely ignored and quickly forgotten, but are also cynically insulted. One striking illustration came a few weeks after the murder of the Latin American intellectuals in El Salvador. Vaclav Havel visited Washington and addressed a joint session of Congress. Before his enraptured audience, Havel lauded the “defenders of freedom” in Washington who “understood the responsibility that flowed from” being “the most powerful nation on earth”—crucially, their responsibility for the brutal assassination of his Salvadoran counterparts shortly before.
The liberal intellectual class was enthralled by his presentation. Havel reminds us that “we live in a romantic age,” Anthony Lewis gushed. Other prominent liberal commentators reveled in Havel’s “idealism, his irony, his humanity,” as he “preached a difficult doctrine of individual responsibility” while Congress “obviously ached with respect” for his genius and integrity; and asked why America lacks intellectuals so profound, who “elevate morality over self-interest” in this way, praising us for the tortured and mutilated corpses that litter the countries that we have left in misery. We need not tarry on what the reaction would have been had Father Ellacuría, the most prominent of the murdered Jesuit intellectuals, spoken such words at the Duma after elite forces armed and trained by the Soviet Union assassinated Havel and half a dozen of his associates—a performance that is inconceivable.
John Dewey / Photograph: New York Public Library / Photoresearchers, Inc.
The assassination of bin Laden, too, directs our attention to our insulted victims. There is much more to say about the operation—including Washington’s willingness to face a serious risk of major war and even leakage of fissile materials to jihadis, as I have discussed elsewhere—but let us keep to the choice of name: Operation Geronimo. The name caused outrage in Mexico and was protested by indigenous groups in the United States, but there seems to have been no further notice of the fact that Obama was identifying bin Laden with the Apache Indian chief. Geronimo led the courageous resistance to invaders who sought to consign his people to the fate of “that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty, among the heinous sins of this nation, for which I believe God will one day bring [it] to judgement,” in the words of the grand strategist John Quincy Adams, the intellectual architect of manifest destiny, uttered long after his own contributions to these sins. The casual choice of the name is reminiscent of the ease with which we name our murder weapons after victims of our crimes: Apache, Blackhawk, Cheyenne . . . We might react differently if the Luftwaffe were to call its fighter planes “Jew” and “Gypsy.”
The first 9/11, unlike the second, did not change the world. It was ‘nothing of very great consequence,’ Kissinger said.
Denial of these "heinous sins" is sometimes explicit. To mention a few recent cases, two years ago in one of the world's leading left-liberal intellectual journals, The New York Review of Books, Russell Baker outlined what he learned from the work of the "heroic historian" Edmund Morgan: namely, that when Columbus and the early explorers arrived they "found a continental vastness sparsely populated by farming and hunting people . . . . In the limitless and unspoiled world stretching from tropical jungle to the frozen north, there may have been scarcely more than a million inhabitants." The calculation is off by many tens of millions, and the "vastness" included advanced civilizations throughout the continent. No reactions appeared, though four months later the editors issued a correction, noting that in North America there may have been as many as 18 million people—and, unmentioned, tens of millions more "from tropical jungle to the frozen north." This was all well known decades ago—including the advanced civilizations and the "merciless and perfidious cruelty" of the "extermination"—but not important enough even for a casual phrase. In the London Review of Books a year later, the noted historian Mark Mazower mentioned American "mistreatment of the Native Americans," again eliciting no comment. Would we accept the word "mistreatment" for comparable crimes committed by enemies?
• • •
If the responsibility of intellectuals refers to their moral responsibility as decent human beings in a position to use their privilege and status to advance the cause of freedom, justice, mercy, and peace—and to speak out not simply about the abuses of our enemies, but, far more significantly, about the crimes in which we are implicated and can ameliorate or terminate if we choose—how should we think of 9/11?
The notion that 9/11 “changed the world” is widely held, understandably. The events of that day certainly had major consequences, domestic and international. One was to lead President Bush to re-declare Ronald Reagan’s war on terrorism—the first one has been effectively “disappeared,” to borrow the phrase of our favorite Latin American killers and torturers, presumably because the consequences do not fit well with preferred self images. Another consequence was the invasion of Afghanistan, then Iraq, and more recently military interventions in several other countries in the region and regular threats of an attack on Iran (“all options are open,” in the standard phrase). The costs, in every dimension, have been enormous. That suggests a rather obvious question, not asked for the first time: was there an alternative?
A number of analysts have observed that bin Laden won major successes in his war against the United States. “He repeatedly asserted that the only way to drive the U.S. from the Muslim world and defeat its satraps was by drawing Americans into a series of small but expensive wars that would ultimately bankrupt them,” the journalist Eric Margolis writes.
The United States, first under George W. Bush and then Barack Obama, rushed right into bin Laden’s trap. . . . Grotesquely overblown military outlays and debt addiction . . . . may be the most pernicious legacy of the man who thought he could defeat the United States.
A report from the Costs of War project at Brown University’s Watson Institute for International Studies estimates that the final bill will be $3.2–4 trillion. Quite an impressive achievement by bin Laden.
That Washington was intent on rushing into bin Laden’s trap was evident at once. Michael Scheuer, the senior CIA analyst responsible for tracking bin Laden from 1996 to 1999, writes, “Bin Laden has been precise in telling America the reasons he is waging war on us.” The al Qaeda leader, Scheuer continues, “is out to drastically alter U.S. and Western policies toward the Islamic world.”
And, as Scheuer explains, bin Laden largely succeeded: “U.S. forces and policies are completing the radicalization of the Islamic world, something Osama bin Laden has been trying to do with substantial but incomplete success since the early 1990s. As a result, I think it is fair to conclude that the United States of America remains bin Laden’s only indispensable ally.” And arguably remains so, even after his death.
There is good reason to believe that the jihadi movement could have been split and undermined after the 9/11 attack, which was criticized harshly within the movement. Furthermore, the “crime against humanity,” as it was rightly called, could have been approached as a crime, with an international operation to apprehend the likely suspects. That was recognized in the immediate aftermath of the attack, but no such idea was even considered by decision-makers in government. It seems no thought was given to the Taliban’s tentative offer—how serious an offer, we cannot know—to present the al Qaeda leaders for a judicial proceeding.
At the time, I quoted Robert Fisk’s conclusion that the horrendous crime of 9/11 was committed with “wickedness and awesome cruelty”—an accurate judgment. The crimes could have been even worse. Suppose that Flight 93, downed by courageous passengers in Pennsylvania, had bombed the White House, killing the president. Suppose that the perpetrators of the crime planned to, and did, impose a military dictatorship that killed thousands and tortured tens of thousands. Suppose the new dictatorship established, with the support of the criminals, an international terror center that helped impose similar torture-and-terror states elsewhere, and, as icing on the cake, brought in a team of economists—call them “the Kandahar boys”—who quickly drove the economy into one of the worst depressions in its history. That, plainly, would have been a lot worse than 9/11.
As we all should know, this is not a thought experiment. It happened. I am, of course, referring to what in Latin America is often called “the first 9/11”: September 11, 1973, when the United States succeeded in its intensive efforts to overthrow the democratic government of Salvador Allende in Chile with a military coup that placed General Pinochet’s ghastly regime in office. The dictatorship then installed the Chicago Boys—economists trained at the University of Chicago—to reshape Chile’s economy. Consider the economic destruction, the torture and kidnappings, and multiply the numbers killed by 25 to yield per capita equivalents, and you will see just how much more devastating the first 9/11 was.
Privilege yields opportunity, and opportunity confers responsibilities.
The goal of the overthrow, in the words of the Nixon administration, was to kill the “virus” that might encourage all those “foreigners [who] are out to screw us”—screw us by trying to take over their own resources and more generally to pursue a policy of independent development along lines disliked by Washington. In the background was the conclusion of Nixon’s National Security Council that if the United States could not control Latin America, it could not expect “to achieve a successful order elsewhere in the world.” Washington’s “credibility” would be undermined, as Henry Kissinger put it.
The first 9/11, unlike the second, did not change the world. It was “nothing of very great consequence,” Kissinger assured his boss a few days later. And judging by how it figures in conventional history, his words can hardly be faulted, though the survivors may see the matter differently.
These events of little consequence were not limited to the military coup that destroyed Chilean democracy and set in motion the horror story that followed. As already discussed, the first 9/11 was just one act in the drama that began in 1962 when Kennedy shifted the mission of the Latin American militaries to “internal security.” The shattering aftermath is also of little consequence, the familiar pattern when history is guarded by responsible intellectuals.
• • •
It seems to be close to a historical universal that conformist intellectuals, the ones who support official aims and ignore or rationalize official crimes, are honored and privileged in their own societies, and the value-oriented punished in one or another way. The pattern goes back to the earliest records. It was the man accused of corrupting the youth of Athens who drank the hemlock, much as Dreyfusards were accused of “corrupting souls, and, in due course, society as a whole” and the value-oriented intellectuals of the 1960s were charged with interference with “indoctrination of the young.”
In the Hebrew scriptures there are figures who by contemporary standards are dissident intellectuals, called “prophets” in the English translation. They bitterly angered the establishment with their critical geopolitical analysis, their condemnation of the crimes of the powerful, their calls for justice and concern for the poor and suffering. King Ahab, the most evil of the kings, denounced the Prophet Elijah as a hater of Israel, the first “self-hating Jew” or “anti-American” in the modern counterparts. The prophets were treated harshly, unlike the flatterers at the court, who were later condemned as false prophets. The pattern is understandable. It would be surprising if it were otherwise.
As for the responsibility of intellectuals, there does not seem to me to be much to say beyond some simple truths. Intellectuals are typically privileged—merely an observation about usage of the term. Privilege yields opportunity, and opportunity confers responsibilities. An individual then has choices.
The telenovela has missed the boat of history (Fapesp)
HUMANITIES | COMMUNICATION
The genre’s status as a privileged arena for discussing national questions is in decline
Carlos Haag
Print Edition – August 2011
© PHOTOGRAPHS TADEU VILANI
In 1981, during a serious political crisis in the Figueiredo administration, the all-powerful Golbery do Couto e Silva resigned from the government. To journalists he explained: “Don’t ask me anything. I have just come from Sucupira.” The reference to the fictional town of the telenovela O bem-amado (1973) and the miniseries of the same name (1980-1984), by Dias Gomes, at such a delicate moment reveals the power telenovelas then held as representations of national reality, and how Brazilians recognized themselves in those representations. “Drawing on conflicts of gender, generation, class, and religion, the telenovela produced chronicles of everyday life that turned it into a privileged stage for interpreting Brazil. The country, modernizing in a context centered on consumption rather than on the affirmation of citizenship, recognized itself on the TV screen in a white, glamorous universe,” explains Esther Hamburger, professor in the Department of Film, Radio and Television at the Universidade de São Paulo (USP) and author of the study O Brasil antenado (Jorge Zahar Editor). She analyzed the genre’s new directions in the research project Formação do campo intelectual e da indústria cultural no Brasil contemporâneo, supported by FAPESP and coordinated by USP sociologist Sérgio Miceli. Besides Esther, the project brings together researchers from various fields and on various topics.
“In a democratizing Brazil, the telenovela was the first to address issues that would set the political agenda in the following decade. Today, however, it has lost its privileged status as a forum for debating national questions. It no longer mobilizes public opinion, is no longer fully national, and is no longer the country’s showcase. It is probably no longer capable of synthesizing the country,” the researcher warns. “After all, that centralized country, amenable to a hegemonic representation, no longer exists. New media such as cable TV and the internet stripped the telenovela of its character as an arena of debate. Society has changed and there is great diversification. Literacy has risen and TV is no longer the only place to find information,” she observes. For Esther, in today’s country it is no longer possible for a telenovela to speak to the whole nation. “There is no longer one Brazil on TV, but several,” she concludes.
Decline – “The telenovela remains strategic for revenue and for competition among broadcasters, but its capacity to polarize national audiences is falling. The genre overuses messages with social content while losing its aesthetic distinctiveness and its polemical force. The nation is no longer the central theme, because the themes now cross borders. There are fewer and fewer references to current and controversial issues. The preference is for politically correct campaigns, often to the detriment of the drama, tying down the authors’ creativity,” Esther says. According to the researcher, the structure of melodramatic conflicts that sustains the narrative still holds, but in stories that once again confine themselves to spaces imagined as feminine (the original audience of the national telenovela’s early days) and of lesser cultural value. The genre also no longer attracts as much creative talent, with weak scripts and repetitive plots that insist on old clichés and conventions that succeeded in the past. “Even so, it cannot be denied that the telenovela could once again have the political and cultural impact it once had, influencing behavior and fashion. It is still a place where something can be learned, especially for the now-predominant audience below classes A and B,” she says.
From its apogee to the recent crisis of falling audiences was a long road. In the beginning the “fantasy” style reigned, full of sentimentality, in 1960s productions such as the exotic Sheik de Agadir, a paradigm broken by the realism of Beto Rockfeller, a portrait of the contemporary life of the emerging middle classes. In the 1970s the limits of melodrama were broken, but telenovelas became showcases of modern living: fashion and behavior. “During the dictatorship, Globo adopted the official discourse, but understood that in the telenovelas, instead of hiding problems, it was better to incorporate them into the plots, as it did in O bem-amado. That was the beginning of a growing critique of the modernization process,” recalls Mauro Porto, professor at Tulane University and author of the study Telenovelas and national identity in Brazil. Realism took over the genre: a 1988 survey revealed that 58% of respondents wanted to see “reality” in the telenovelas and 60% wanted the plots to address politics. “The authors, from a left-wing generation, saw themselves as responsible for a national project and for raising popular consciousness,” Porto notes. “The telenovelas recorded the dramas of urbanization, social differences, the fragmentation of the family, the liberalization of marital relations and of consumption patterns. They reached their apex when they addressed the problems of modernization, as in Vale tudo (1988) and Roque Santeiro (1985),” says Esther. But TV Manchete brought an alternative reading of the country with Pantanal, full of the exotic and the erotic, which broke the political cycle of telenovelas, even at Globo, which felt obliged to emulate the new concept. “The ‘Pantanal effect,’ however, left no heirs and is forgotten today.”
Intimacy – “Along this path, the telenovela created a shared repertoire through which people of different social classes, generations, sexes, races, and regions recognized one another, an ‘imagined community’ for debating Brazil, of intimacy with social problems, an ideal vehicle for building citizenship, a narrative of the nation,” analyzes Maria Immacolata Lopes, professor at the Escola de Comunicações e Artes (ECA-USP) and coordinator of the Núcleo de Pesquisa de Telenovelas. The model wore out and the country changed. “Between 1970 and 1980 there was a magic between the public and the telenovela. In Vale tudo, for the first time corruption was shown in a non-political public space, and the telenovelas were in the vanguard,” Esther notes. “Today corruption is banal, no longer controversial; it only brings the tedium of repetition. In 1988 it was a novelty; in 2011 it is a worn-out theme.” The telenovelas are no longer in tune with the country. “Even contemporary foreign academic literature on television no longer discusses the Brazilian telenovela, and the Brazilian ‘case’ has lost ground at home and abroad in the face of a renewal of international television fiction, especially American series, which are gaining space on national channels, a new flow of imported programming that the telenovelas had displaced in previous decades,” she explains. Today’s sitcoms, unlike in the past, when they were “closed works” without improvisation, are open to success indicators and can change course while on the air, bringing in allusions to political and cultural elements of American reality and problematizing the United States.
“We no longer have the same national audience across all classes and places. Everything became more popular, and the telenovelas serve this viewing public with social merchandising, sex, plot dynamics that change constantly, action, murders,” she analyzes. For the researcher, this break in the dramaturgy narrows the audience still further by driving away a large share of viewers. Esther cites new alternatives such as Cordel encantado, which harks back to the fantasy telenovelas. There is also the search for new authors and directors, or remakes of old hits such as O astro, to recover past formulas for success, but even adapted they retain the flavor of “something old.” “We do not know whether Brazilians still want realism, but it is certain that they have grown tired of urban telenovelas set on the Rio-São Paulo axis. They would like to see new realities and the regional dimension previously scorned or caricatured.” Renewal is not easy, as shown by the failure of experiments such as Cidade de Deus or Antonia. “One solution would be to show the violence of the cities, of the drug trade, but that is still taboo in the telenovelas. Cinema has proved more ‘tuned in’ by portraying the parallel powers of the peripheries, as in Tropa de elite. Or Dois filhos de Francisco, a film that shows a Brazil where the humble succeed.” For the first time, the telenovela has missed the boat of history. In a recent scandal, a political columnist did not use a telenovela quotation, as Golbery did, to comment on the affair, but the catchphrase from the film Tropa de elite: “Palocci, pede pra sair!” (“Palocci, ask to leave!”).
Mental illness rise linked to climate (Sydney Morning Herald)
Erik Jensen Health
“Emotional injury, stress and despair” … the impact of climate change on health. Photo: Reuter
RATES of mental illnesses including depression and post-traumatic stress will increase as a result of climate change, a report to be released today says.
The paper, prepared for the Climate Institute, says loss of social cohesion in the wake of severe weather events related to climate change could be linked to increased rates of anxiety, depression, post-traumatic stress and substance abuse.
As many as one in five people reported “emotional injury, stress and despair” in the wake of these events.
The report, A Climate of Suffering: The Real Cost of Living with Inaction on Climate Change, called the past 15 years a “preview of life under unrestrained global warming”.
“While cyclones, drought, bushfires and floods are all a normal part of Australian life, there is no doubt our climate is changing,” the report says.
“For instance, the intensity and frequency of bushfires is greater. This is a ‘new normal’, for which the past provides little guidance …
“Moreover, recent conditions are entirely consistent with the best scientific predictions: as the world warms so the weather becomes wilder, with big consequences for people’s health and well-being.”
The paper suggests a possible link between Australia’s recent decade-long drought and climate change. It points to a breakdown of social cohesion caused by loss of work and associated stability, adding that the suicide rate in rural communities rose by 8 per cent.
The report also looks at mental health in the aftermath of major weather events possibly linked to climate change.
It shows that one in 10 primary school children reported symptoms of post-traumatic stress disorder in the wake of cyclone Larry in 2006. More than one in 10 reported symptoms more than three months after the cyclone.
“There’s really clear evidence around severe weather events,” the executive director of the Brain and Mind Research Institute, Professor Ian Hickie, said.
“We’re now more sophisticated in understanding the mental health effects and these effects are one of the major factors.
“What we have seriously underestimated is the effects on social cohesion. That is very hard to rebuild and they are critical to the mental health of an individual.”
Professor Hickie, who is launching the report today, said climate change and particularly severe weather events were likely to be a major factor influencing mental health in the future.
“When we talk about the next 50 years and what are going to be the big drivers at the community level of mental health costs, one we need to factor in are severe weather events, catastrophic weather events,” he said.
Educating the Obvious (N.Y.Times)
By BRIAN McFADDEN. Published: August 27, 2011
More than half of Brazil’s pupils cannot solve basic mathematical operations (JC, O Globo)
JC e-mail 4331, August 26, 2011.
The Prova ABC assessed the performance of newly literate pupils; in reading the results were better: 56.1% showed command of the language.
Results of a test given to six thousand pupils in all the state capitals and the Federal District show that 57.2% of students in the 3rd year of primary education (the former 2nd grade) cannot solve basic mathematics problems such as addition or subtraction. The first of its kind in the country, the Prova ABC also assessed reading and writing. “The difficulty comes with the ‘carry the one’ part of the sum,” professor Rubem Klein of the Fundação Cesgranrio explained yesterday in São Paulo, at the release of the results, referring to the addition of numbers above ten.
The Prova ABC, or Avaliação Brasileira do Final do Ciclo de Alfabetização (Brazilian Assessment at the End of the Literacy Cycle), was carried out by the Todos Pela Educação movement in partnership with the Instituto Nacional de Estudos e Pesquisas Educacionais Anísio Teixeira (Inep), the Fundação Cesgranrio, and the Instituto Paulo Montenegro/Ibope. The test was administered earlier this year to pupils from 250 schools, in proportion to each network (private, state, and municipal).
In mathematics, the national average of third-year (2nd grade) pupils who had learned what was expected was 42.8%, meaning that 57.2% do not know the minimum appropriate for this stage of learning. Private schools averaged 74.3% and public schools only 32.6%, a difference of 41.7 percentage points.
“We have to take into account that the teachers of these early grades are trained in Pedagogy, a degree that attracts people from lower-income classes who did not receive a good grounding in mathematics. It is a vicious cycle that needs to be broken,” said professor Paulo Horta of Inep.
Reading test: 43.9% did not learn enough – The national average on the reading test was 56.1%. That means the remainder, 43.9% of pupils, did not learn enough. The share who learned what was expected reached 79% in private schools, but only 48.6% in public ones.
In writing, the national share of those who learned what was expected fell to 53.4%; in other words, 46.6% did not reach adequate learning. In private schools the figure was 82.4%; in public schools, 43.9%. “Even with better results, the private schools, which are this new middle class’s goal for their children, did not reach 100%. And 100% means only what children are expected to have learned. Overall, the Prova ABC showed that children attending the first three years of school are not being guaranteed their basic right to learning,” observed Priscila Cruz, executive director of Todos Pela Educação.
As in other Brazilian education indicators, pupils in the South and Southeast outperformed children in the North and Northeast on the Prova ABC, which follows the model of the Sistema de Avaliação da Educação Básica (Saeb). The national average score in mathematics was 171.07 (the target was 175), but it reached 185.64 in the South and 179.06 in the Southeast. It was far lower in the North (152.62) and Northeast (158.19).
Regionally, the reading results, on the Saeb scale with its 175-point benchmark, were no different. In the South the average was 197.93, while in the Northeast it was 167.37, a 30-point difference. In the Center-West the score was 196.57, in the Southeast 193.57, and in the North 172.78.
On the writing test, where the average score considered successful is 75 on a scale of 0 to 100, the Southeast averaged 77.2, a 27-point difference from the Northeast’s 50.2.
Private-school average was 211; public schools, 158 – The Prova ABC methodology uses the same Saeb scale that feeds into the Índice de Desenvolvimento da Educação Básica (Ideb), the country’s main indicator of educational quality. On this test, as in the Saeb, pupils needed a score of 175 points for their learning to be considered sufficient for the third year (or 2nd grade). The writing test, however, departs from the Saeb standard: there, the average score considered good performance was 75.
By score, the national mathematics average was 171.1 (211.2 in private schools and 158 in public ones). On the reading test, the national average was 185.8 (216.7 for private schools and 175.8 for public ones). On its different scale, the writing average was 68.1 (86.2 in the private network and 62.3 in the public one).
In mathematics, to reach 175 points children had to demonstrate mastery of addition and subtraction by solving problems involving, for example, bills and coins. On the reading test, pupils had to identify the themes of a narrative, identify characteristics of characters in texts such as legends, fables, and comic strips, and perceive cause-and-effect relations in the narratives. In writing, three competencies were required: suitability to theme and genre; cohesion and coherence; and conventions (spelling, grammar rules, punctuation, and word segmentation).
Climate Cycles Are Driving Wars: When El Nino Warmth Hits, Tropical Conflicts Double (Science Daily)
ScienceDaily (Aug. 24, 2011) — In the first study of its kind, researchers have linked a natural global climate cycle to periodic increases in warfare. The arrival of El Niño, which every three to seven years boosts temperatures and cuts rainfall, doubles the risk of civil wars across 90 affected tropical countries, and may help account for a fifth of worldwide conflicts during the past half-century, say the authors.
El Nino drought cycles heavily affecting some 90 countries (red) appear to be helping drive modern civil wars. (Credit: Courtesy Hsiang et al./Nature)
The paper, written by an interdisciplinary team at Columbia University’s Earth Institute, appears in the current issue of the leading scientific journal Nature.
In recent years, historians and climatologists have built evidence that past societies suffered and fell in connection with heat or droughts that damaged agriculture and shook governments. This is the first study to make the case for such destabilization in the present day, using statistics to link global weather observations and well-documented outbreaks of violence. The study does not blame specific wars on El Niño, nor does it directly address the issue of long-term climate change. However, it raises potent questions, as many scientists think natural weather cycles will become more extreme with a warming climate, and some suggest the ongoing chaos in places like Somalia is already being stoked by it.
“The most important thing is that this looks at modern times, and it’s done on a global scale,” said Solomon M. Hsiang, the study’s lead author, a graduate of the Earth Institute’s Ph.D. in sustainable development. “We can speculate that a long-ago Egyptian dynasty was overthrown during a drought. That’s a specific time and place, that may be very different from today, so people might say, ‘OK, we’re immune to that now.’ This study shows a systematic pattern of global climate affecting conflict, and shows it right now.”
The cycle known as the El Niño-Southern Oscillation, or ENSO, is a periodic warming and cooling of the tropical Pacific Ocean. This affects weather patterns across much of Africa, the Mideast, India, southeast Asia, Australia, and the Americas, where half the world’s people live. During the cool, or La Niña, phase, rain may be relatively plentiful in tropical areas; during the warmer El Niño, land temperatures rise, and rainfall declines in most affected places. Interacting with other factors including wind and temperature cycles over the other oceans, El Niño can vary dramatically in power and length. At its most intense, it brings scorching heat and multi-year droughts. (In higher latitudes, effects weaken, disappear or reverse; La Niña conditions earlier this year helped dry the U.S. Southwest and parts of east Africa.)
The scientists tracked ENSO from 1950 to 2004 and correlated it with onsets of civil conflicts that killed more than 25 people in a given year. The data included 175 countries and 234 conflicts, over half of which each caused more than 1,000 battle-related deaths. For nations whose weather is controlled by ENSO, they found that during La Niña, the chance of civil war breaking out was about 3 percent; during El Niño, the chance doubled, to 6 percent. Countries not affected by the cycle remained at 2 percent no matter what. Overall, the team calculated that El Niño may have played a role in 21 percent of civil wars worldwide — and nearly 30 percent in those countries affected by El Niño.
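The arithmetic behind the “doubling of risk” can be sketched with a simple rate comparison. The study itself relies on regression across 175 countries, not this contingency-table shortcut, and the country-year counts below are hypothetical, chosen only to reproduce the reported 3% and 6% onset rates:

```python
# Illustrative sketch of the risk comparison reported in the study.
# All counts are hypothetical; the study uses regression, not this
# simple two-cell arithmetic.

def onset_rate(onsets, country_years):
    """Annual probability of civil-conflict onset."""
    return onsets / country_years

# Hypothetical counts for ENSO-affected ("teleconnected") countries
la_nina_rate = onset_rate(30, 1000)  # ~3% in cooler La Nina years
el_nino_rate = onset_rate(60, 1000)  # ~6% in warmer El Nino years

relative_risk = el_nino_rate / la_nina_rate
print(relative_risk)  # 2.0 -> "the chance doubled"

# One simple reading of an "attributable fraction": the share of
# El Nino-year onsets accounted for by the elevated rate.
attributable = (el_nino_rate - la_nina_rate) / el_nino_rate
print(round(attributable, 2))  # 0.5
```

The 21% figure quoted for all civil wars worldwide comes from the authors’ full model, which also weights countries by how strongly ENSO controls their weather; the sketch above only shows the shape of the calculation.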
Coauthor Mark Cane, a climate scientist at Columbia’s Lamont-Doherty Earth Observatory, said that the study does not show that weather alone starts wars. “No one should take this to say that climate is our fate. Rather, this is compelling evidence that it has a measurable influence on how much people fight overall,” he said. “It is not the only factor; you have to consider politics, economics, all kinds of other things.” Cane, a climate modeler, was among the first to elucidate the mechanisms of El Niño, showing in the 1980s that its larger swings can be predicted — knowledge now used by organizations around the world to plan agriculture and relief services.
The authors say they do not know exactly why climate feeds conflict. “But if you have social inequality, people are poor, and there are underlying tensions, it seems possible that climate can deliver the knockout punch,” said Hsiang. When crops fail, people may take up a gun simply to make a living, he said. Kyle C. Meng, a sustainable-development Ph.D. candidate and the study’s other author, pointed out that social scientists have shown that individuals often become more aggressive when temperatures rise, but he said that whether that applies to whole societies is only speculative.
Bad weather does appear to tip poorer countries into chaos more easily; rich Australia, for instance, is controlled by ENSO, but has never seen a civil war. On the other side, Hsiang said at least two countries “jump out of the data.” In 1982, a powerful El Niño struck impoverished highland Peru, destroying crops; that year, simmering guerrilla attacks by the revolutionary Shining Path movement turned into a full-scale 20-year civil war that still sputters today. Separately, forces in southern Sudan were already facing off with the domineering north, when intense warfare broke out in the El Niño year of 1963. The insurrection abated, but flared again in 1976, another El Niño year. Then, 1983 saw a major El Niño–and the cataclysmic outbreak of more than 20 years of fighting that killed 2 million people, arguably the world’s bloodiest conflict since World War II. It culminated only this summer, when South Sudan became a separate nation; fighting continues in border areas. Hsiang said some other countries where festering conflicts have tended to blow up during El Niños include El Salvador, the Philippines and Uganda (1972); Angola, Haiti and Myanmar (1991); and Congo, Eritrea, Indonesia and Rwanda (1997).
The idea that environment fuels violence has gained currency in the past decade, with popular books by authors like Jared Diamond, Brian Fagan and Mike Davis. Academic studies have drawn links between droughts and social collapses, including the end of the Persian Gulf’s Akkadian empire (the world’s first superpower), 6,000 years ago; the AD 800-900 fall of Mexico’s Maya civilization; centuries-long cycles of warfare within Chinese dynasties; and recent insurgencies in sub-Saharan Africa. Last year, tree-ring specialists at Lamont-Doherty Earth Observatory published a 1,000-year atlas of El Niño-related droughts; data from this pinpoints droughts coinciding with the downfall of the Angkor civilization of Cambodia around AD 1400, and the later dissolution of kingdoms in China, Vietnam, Myanmar and Thailand.
Some scientists and historians remain unconvinced of connections between climate and violence. “The study fails to improve on our understanding of the causes of armed conflicts, as it makes no attempt to explain the reported association between ENSO cycles and conflict risk,” said Halvard Buhaug, a political scientist with the Peace Research Institute Oslo in Norway who studies the issue. “Correlation without explanation can only lead to speculation.” Another expert, economist Marshall Burke of the University of California, Berkeley, said the authors gave “very convincing evidence” of a connection. But, he said, the question of how overall climate change might play out remains. “People may respond differently to short-run shocks than they do to longer-run changes in average temperature and precipitation,” he said. He called the study “a useful and illuminating basis for future work.”
The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by The Earth Institute at Columbia University.
Journal Reference:
Solomon M. Hsiang, Kyle C. Meng, Mark A. Cane. Civil conflicts are associated with the global climate. Nature, 2011; 476 (7361): 438 DOI: 10.1038/nature10311
Anti-reflexivity for beginners
As Prosperity Rises in Brazil’s Northeast, So Does Drug Violence (N.Y. Times)
A house in Nova Constituinte, in Salvador, is protected by a makeshift fence. The arrival of crack cocaine has been particularly devastating there, and the number of murders in Bahia increased 430 percent between 1999 and 2008. Lalo de Almeida for The New York Times.
By ALEXEI BARRIONUEVO
Published: August 29, 2011
SALVADOR, Brazil — Jenilson Dos Santos Conceição, 20, lay face down on the rough concrete, his body twisted, sandals still on his feet, as the blood from his 14 bullet wounds stained the sloped alleyway.
A small crowd of residents watched dispassionately as a dozen police officers hovered around the young man’s lifeless body. “He was followed until he was executed right here,” said Bruno Ferreira de Oliveira, a senior investigator. “They wanted to make sure he was dead.”
Mr. Conceição was the third person found murdered in the state of Bahia on that July day. By day’s end, six would die violently, and by month’s end 354 had been killed, the police said.
The geography of violence in Brazil has been turned on its head over the past few years. In the southeast, home to Rio de Janeiro, São Paulo and many of the country’s most enduring stereotypes of shootouts and kidnappings, the murder rate actually dropped by 47 percent between 1999 and 2009, according to a study by José Maria Nóbrega, a political science professor at the Federal University of Campina Grande.
But here in the northeast, a poor region that benefited most from the wealth-transfer programs that former President Luiz Inácio Lula da Silva championed during his eight years in office, the murder rate nearly doubled in the same 10-year period, turning this area into the nation’s most violent, Dr. Nóbrega found.
Salvador, the region’s largest city, is one of Brazil’s biggest tourist draws, the gateway to some of the country’s most spectacular beaches. And like Rio, it is preparing to co-host the 2014 World Cup. So the authorities here are taking a page from Rio’s playbook, trying to grapple with the surge in violent crime by establishing permanent police units in violent areas frequented by drug traffickers.
The community police forces being installed here are similar to the “police pacification units” the Rio government has been using — to both great fanfare and controversy — since 2008 to stem drug violence there.
The northeast has long been plagued by crime, but the increase illustrates how Brazil’s economic boom is causing drug-related violence — the main cause for the homicide scourge — to migrate to other parts of the country as traffickers seek new markets, straining local police forces, according to both Dr. Nóbrega and local officials.
The same economic wave that put more money in millions of poor Brazilians’ pockets, especially here in the north, has also stimulated more drug trafficking and the deadly crime associated with it, officials here contended. Drug traffickers, realizing the potential of a stronger market, have focused more heavily on the northeast, resulting in drug wars and addiction-fueled violence, they said.
“If the consumer market is booming, the drug trafficker will come here as well,” said Jaques Wagner, the governor of Bahia. “The social progress in Brazil is visible. But at the same time we still have trouble with drug trafficking and with a lack of respect for human life.”
In the states of Bahia and Alagoas, especially, there has been an explosion of violence in the past decade. The number of murders in Bahia grew by 430 percent, to 4,709, between 1999 and 2008, Dr. Nóbrega said, and last year the state’s murder rate of 34.2 per 100,000 residents was higher than Rio’s, which fell to 29.8. (Bahia officials said that after leveling off in 2010, homicides were down 13 percent through July 2011 compared with the first seven months of 2010.)
Travel agencies say they are concerned about the rise in violent crime in Bahia’s slums — as well as the drug-fueled petty assaults in Pelourinho, Salvador’s colorful historic center.
“Salvador, right now, is not ready for the World Cup by any stretch, and they are starting to realize that,” said Paul Irvine, the director of Dehouche, a travel agency in Rio de Janeiro that organizes trips to both cities.
Governor Wagner shrugged off such assertions, noting that Bahia holds a Carnaval celebration every year where more than one million people take to the streets, with 22,000 police officers providing security.
“We have gone four years without a homicide on the parade route,” he said. “For me, police readiness for the World Cup won’t be any problem at all.”
Rio’s violent slums have been characterized by battles between the police and heavily armed drug gangs that have controlled large areas. But in the northeast, security officials contend, people have historically settled disputes on their own — neighbor to neighbor, with deadly impunity.
“The northeast is used to seeking justice with its own hands,” said Mauricio Teles Barbosa, the secretary of security in Bahia. “They do not believe in the police because they were the police. They were the colonels, the outlaws that sought justice without the participation of the state.”
Mr. Wagner argued that these attitudes toward violence, along with an indifference shown by the state in providing police protection and social services, allowed murders to go largely unchecked. But more rampant drug trafficking, fueled in part by criminal gangs operating out of São Paulo, has greatly worsened the situation, Mr. Barbosa said.
The arrival of crack cocaine has been particularly devastating. In Nova Constituinte, a community on the outskirts of Salvador that sprouted on a former banana plantation, a series of drug-related killings has stalked the area for the past five years, including the massacre of six teenagers caught in the crossfire of rival gangs, said Arnaldo Anselmo, 42, a community leader.
Gildasio Oliveira Silva said that drug traffickers twice tried to kill his teenage son, who had fallen prey to crack and owed his dealers money. Last December, he said, they gunned down his wife, Ana Maria Passos ou Assis, 39, as she was cleaning the bathroom of Mr. Silva’s small convenience store along Nova Constituinte’s main avenue.
“The violence has gotten worse here,” said Mr. Silva, 68, a former police officer. “And it’s all related to drugs.”
After becoming governor in 2007, Mr. Wagner vowed to build up the police and try to stem the surging violence. He has added 7,000 new police officers in the past four years and authorized 3,500 more this year.
Bahia inaugurated its first community police unit in Calabar, a poor enclave surrounded by more expensive high-rises. Since the unit opened in April with 120 officers, no homicides have been reported, said Capt. Maria de Oliveira Silva, who heads it.
“In the last three years, you didn’t go a month without someone getting killed here,” said Lindalva Reis, 58, who has lived in Calabar for 38 years.
Three more community police units are scheduled to open over the next year near Nova Constituinte.
Like the units in Rio, the officers being selected are mostly rookies, to try to cut down on corruption and the more aggressive habits of some older officers.
Unlike in Rio, the installation of the new units here has not required first clearing out entrenched drug gangs with bloody police and military operations that can last weeks.
To counter criticism that its police have struggled to solve crimes, the Bahia State government established a dedicated homicide department earlier this year, with 150 officers focused on murder investigations.
Among the challenges of the new unit is rooting out “extermination groups,” militias composed of police officers who have practiced vigilante justice and been suspected in dozens of murders, said Arthur Gallas, the homicide unit’s director.
Then there is the mountain of unresolved cases. In the new department’s offices, investigators recently pored over stacks of files containing 1,500 unsolved homicides dating from before 2007.
But the new push is still a work in progress.
At the crime scene of Mr. Conceição, the police did not set up security tape to prevent evidence contamination. “Preserving evidence is very difficult here,” said Helder Cunha, a crime scene investigator, noting that a proposal to require crime scene tape in Bahia had yet to be put into practice.
Myrna Domit contributed reporting from São Paulo.
Profits Before Environment (N.Y. Times)
August 30, 2011, 10:27 PM
By MARK BITTMAN
I wasn’t surprised when the administration of George W. Bush sacrificed the environment for corporate profits. But when the same thing happens under a Democratic administration, it’s depressing. With little or no public input, policies that benefit corporations regardless of the consequences continue to be enacted.
No wonder an April 2010 poll from the Pew Research Center found that only about 20 percent of Americans have faith in the government (it’s one thing upon which the left and right and maybe even the center agree). But maybe this is nothing new: as Glenda Farrell, as Genevieve “Gen” Larkin, put it in “Gold Diggers of 1937,” “It’s so hard to be good under the capitalistic system.”
But is anyone in power even trying? Last winter, the Department of Agriculture deregulated Monsanto’s genetically modified alfalfa, despite concerns about cross-pollination of non-genetically modified crops. It then defied a court order banning the planting of genetically modified sugar beets pending completion of an environmental impact study.
Monsanto engineers these plants and makes Roundup, the herbicide they resist. But Roundup-ready crops don’t increase long-term yields, a host of farmers are now dealing with “superweeds” and there is worry about superbugs, nearly all courtesy of Monsanto. In fact, this system doesn’t contribute to much of anything except Monsanto’s bottom line. Yet Agriculture Secretary Tom Vilsack gave Monsanto the nod, perhaps yielding to pressure from the White House.
The United States exerts that same kind of pressure abroad. WikiLeaks cables show that U.S. “biotechnology outreach programs” have promoted genetically modified crops in Africa, Asia and South America; they’ve also revealed that diplomats schemed to retaliate against any European Union countries that oppose those crops.
Sacrificing the environment for profits didn’t stop with Bush, and it doesn’t stop with genetically modified organisms. Take, for example, the Keystone XL pipeline extension. XL is right: the 36-inch-wide pipeline, which will stretch from the Alberta tar sands across the Great Plains to the Gulf Coast, will cost $7 billion and run for 1,711 miles — more than twice as long as the Alaska pipeline. It will cross nearly 2,000 rivers, the huge wetlands ecosystem called the Nebraska Sandhills and the Ogallala aquifer, the country’s biggest underground freshwater supply.
If Keystone is built, we’ll see rising greenhouse gas emissions right away (tar sands production creates three times as many greenhouse gases as does conventional oil), and our increased dependence on fossil fuels will further the likelihood of climate-change disaster. Then there is the disastrous potential of leaks of the non-Wiki-variety. (It’s happened before.)
Proponents say the pipeline will ease gas prices and oil “insecurity.” But domestic drilling has raised, not lowered, oil prices, and as for the insecurity — what we need is to develop wiser ways to use the oil we have.
They say, too, that the pipeline could create 100,000 new jobs. But even the Amalgamated Transit Union and the Transport Workers Union oppose the pipeline, saying, “We need jobs, but not ones based on increasing our reliance on Tar Sands oil.”
Sounds as if union officials have been reading the writer and activist Bill McKibben, who calls the pipeline “a fuse to the biggest carbon bomb on the continent,” and NASA scientist Jim Hansen, who says the oil Keystone will deliver “is essentially game over” for the planet.
Game over? No problem, says the State Department, which concluded that the project will have no significant impact on “most resources along the proposed pipeline corridor.” The Sierra Club quickly responded by calling the report “an insult to anyone who expects government to work for the interests of the American people.”
I do expect that, and I am insulted. President Obama can deny Keystone the permit. A truly environmentally friendly president (like the one candidate Obama appeared to be) would be looking for creative ways to leave fossil fuels underground, not extract them. Perhaps he doesn’t “believe in” global warming at this point, like many Republicans?
When government defends corporate interests, citizens must fight. McKibben has helped organize protests at the White House against Keystone, and he’s one of hundreds who’ve been arrested in the last couple of weeks. These people are showing that the role of government as corporate ally must be challenged.
As it will be in the fight against carte blanche for genetically modified organisms: From Oct. 1 to Oct. 16, there will be a march from New York City to Washington to demand that genetically modified foods be labeled, something a majority of Americans want. This small, perfectly reasonable request has run into joint opposition from the biotech industry and (here we go again) the Food and Drug Administration.
Why are most of us filled with mistrust of the government? Maybe because we, like Gen Larkin, know it’s so hard to be good under the capitalistic system.
Life of Hurricane Irene from the Caribbean to Canada
THE MYTH OF THE VIOLENT FAN (Fazendo Media/Le Monde Diplomatique Brasil)
By Irlan Simões, 02.08.2011
Fazendo Media
In May 2010, after intense discussions among public authorities, the Military Police and club presidents, the state of Sergipe became a pioneer in a process now advancing across Brazilian football: the criminalization of organized fan groups (torcidas organizadas, or T.O.s). Klecia Renata de Oliveira Batista, a master’s student in the Social Psychology graduate program at the Federal University of Sergipe, set out to study this phenomenon.
For two years, the Sergipe researcher followed the inner workings of the Trovão Azul fan group, which supports Confiança, with an interest in studying violence in that milieu. Entitled “Entre torcer e ser banido, vamos nos (re)organizar: um estudo psicanalítico da torcida Trovão Azul” (“Between cheering and being banned, let’s (re)organize: a psychoanalytic study of the Trovão Azul fan group”), her thesis became an unprecedented document on the criminalization of organized fan groups as seen from Sergipe. “It was a fundamental process for my work, precisely when I was trying to map the pressure the fan group was facing at the time,” says Klecia.
Defended on May 27, Klecia’s work argues that the “modernization” of Brazilian football in fact aims to adapt the game to the interests of the market, and that it is being imposed even at the cost of the cultural values embedded in the sport. “What we see today is the organized fan group conforming to what some historians call torcidas-empresa [company fan clubs], surrendering to a logic organized by capital,” the researcher says.
The state as protagonist
To explain the phenomenon, the researcher drew on the psychoanalytic framework of Sigmund Freud. She suggests that the punitive measures taken against organized fan groups are justified by the drive to conform the stadiums and the game itself to what is understood as the “ideal of order, cleanliness and beauty of Modernity.”
According to the researcher, these collectives used to play a role of resistance to this process. “Today there is no longer any room to survive in football outside this standard of ‘modernity.’ Of that reality, the only thing left was the fan groups, and now they too are under threat,” she says. In her view, violence in football is not restricted to organized fan groups. Violence is in fact inherent to human life in society, and within football the fan groups constitute a micro-space in which that violence becomes present.
“The new Fan Statute (Estatuto do Torcedor) is the flagship of this modernization process,” says Klecia Renata, questioning the role played by the program implemented by the Ministry of Sports. In her view, the law enacted in 2010 is responsible for threats of banishment, bans on stadium entry, the sale of standardized merchandise and the criminalization of organized fans. She adds that the reorganization of the T.O.s has produced an elitization of their membership, since the common-sense view that the poorer fan is the cause of the violence has prevailed.
National panorama
Besides the supervision of professor Eduardo Leal Cunha and the presence of Daniel Menezes Coelho, both from UFS, the thesis defense had as a guest the historian Bernardo Borges Buarque de Hollanda, who holds a doctorate in Social History from the Pontifícia Universidade Católica do Rio de Janeiro and is a researcher at the Centro de Pesquisa e Documentação de História Contemporânea of the Fundação Getúlio Vargas (CPDOC-FGV). A scholar of the subject for more than ten years, Bernardo reinforced, in his comments as a member of the examining committee, the link between the “ideal of order and cleanliness of Modernity” and the elitization of football’s fan base, drawing parallels with processes that have occurred in other countries, such as England.
The researcher, who has also studied the history of organized fan groups in Brazil, notes that criminalizing uniformed fans is part of the same project that seeks to exclude poorer fans from the stadiums. “This is a way of elitizing the spectator, and that will be the trend. The ‘television viewer’ will be the place reserved for the popular classes,” he says. Bernardo supports his hypothesis by showing how stadiums, after successive renovations, have reduced their capacity and raised ticket prices, aiming to reach only an upper-middle-class consumer public.
Another aspect the scholar highlighted is the mobilization of organized fan groups trying to slow this process. In Rio de Janeiro, the Federation of Organized Fan Groups (Ftorj) was founded, while at the national level the Confederation of Organized Fan Groups (Conatorg) is taking its first steps. “Representation of organized fan groups is always very difficult, because there are many conflicts within and among them. But it is already a sign of progress, a possibility of claiming rights. Not only duties, as the club directors would have it,” he says.
Asked how Brazilian public opinion has supported this modernization process, Bernardo is emphatic: “The transmission of these messages is very unequal.” For him, it is very hard to explain how this process will end up excluding the very fans who approve of such measures.
The advance of the criminalization process
On June 13, 2011, the Public Prosecutor’s Office of the State of Rio de Janeiro summoned the organized fan groups to a public hearing. Present were representatives of 36 fan groups, the Ministry of Sports, the Military Police, the State Secretariat of Sports and Leisure, the Superintendence of Sports of the State of Rio de Janeiro (Suderj) and the Federation of Organized Fan Groups of Rio de Janeiro (Ftorj).
All those invited had to sign a Conduct Adjustment Agreement (Termo de Ajustamento de Conduta, or TAC) putting the Fan Statute into effect. The requirements include a ban on various items, such as flags, banners and materials that could potentially injure people in the stadium, and penalties for an organized fan group if any of its members breaks certain rules.
At the end of the hearing, Flávio Martins, president of Ftorj, lamented that only the organized fan groups were being blamed for the emptying of the stadiums. “Much is said about the violence promoted by the fan groups, but no one ever questions the condition of the public transportation made available, or ticket prices, or match times,” he said.
(*) Article originally published in Outras Palavras, of Le Monde Diplomatique Brasil.
2010 drought in the Amazon was the most severe since 1902 (Fapesp)
The finding was made by Inpe researchers based on the analysis of a historical series of rainfall data for the Amazon basin region (photo: Fapeam)
30/08/2011
Agência FAPESP – Scientists at the National Institute for Space Research (Inpe) concluded, in a study published in the journal Geophysical Research Letters, that the 2010 drought in the Amazon was the most severe recorded since 1902, surpassing that of 2005, which until then had been considered the worst of the century.
The finding was based on the analysis of a historical series of rainfall data for the Amazon basin region, with measurements going back to 1902.
The study’s results indicate that the process began at the start of the summer, during El Niño (a natural warming of Pacific waters), but was intensified by the warming of the tropical waters of the North Atlantic. As a result, a dry season set in that lasted many months, causing alterations in the hydrological cycle.
As a consequence, water levels fell and watercourses and river tributaries in the Amazon basin dried up completely. The southern region was the most affected. The phenomenon caused serious socio-environmental problems, especially for riverside populations, who were left isolated because they depend on the rivers for transportation.
In another article, recently published in the journal Theoretical and Applied Climatology, Inpe researchers presented the results of a broad study of the floods that struck the Amazon and Northeast Brazil between May and July 2009. The event caused deaths and left thousands of families homeless. The study shows that those torrential rains were the most intense and long-lasting ever recorded.
The Negro River, the main tributary of the Amazon River, reached its highest level in 107 years. The authors concluded that the event resulted from a combination of meteorological factors, especially the above-normal warming of the surface waters of the South Atlantic, an important factor in explaining the abundant rains over vast regions of the eastern Amazon and the country’s Northeast.
The researchers also stressed that these extreme episodes, like the prolonged drought of 2010 in the Amazon basin, reinforce the hypothesis that anomalies in rainfall and temperature patterns will become more frequent under future climate change scenarios.
Among the authors of the studies is José Antônio Marengo Orsini, head of Inpe’s Earth System Science Center.
The article “The drought of 2010 in the context of historical droughts in the Amazon region” (doi:10.1029/2011GL047436), by Orsini and others, can be read at www.inpe.br/noticias/arquivos/pdf/2011GL047436.pdf.
The anti-science party (JC, O Globo)
JC e-mail 4333, August 30, 2011.
Article by Paul Krugman published in today’s (30th) O Globo.
Jon Huntsman Jr., former governor of Utah and ambassador to China, is not a strong contender for the Republican Party’s presidential nomination. And that is too bad, because what Huntsman wants is to say the unsayable about his party – namely, that it is becoming the “anti-science party.” This is enormously important. And it should terrify us.
To understand what Huntsman is talking about, consider recent statements by the two strongest contenders for the Republican nomination: Rick Perry and Mitt Romney.
Perry, the governor of Texas, recently made headlines by dismissing human evolution as “just a theory,” one with “some gaps” – a remark that would come as news to the vast majority of biologists. But what drew the most attention was what he said about climate change: “I think there is a substantial number of scientists who have manipulated data to get dollars for their projects. And I think we are seeing, almost every week, or every day, scientists questioning the original idea that man-made global warming is the cause of climate change.” It is an extraordinary statement – or perhaps the right adjective is “vile.”
The second part of Perry’s statement is false: the scientific consensus on human responsibility for global warming – which includes 97 to 98 percent of researchers in the field, according to the National Academy of Sciences – is growing stronger as the evidence of climate change mounts.
Indeed, if you follow climate science, you know that the main development in recent years has been growing concern that projections of the future climate are underestimating the likely rise in temperature. Warnings that by the end of the century we may face climate change capable of threatening civilization, once considered fringe, now come from the leading research groups.
But don’t worry, Perry suggests; the scientists are just in it for the money, “manipulating data” to create a false threat. In his book “Fed Up,” he dismisses climate science as “a contrived, phony mess that is falling apart.”
I could point out that Perry is relying on a truly crazy conspiracy theory, one that claims thousands of scientists around the world are on the take, with not one of them willing to break the code of silence. I could point out that multiple investigations into charges of intellectual dishonesty on the part of climate scientists have ended with the researchers cleared of all charges. But don’t bother: Perry and those who think like him know what they want to believe, and their response to anyone who contradicts them is to launch a witch hunt.
So how has Romney, the other strong contender for the Republican nomination, responded to Perry’s challenge? By running away from it. In the past, Romney, the former governor of Massachusetts, strongly endorsed the notion that man-made climate change is a genuine concern. But last week he softened that, saying he thinks the world really is getting warmer, but “I don’t know that” and “I don’t know if it’s mostly caused by humans.” What moral courage!
Of course, we know what is driving Romney’s sudden lack of conviction. According to Public Policy Polling, only 21 percent of Republican voters in Iowa believe in global warming (and only 35 percent believe in evolution). Within the Republican Party, willful ignorance has become a litmus test for candidates, one that Romney is determined to pass at all costs.
So it is now highly likely that the presidential nominee of one of our two major political parties will be either a man who believes what he wants to believe, or a man who pretends to believe whatever he thinks the party base wants him to believe.
And the increasingly anti-intellectual character of the right, both inside the Republican Party and beyond it, extends well past the issue of climate change.
Lately, for example, the editorial page of The Wall Street Journal has moved from its old preference for the economic ideas of “charlatans and cranks” – in the famous phrase of one of former President George W. Bush’s chief economic advisers – to a general dismissal of hard thinking about economic matters. Pay no attention to “fancy theories” that conflict with “common sense,” the Journal tells us. Why should anyone imagine that you need more than a gut to analyze things like financial crises and recessions?
Now, nobody knows who will win next year’s presidential election. But the odds are that, sooner or later, the world’s greatest nation will be led by a party that is aggressively anti-science, indeed anti-knowledge. And in an era of immense challenges – environmental, economic and others – that is a terrible prospect.
Paul Krugman is a columnist for The New York Times.
David Graeber on the History of Debt (PBS, Naked Capitalism)
FRIDAY, AUGUST 26, 2011 (nakedcapitalism.com)
What is Debt? – An Interview with Economic Anthropologist David Graeber
David Graeber currently holds the position of Reader in Social Anthropology at Goldsmiths University London. Prior to this he was an associate professor of anthropology at Yale University. He is the author of ‘Debt: The First 5,000 Years’ which is available from Amazon.
Interview conducted by Philip Pilkington, a journalist and writer based in Dublin, Ireland.
Philip Pilkington: Let’s begin. Most economists claim that money was invented to replace the barter system. But you’ve found something quite different, am I correct?
David Graeber: Yes there’s a standard story we’re all taught, a ‘once upon a time’ — it’s a fairy tale.
It really deserves no other introduction: according to this theory all transactions were by barter. “Tell you what, I’ll give you twenty chickens for that cow.” Or three arrow-heads for that beaver pelt or what-have-you. This created inconveniences, because maybe your neighbor doesn’t need chickens right now, so you have to invent money.
The story goes back at least to Adam Smith and in its own way it’s the founding myth of economics. Now, I’m an anthropologist and we anthropologists have long known this is a myth simply because if there were places where everyday transactions took the form of: “I’ll give you twenty chickens for that cow,” we’d have found one or two by now. After all people have been looking since 1776, when the Wealth of Nations first came out. But if you think about it for just a second, it’s hardly surprising that we haven’t found anything.
Think about what they’re saying here – basically: that a bunch of Neolithic farmers in a village somewhere, or Native Americans or whatever, will be engaging in transactions only through the spot trade. So, if your neighbor doesn’t have what you want right now, no big deal. Obviously what would really happen, and this is what anthropologists observe when neighbors do engage in something like exchange with each other, if you want your neighbor’s cow, you’d say, “wow, nice cow” and he’d say “you like it? Take it!” – and now you owe him one. Quite often people don’t even engage in exchange at all – if they were real Iroquois or other Native Americans, for example, all such things would probably be allocated by women’s councils.
So the real question is not how does barter generate some sort of medium of exchange, that then becomes money, but rather, how does that broad sense of ‘I owe you one’ turn into a precise system of measurement – that is: money as a unit of account?
By the time the curtain goes up on the historical record in ancient Mesopotamia, around 3200 BC, it’s already happened. There’s an elaborate system of money of account and complex credit systems. (Money as a medium of exchange, or as standardized circulating units of gold, silver, bronze or whatever, only comes much later.)
So really, rather than the standard story – first there’s barter, then money, then finally credit comes out of that – if anything it’s precisely the other way around. Credit and debt come first, then coinage emerges thousands of years later, and then, when you do find “I’ll give you twenty chickens for that cow” type of barter systems, it’s usually where there used to be cash markets, but for some reason – as in Russia, for example, in 1998 – the currency collapses or disappears.
PP: You say that by the time historical records start to be written in the Mesopotamia around 3200 BC a complex financial architecture is already in place. At the same time is society divided into classes of debtors and creditors? If not then when does this occur? And do you see this as the most fundamental class division in human history?
DG: Well historically, there seem to have been two possibilities.
One is what you found in Egypt: a strong centralized state and administration extracting taxes from everyone else. For most of Egyptian history they never developed the habit of lending money at interest. Presumably, they didn’t have to.
Mesopotamia was different because the state emerged unevenly and incompletely. At first there were giant bureaucratic temples, then also palace complexes, but they weren’t exactly governments and they didn’t extract direct taxes – these were considered appropriate only for conquered populations. Rather they were huge industrial complexes with their own land, flocks and factories. This is where money begins as a unit of account; it’s used for allocating resources within these complexes.
Interest-bearing loans, in turn, probably originated in deals between the administrators and merchants who carried, say, the woollen goods produced in temple factories (which in the very earliest period were at least partly charitable enterprises, homes for orphans, refugees or disabled people for instance) and traded them to faraway lands for metal, timber, or lapis lazuli. The first markets form on the fringes of these complexes and appear to operate largely on credit, using the temples’ units of account. But this gave the merchants and temple administrators and other well-off types the opportunity to make consumer loans to farmers, and then, if say the harvest was bad, everybody would start falling into debt-traps.
This was the great social evil of antiquity – families would have to start pawning off their flocks, fields and before long, their wives and children would be taken off into debt peonage. Often people would start abandoning the cities entirely, joining semi-nomadic bands, threatening to come back in force and overturn the existing order entirely. Rulers would regularly conclude the only way to prevent complete social breakdown was to declare a clean slate or ‘washing of the tablets,’ they’d cancel all consumer debt and just start over. In fact, the first recorded word for ‘freedom’ in any human language is the Sumerian amargi, a word for debt-freedom, and by extension freedom more generally, which literally means ‘return to mother,’ since when they declared a clean slate, all the debt peons would get to go home.
PP: You have noted in the book that debt is a moral concept long before it becomes an economic concept. You’ve also noted that it is a very ambivalent moral concept insofar as it can be both positive and negative. Could you please talk about this a little? Which aspect is more prominent?
DG: Well it tends to pivot radically back and forth.
One could tell the history like this: eventually the Egyptian approach (taxes) and Mesopotamian approach (usury) fuse together, people have to borrow to pay their taxes and debt becomes institutionalized.
Taxes are also key to creating the first markets that operate on cash, since coinage seems to be invented or at least widely popularized to pay soldiers – more or less simultaneously in China, India, and the Mediterranean, where governments find the easiest way to provision the troops is to issue them standard-issue bits of gold or silver and then demand everyone else in the kingdom give them one of those coins back again. Thus we find that the language of debt and the language of morality start to merge.
In Sanskrit, Hebrew, and Aramaic, ‘debt,’ ‘guilt,’ and ‘sin’ are actually the same word. Much of the language of the great religious movements – reckoning, redemption, karmic accounting and the like – is drawn from the language of ancient finance. But that language is always found wanting and inadequate, and twisted around into something completely different. It’s as if the great prophets and religious teachers had no choice but to start with that kind of language because it’s the language that existed at the time, but they only adopted it so as to turn it into its opposite: as a way of saying debts are not sacred, but forgiveness of debt, or the ability to wipe out debt, or to realize that debts aren’t real – these are the acts that are truly sacred.
How did this happen? Well, remember I said that the big question in the origins of money is how a sense of obligation – an ‘I owe you one’ – turns into something that can be precisely quantified? Well, the answer seems to be: when there is a potential for violence. If you give someone a pig and they give you a few chickens back you might think they’re a cheapskate, and mock them, but you’re unlikely to come up with a mathematical formula for exactly how cheap you think they are. If someone pokes out your eye in a fight, or kills your brother, that’s when you start saying, “traditional compensation is exactly twenty-seven heifers of the finest quality and if they’re not of the finest quality, this means war!”
Money, in the sense of exact equivalents, seems to emerge from situations like that, but also from war and plunder, the disposal of loot, and slavery. In early Medieval Ireland, for example, slave-girls were the highest denomination of currency. And you could specify the exact value of everything in a typical house, even though very few of those items were available for sale anywhere, because they were used to pay fines or damages if someone broke them.
But once you understand that taxes and money largely begin with war it becomes easier to see what really happened. After all, every Mafioso understands this. If you want to take a relation of violent extortion, sheer power, and turn it into something moral, and most of all, make it seem like the victims are to blame, you turn it into a relation of debt. “You owe me, but I’ll cut you a break for now…” Most human beings in history have probably been told this by their creditors. And the crucial thing is: what possible reply can you make but, “wait a minute, who owes what to who here?” And of course for thousands of years, that’s what the victims have said, but the moment you do, you are using the rulers’ language, you’re admitting that debt and morality really are the same thing. That’s the situation the religious thinkers were stuck with, so they started with the language of debt, and then they tried to turn it around and make it into something else.
PP: You’d be forgiven for thinking this was all very Nietzschean. In his ‘On the Genealogy of Morals’ the German philosopher Friedrich Nietzsche famously argued that all morality was founded upon the extraction of debt under the threat of violence. The sense of obligation instilled in the debtor was, for Nietzsche, the origin of civilisation itself. You’ve been studying how morality and debt intertwine in great detail. How does Nietzsche’s argument look after over 100 years? And which do you see as primal: morality or debt?
DG: Well, to be honest, I’ve never been sure if Nietzsche was really serious in that passage or whether the whole argument is a way of annoying his bourgeois audience; a way of pointing out that if you start from existing bourgeois premises about human nature you logically end up in just the place that would make most of that audience most uncomfortable.
In fact, Nietzsche begins his argument from exactly the same place as Adam Smith: human beings are rational. But rational here means calculation, exchange and hence, trucking and bartering; buying and selling is then the first expression of human thought and is prior to any sort of social relations.
But then he reveals exactly why Adam Smith had to pretend that Neolithic villagers would be making transactions through the spot trade. Because if we have no prior moral relations with each other, and morality just emerges from exchange, then ongoing social relations between two people will only exist if the exchange is incomplete – if someone hasn’t paid up.
But in that case, one of the parties is a criminal, a deadbeat, and justice would have to begin with the vindictive punishment of such deadbeats. Thus he says of all those law codes where it says ‘twenty heifers for a gouged-out eye’ – really, originally, it was the other way around. If you owe someone twenty heifers and don’t pay, they gouge out your eye. Morality begins with Shylock’s pound of flesh.
Needless to say there’s zero evidence for any of this – Nietzsche just completely made it up. The question is whether even he believed it. Maybe I’m an optimist, but I prefer to think he didn’t.
Anyway it only makes sense if you assume those premises: that all human interaction is exchange, and therefore, all ongoing relations are debts. This flies in the face of everything we actually know or experience of human life. But once you start thinking that the market is the model for all human behavior, that’s where you end up.
If however you ditch the whole myth of barter, and start with a community where people do have prior moral relations, and then ask, how do those moral relations come to be framed as ‘debts’ – that is, as something precisely quantified, impersonal, and therefore, transferrable – well, that’s an entirely different question. In that case, yes, you do have to start with the role of violence.
PP: Interesting. Perhaps this is a good place to ask you about how you conceive your work on debt in relation to the great French anthropologist Marcel Mauss’ classic work on gift exchange.
DG: Oh, in my own way I think of myself as working very much in the Maussian tradition. Mauss was one of the first anthropologists to ask: well, all right, if not barter, then what? What do people who don’t use money actually do when things change hands? Anthropologists had documented an endless variety of such economic systems, but hadn’t really worked out common principles. What Mauss noticed was that in almost all of them, everyone pretended as if they were just giving one another gifts and then they fervently denied they expected anything back. But in actual fact everyone understood there were implicit rules and recipients would feel compelled to make some sort of return.
What fascinated Mauss was that this seemed to be universally true, even today. If I take a free-market economist out to dinner he’ll feel like he should return the favor and take me out to dinner later. He might even feel that he is something of a chump if he doesn’t – and this even though his theory tells him he just got something for nothing and should be happy about it. Why is that? What is this force that compels me to want to return a gift?
This is an important argument, and it shows there is always a certain morality underlying what we call economic life. But it strikes me that if you focus too much on just that one aspect of Mauss’ argument you end up reducing everything to exchange again, with the proviso that some people are pretending they aren’t doing that.
Mauss didn’t really think of everything in terms of exchange; this becomes clear if you read his other writings besides ‘The Gift’. Mauss insisted there were lots of different principles at play besides reciprocity in any society – including our own.
For example, take hierarchy. Gifts given to inferiors or superiors don’t have to be repaid at all. If another professor takes our economist out to dinner, sure, he’ll feel that he should reciprocate; but if an eager grad student does, he’ll probably figure just accepting the invitation is favor enough; and if George Soros buys him dinner, then great, he did get something for nothing after all. In explicitly unequal relations, if you give somebody something, far from doing you a favor back, they’re more likely to expect you to do it again.
Or take communistic relations – and I define these, following Mauss actually, as any relations where people interact on the basis of ‘from each according to their abilities, to each according to their needs’. In these relations people do not rely on reciprocity, for example, when trying to solve a problem, even inside a capitalist firm. (As I always say, if somebody working for Exxon says, “hand me the screwdriver,” the other guy doesn’t say, “yeah, and what do I get for it?”) Communism is in a way the basis of all social relations – in that if the need is great enough (I’m drowning) or the cost small enough (can I have a light?) everyone will be expected to act that way.
Anyway that’s one thing I got from Mauss. There are always going to be lots of different sorts of principles at play simultaneously in any social or economic system – which is why we can never really boil these things down to a science. Economics tries to, but it does it by ignoring everything except exchange.
PP: Let’s move on to economic theory then. Economics has some pretty specific theories about what money is. There’s the mainstream approach that we discussed briefly above; this is the commodity theory of money, in which specific commodities come to serve as a medium of exchange to replace crude barter economies. But there are also alternative theories that are becoming increasingly popular at the moment. One is the Circuitist theory of money, in which all money is seen as a debt incurred by some economic agent. The other – which actually integrates the Circuitist approach – is the Chartalist theory of money, in which all money is seen as a medium of exchange issued by the Sovereign and backed by the enforcement of tax claims. Maybe you could say something about these theories?
DG: One of my inspirations for ‘Debt: The First 5,000 Years’ was Keith Hart’s essay ‘Two Sides of the Coin’. In that essay Hart points out that not only do different schools of economics have different theories on the nature of money, but there is also reason to believe that both are right. Money has, for most of its history, been a strange hybrid entity that takes on aspects of both commodity (object) and credit (social relation). What I think I’ve managed to add to that is the historical realization that while money has always been both, it swings back and forth – there are periods where credit is primary, and everyone adopts more or less Chartalist theories of money, and others where cash tends to predominate and commodity theories of money instead come to the fore. We tend to forget that in, say, the Middle Ages, from France to China, Chartalism was just common sense: money was just a social convention; in practice, it was whatever the king was willing to accept in taxes.
PP: You say that history swings between periods of commodity money and periods of virtual money. Do you not think that we’ve reached a point in history where due to technological and cultural evolution we may have seen the end of commodity money forever?
DG: Well, the cycles are getting a bit tighter as time goes by. But I think we’ll still have to wait at least 400 years to really find out. It is possible that this era is coming to an end but what I’m more concerned with now is the period of transition.
The last time we saw a broad shift from commodity money to credit money it wasn’t a very pretty sight. To name a few examples, we had the fall of the Roman Empire, the Kali Age in India and the breakdown of the Han dynasty… There was a lot of death, catastrophe and mayhem. The final outcome was in many ways profoundly liberatory for the bulk of those who lived through it – chattel slavery, for example, was largely eliminated from the great civilizations. This was a remarkable historical achievement. The decline of cities actually meant most people worked far less. But still, one does rather hope the dislocation won’t be quite so epic in its scale this time around. Especially since the actual means of destruction are so much greater.
PP: Which do you see as playing a more important role in human history: money or debt?
DG: Well, it depends on your definitions. If you define money in the broadest sense, as any unit of account whereby you can say 10 of these are worth 7 of those, then you can’t have debt without money. Debt is just a promise that can be quantified by means of money (and therefore becomes impersonal, and therefore transferable). But if you are asking which has been the more important form of money, credit or coin, then probably I would have to say credit.
PP: Let’s move on to some of the real world problems facing the world today. We know that in many Western countries over the past few years households have been running up enormous debts, from credit card debts to mortgages (the latter of which were one of the root causes of the recent financial crisis). Some economists are saying that economic growth since the Clinton era was essentially run on an unsustainable inflating of household debt. From an historical perspective what do you make of this phenomenon?
DG: From an historical perspective, it’s pretty ominous. One could go further back than the Clinton era, actually – a case could be made that what we are seeing now is the same crisis we were facing in the 70s; it’s just that we managed to fend it off for 30 or 35 years through all these elaborate credit arrangements (and of course, the super-exploitation of the global South, through the ‘Third World Debt Crisis’).
As I said, Eurasian history, taken in its broadest contours, shifts back and forth between periods dominated by virtual credit money and those dominated by actual coin and bullion. The credit systems of the ancient Near East give way to the great slave-holding empires of the Classical world in Europe, India, and China, which used coinage to pay their troops. In the Middle Ages the empires go, and so does the coinage – the gold and silver is mostly locked up in temples and monasteries – and the world reverts to credit. Then after 1492 or so you have the return of world empires; and gold and silver currency, together with slavery, for that matter.
What’s been happening since Nixon went off the gold standard in 1971 has just been another turn of the wheel – though of course it never happens the same way twice. However, in one sense, I think we’ve been going about things backwards. In the past, periods dominated by virtual credit money have also been periods where there have been social protections for debtors. Once you recognize that money is just a social construct, a credit, an IOU, then first of all what is to stop people from generating it endlessly? And how do you prevent the poor from falling into debt traps and becoming effectively enslaved to the rich? That’s why you had Mesopotamian clean slates, Biblical Jubilees, Medieval laws against usury in both Christianity and Islam and so on and so forth.
Since antiquity the worst-case scenario that everyone felt would lead to total social breakdown was a major debt crisis; ordinary people would become so indebted to the top one or two percent of the population that they would start selling family members into slavery, or eventually, even themselves.
Well, what happened this time around? Instead of creating some sort of overarching institution to protect debtors, they create these grandiose, world-scale institutions like the IMF or S&P to protect creditors. They essentially declare (in defiance of all traditional economic logic) that no debtor should ever be allowed to default. Needless to say the result is catastrophic. We are experiencing something that to me, at least, looks exactly like what the ancients were most afraid of: a population of debtors skating at the edge of disaster.
And, I might add, if Aristotle were around today, I very much doubt he would think that the distinction between renting yourself or members of your family out to work and selling yourself or members of your family to work was more than a legal nicety. He’d probably conclude that most Americans were, for all intents and purposes, slaves.
PP: You mention that the IMF and S&P are institutions that are mainly geared toward extracting debts for creditors. This seems to have become the case in the European monetary union too. What do you make of the situation in Europe at the moment?
DG: Well, I think this is a prime example of why existing arrangements are clearly untenable. Obviously the ‘whole debt’ cannot be paid. But even when some French banks offered voluntary write-downs for Greece, the others insisted they would treat it as if it were a default anyway. The UK takes the even weirder position that this is true even of debts the government owes to banks that have been nationalized – that is, technically, that they owe to themselves! If that means that disabled pensioners are no longer able to use public transit or youth centers have to be closed down, well that’s simply the ‘reality of the situation,’ as they put it.
These ‘realities’ are being increasingly revealed to simply be ones of power. Clearly any pretence that markets maintain themselves, that debts always have to be honored, went by the boards in 2008. That’s one of the reasons I think you see the beginnings of a reaction in a remarkably similar form to what we saw during the heyday of the ‘Third World debt crisis’ – what got called, rather weirdly, the ‘anti-globalization movement’. This movement called for genuine democracy and actually tried to practice forms of direct, horizontal democracy. In the face of this there was the insidious alliance between financial elites and global bureaucrats (whether the IMF, World Bank, WTO, now EU, or what-have-you).
When thousands of people begin assembling in squares in Greece and Spain calling for real democracy, what they are effectively saying is: “Look, in 2008 you let the cat out of the bag. If money really is just a social construct now – a promise, a set of IOUs – and even trillions in debts can be made to vanish if sufficiently powerful players demand it, then, if democracy is to mean anything, it means that everyone gets to weigh in on the process of how these promises are made and renegotiated.” I find this extraordinarily hopeful.
PP: Broadly speaking, how do you see the present debt/financial crisis unravelling? Without asking you to peer into the proverbial crystal ball – because that’s a silly thing to ask of anyone – how do you see the future unfolding, in the sense of how you take your bearings right now?
DG: For the long-term future, I’m pretty optimistic. We might have been doing things backwards for the last 40 years, but in terms of 500-year cycles, well, 40 years is nothing. Eventually there will have to be a recognition that in a phase of virtual money, safeguards have to be put in place – and not just ones to protect creditors. How many disasters it will take to get there, I can’t say.
But in the meantime there is another question to be asked: once we do these reforms, will the results be something that could even be called ‘capitalism’?