Annual archive: 2011

Freakonomics Poll: When It Comes to Predictions, Whom Do You Trust? (Freakonomics.com)

FREAKONOMICS

09/16/2011 | 11:27 am

Our latest Freakonomics Radio podcast, “The Folly of Prediction,” is built around the premise that humans love to predict the future, but are generally terrible at it. (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript here.)

There are a host of professions built around predicting some future outcome: the score of a sports match, the weekend's weather, what the stock market will do tomorrow. But is anyone actually good at it?

From your experience, which experts do you trust for predictions?

  • None of the Above (39%, 447 Votes)
  • Meteorologists (37%, 414 Votes)
  • Economists (14%, 158 Votes)
  • Sports Experts (9%, 98 Votes)
  • Political Pundits (1%, 16 Votes)
  • Stock Market Analysts (1%, 10 Votes)

Total Voters: 1,132
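As a quick sanity check, the displayed percentages can be reproduced from the raw vote counts (a minimal Python sketch; note that the counts sum to 1,143, slightly more than the 1,132 stated voters, which would be consistent with a poll that let some respondents pick more than one answer):

```python
# Reproduce the poll's displayed percentages from the raw vote counts.
votes = {
    "None of the Above": 447,
    "Meteorologists": 414,
    "Economists": 158,
    "Sports Experts": 98,
    "Political Pundits": 16,
    "Stock Market Analysts": 10,
}
total_voters = 1132  # the total stated by the poll

# Each displayed figure matches votes / total_voters, rounded
# to the nearest whole percent.
pct = {option: round(100 * n / total_voters) for option, n in votes.items()}
print(pct)                  # {'None of the Above': 39, 'Meteorologists': 37, ...}
print(sum(votes.values()))  # 1143, vs. the 1,132 stated voters
```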

The Revolution Begins at Home: An Open Letter to Join the Wall Street Occupation (The Independent)

Arun Gupta
September 28, 2011

(Photo courtesy of Flickr.com/pweiskel08). 

What is occurring on Wall Street right now is truly remarkable. For over 10 days, in the sanctum of the great cathedral of global capitalism, the dispossessed have liberated territory from the financial overlords and their police army.

They have created a unique opportunity to shift the tides of history in the tradition of other great peaceful occupations from the sit-down strikes of the 1930s to the lunch-counter sit-ins of the 1960s to the democratic uprisings across the Arab world and Europe today.

While the Wall Street occupation is growing, it needs an all-out commitment from everyone who cheered the Egyptians in Tahrir Square, said “We are all Wisconsin,” and stood in solidarity with the Greeks and Spaniards. This is a movement for anyone who lacks a job, housing or healthcare, or thinks they have no future.

Our system is broken at every level. More than 25 million Americans are unemployed. More than 50 million live without health insurance. And perhaps 100 million Americans are mired in poverty, using realistic measures. Yet the fat cats continue to get tax breaks and reap billions while politicians compete to turn the austerity screws on all of us.

At some point the number of people occupying Wall Street – whether that’s five thousand, ten thousand or fifty thousand – will force the powers that be to offer concessions. No one can say how many people it will take or even how things will change exactly, but there is a real potential for bypassing a corrupt political process and beginning to realize a society based on human needs, not hedge fund profits.

After all, who would have imagined a year ago that Tunisians and Egyptians would oust their dictators?

At Liberty Park, the nerve center of the occupation, more than a thousand people gather every day to debate, discuss and organize what to do about our failed system that has allowed the 400 richest Americans at the top to amass more wealth than the 180 million Americans at the bottom.

It’s astonishing that this self-organized festival of democracy has sprouted on the turf of the masters of the universe, the men who play the tune that both political parties and the media dance to. The New York Police Department, which has deployed hundreds of officers at a time to surround and intimidate protesters, is capable of arresting everyone and clearing Liberty Plaza in minutes. But they haven’t, which is also astonishing.

That’s because assaulting peaceful crowds in a public square demanding real democracy – economic and not just political – would remind the world of the brittle autocrats who brutalized their people demanding justice before they were swept away by the Arab Spring. And the state violence has already backfired. After police attacked a Saturday afternoon march that started from Liberty Park the crowds only got bigger and media interest grew.

The Wall Street occupation has already succeeded in revealing the bankruptcy of the dominant powers – the economic, the political, media and security forces. They have nothing positive to offer humanity, not that they ever did for the Global South, but now their quest for endless profits means deepening the misery with a thousand austerity cuts.

Even their solutions are cruel jokes. They tell us that the “Buffett Rule” would spread the pain by asking the penthouse set to sacrifice a tin of caviar, which is what the proposed tax increase would amount to. Meanwhile, the rest of us will have to sacrifice healthcare, food, education, housing, jobs and perhaps our lives to sate the ferocious appetite of capital.

That’s why more and more people are joining the Wall Street occupation. They can tell you about their homes being foreclosed upon, months of grinding unemployment or minimum-wage dead-end jobs, staggering student debt loads, or trying to live without decent healthcare. It’s a whole generation of Americans with no prospects, but who are told to believe in a system that can only offer them Dancing With The Stars and pepper spray to the face.

Yet against every description of a generation derided as narcissistic, apathetic and hopeless they are staking a claim to a better future for all of us.

That’s why we all need to join in. Not just by liking it on Facebook, signing a petition at change.org or retweeting protest photos, but by going down to the occupation itself.

There is great potential here. Sure, it’s a far cry from Tahrir Square or even Wisconsin. But there is the nucleus of a revolt that could shake America’s power structure as much as the Arab world has been upended.

Instead of one to two thousand people a day joining in the occupation, there need to be tens of thousands of people protesting the fat cats driving Bentleys and drinking thousand-dollar bottles of champagne with money they looted first from the financial crisis and then from the bailouts, while Americans literally die on the streets.

To be fair, the scene in Liberty Plaza seems messy and chaotic. But it’s also a laboratory of possibility, and that’s the beauty of democracy. As opposed to our monoculture world, where political life is flipping a lever every four years, social life is being a consumer and economic life is being a timid cog, the Wall Street occupation is creating a polyculture of ideas, expression and art.

Yet while many people support the occupation, they hesitate to fully join in and are quick to offer criticism. It’s clear that the biggest obstacles to building a powerful movement are not the police or capital – it’s our own cynicism and despair.

Perhaps their views were colored by the New York Times article deriding protesters for wishing to “pantomime progressivism” and for “gunning for Wall Street with faulty aim.” Many of the criticisms boil down to “a lack of clear messaging.”

But what’s wrong with that? A fully formed movement is not going to spring from the ground. It has to be created. And who can say what exactly needs to be done? We are not talking about ousting a dictator; though some say we want to oust the dictatorship of capital.

There are plenty of sophisticated ideas out there: end corporate personhood; institute a “Tobin Tax” on stock purchases and currency trading; nationalize banks; socialize medicine; fully fund government jobs and genuine Keynesian stimulus; lift restrictions on labor organizing; allow cities to turn foreclosed homes into public housing; build a green energy infrastructure.

But how can we get broad agreement on any of these? If the protesters had come into the square with a predetermined set of demands, it would only have limited their potential. They would either have been dismissed as pie in the sky – socialized medicine, say, or nationalized banks – or, had they gone for weak demands such as the Buffett Rule, their efforts would immediately have been absorbed by a failed political system, undermining the movement.

That’s why the building of the movement has to go hand in hand with common struggle, debate and radical democracy. It’s how we will create genuine solutions that have legitimacy. And that is what is occurring down at Wall Street.

Now, there are endless objections one can make. But if we focus on the possibilities, and shed our despair, our hesitancy and our cynicism, and collectively come to Wall Street with critical thinking, ideas and solidarity we can change the world.

How many times in your life do you get a chance to watch history unfold, to actively participate in building a better society, to come together with thousands of people where genuine democracy is the reality and not a fantasy?

For too long our minds have been chained by fear, by division, by impotence. The one thing the elite fear most is a great awakening. That day is here. Together we can seize it.

Plastic surgery reinforces the ideal of the body as social capital (Fapesp)

Pesquisa FAPESP
Issue 187 – September 2011

Humanities > Anthropology
The economy of appearances

Carlos Haag

“Plastic surgery is a crime against religion and good morals. Changing the face God gave us, cutting the skin, stitching up the breasts and who knows what else, vade retro.” That is how Ponciana, a character in Jorge Amado’s novel Tereza Batista cansada de guerra, reacts on seeing her neighbor, Dona Beatriz, “renewed,” with a “smooth face, without wrinkles or jowls, high breasts suggesting no more than thirty fiery springs, in total shamelessness, a walking glorification of modern medicine.” Imagine how she would react today on learning of the recent survey by Ibope together with the Brazilian Society of Plastic Surgery (SBCP): in Brazil a plastic surgery operation is performed every minute, 1,700 per day, an annual total of 645,000, second only to the United States, with 1.5 million surgeries. Of the Brazilian procedures, 65% are purely cosmetic, and women are the biggest clients, at 82%. The national preference is liposuction (30%), followed by silicone implants (21%). Over the past five years, demand for cosmetic surgery among men has also grown by 30%.

“What has made plastic surgery almost an obligation, with growing demand across all regions and social segments? Brazil is the only country that offers plastic surgery through its public health system (15% of the total), and private clinics even offer installment plans,” says the American anthropologist Alexander Edmonds, of the University of Amsterdam and author of Pretty modern: beauty, sex and plastic surgery in Brazil, recently published in the U.S. by Duke University Press. “In Brazil it is not enough to be thin. A woman has to be toned, defined, sensual. More than being a good mother, a competent professional and a caring wife, she has to face the ‘fourth shift’ at the gym, chasing an always unattainable body. The Brazilian woman’s greatest tormentor is herself: she is forever seeking the approval of other women. We need to think of a woman who can accommodate flaws, who does not criminalize her body for departing from the standards, and who can enjoy moments such as motherhood without rushing to regain her former shape,” explains Joana de Vilhena Moraes, coordinator of the Núcleo de Doenças da Beleza at the Pontifical Catholic University of Rio de Janeiro (PUC-Rio) and author of Com que corpo eu vou? Sociabilidade e usos do corpo nas mulheres das camadas altas e populares (Editora Pallas/PUC-Rio), a book presenting the results of Faperj-funded research on aesthetic standards across different social strata. “We found that, although the pursuit of the perfect body is democratic, a desire of rich and poor women alike, there are different concepts of beauty. Among the rich, any sacrifice is worthwhile to achieve the thinness of fashion models. Among the poorer, real beauty is the full, curvaceous body of pagode dancers. What differs between the groups is the suffering: rich women hide under loose clothing; poor women display their fat unabashedly in micro-shorts and tight tops.” According to her, this does not stop poorer women from also working out and joining the queues at public hospitals for cosmetic surgery. “The media, backed by medical discourse, encourages women to resort to these expedients, which avoid confronting the changes in their subjectivity, drawing for this purpose on the current state of the biotechnological sciences, in which the country is globally respected.”

Curiously, according to Edmonds, for a long time cosmetic surgery was not seen as legitimate medicine; to gain acceptance it had to be recast as a “cure,” allying itself with psychology: concepts such as the “inferiority complex” gave the operation a therapeutic rationale. “The surgeon Ivo Pitanguy was responsible for blurring the line between aesthetic and reconstructive surgery, since both would cure the psyche. For him, the plastic surgeon would be a ‘psychologist with a scalpel,’ and the real therapeutic object of the operation would be not the body but the mind,” the American notes. But there are consequences for the profession. “Health is now a symbolic umbrella and is no longer restricted to staying within medical normality: it means looking after one’s shape, weight and appearance. ‘Health’ has been aestheticized,” says Francisco Romão Ferreira, a professor in the Graduate Program in Bioscience Education in Health (PGEBS) at IOC/Fiocruz and author of the study Os sentidos do corpo – Cirurgias estéticas, discurso médico e saúde pública. “There is a pseudo-democratization of the technology that leads people to think the process is simple and low-risk, and newly graduated doctors migrate to this lucrative niche, which is why these very professionals warn of the trivialization of surgery. It is a break with traditional medicine, whose field of action is the body. This medicine, by contrast, inscribes itself on the body’s surface, using subjective criteria that lie outside it. The disease is created artificially in the realm of culture, outside the body, yet comes to be part of it.”

“Physical beauty has become bound up with Brazil’s national and global imaginary, and it is impossible to conceive of Brazilian identity without an aesthetic component, a ‘cosmetic citizenship’ that does not mean real rights but is a way of reproducing social and structural inequalities,” says the anthropologist Alvaro Jarrin, of Duke University, author of the study Cosmetic citizenship: beauty and social inequality in Brazil. It is what Edmonds calls “aesthetic health,” a mixture of the right to health and consumerism. “If the people have not achieved their citizenship, they can at least ‘remake’ themselves as ‘cosmetic citizens.’ The socially excluded become ‘aesthetic sufferers.’ Health has always been seen as beautiful; in Brazil, beauty has become healthy.” For Jarrin, Pitanguy understood this need of the poor for a citizenship of beauty when he created the first popular plastic-surgery service in a teaching hospital, winning state support as a philanthropic service. “The government is complicit and indirectly capitalizes on the success of the development of beauty surgery,” he notes. “The right to cosmetic surgery was never directly authorized by the SUS, but through ingenious redefinitions of what health is, doctors perform cosmetic surgery in public hospitals, where they can practice with little risk of malpractice suits, developing the ‘Brazilian style’ exported worldwide,” Edmonds believes.

“Thus representations of the Brazilian woman’s body no longer invoke a ‘lost true nature,’ an expression of racial mixture, but are the product of an association between that old notion and the most modern techniques, a dangerous intimacy between prosthesis and flesh. In a country whose image is ‘natural beauty,’ the prestige of Brazilian surgeons’ techniques is a paradox,” says the historian Denise Bernuzzi de Sant’Anna, coordinator of the research group A Condição Corporal at PUC-SP and author of Corpos de passagem: ensaios sobre a subjetividade contemporânea. “But the freedom to construct one’s own body does not escape demands such as staying young and the obsession with instant, nonstop happiness, in which each person is held responsible for success or failure according to whether they worship or neglect the body,” she says. “The problem is not care of the self, but making the body a territory that dispenses with contact with anyone different from us; disliking someone because of their body.” A segregation with definite aims. “Suffering for a body ‘in shape’ is rewarded by the gratification of belonging to a group of ‘superior value.’ The body identifies a person with one group and distinguishes them from others. This ‘worked,’ ‘sculpted,’ ‘toned’ body is today an indicative sign of a certain virtue. Under the morality of good form, ‘working’ the body is an act of signification, like dressing. Like clothing, it is a symbol that makes the differences between social groups visible,” observes the anthropologist Mirian Goldenberg, a professor at the Federal University of Rio de Janeiro (UFRJ), author of O corpo como capital, who analyzed the phenomenon in the CNPq-supported study Mudanças nos papéis de gênero, sexualidade e conjugalidade.

“In Brazil, the body is a form of capital, a model of wealth, the one most desired by individuals of the middle and poorer strata, who perceive the body as an important vehicle of social ascent and as capital in the labor market, the marriage market and the sexual market. The pursuit of the ‘toned’ body is, for devotees of the cult of beauty, a struggle against the symbolic death imposed on those who do not discipline themselves and fit the standards.” With geographic subtleties, too. “In São Paulo there is a ‘light’ culture, but clothing is still the important adornment. In Rio the body is laid bare. When Adriane Galisteu was asked how she knew it was time to stop eating, she said: ‘If men call me hot in the street, I know I’m fat.’ That is the carioca mindset,” says Joana. Everyone, however, wants to be well rated by their peers. “A fat woman in the middle and upper classes is an object of scorn. In the favela, she does not need to shed her padding to be admired. Poorer women spend more energy securing basic rights of survival, things already settled for wealthier women. At least in this relationship with the body, favela residents are happier,” she says.

In her research, Joana found that women from the wealthier classes use a more sophisticated, individualistic discourse, saying they make sacrifices such as surgery and workouts for themselves. Proof of a tense relationship with the mirror: the “work” on the body is never justified as wanting to be a more desirable object. “In the favelas, women say plainly that they have the procedures in order to ‘get hot,’ in a sexuality lived more fully,” she observes. This does not mean that poorer women fail to notice they are heavier and are simply satisfied with their bodies: they have access to information, read magazines and watch the same soap operas as wealthier women. “The difference is that they are not imprisoned in this process. Privation and discipline are supreme values of the upper classes. In the popular classes, privation is associated with poverty, and fat with prosperity. A woman from the favela told me she was not going to ‘live on lettuce’ because people would think she was destitute.”

But, to the displeasure of Gilberto Freyre, who saw Brazilian beauty in the woman with small breasts and large buttocks, Brazil and the U.S. today share bodily ideals. An American obsession, breast augmentation has been on the rise here since the 1980s, to the point that the cover of Time (July 2001) showed the singer Carla Perez with prominent breasts, in the mold of American women, asking whether the new “tropical bust” was a form of “cultural imperialism.” But there are differences. A study by the International Society of Aesthetic Plastic Surgery (ISAPS) states that Brazilian women want larger breasts, but also large buttocks and sculpted hips, in pursuit of the curvaceous “Brazilian” body. According to Bárbara Machado, head of the medical team at the Pitanguy clinic, breast reduction used to be more popular, but with safer implants and beauty icons sporting larger breasts, Brazilian women have opted for bigger busts, without, however, giving up their curves.

Mere frivolity? Edmonds observes that beauty matters even in the labor market. “Appearance, color and sexual appeal ‘add value’ to a service or serve as selection criteria. Attractive women and men earn higher wages, because the worker becomes part of the product offered to the consumer.” The culture of the body is also the culture of productivity. “Your appearance speaks about your character. If you know how to manage your body well, the reading of your character is that you know how to live, are a good professional, are not sloppy and run your life competently,” says Joana. “Women, however, need to think of another model of the successful person, because the current one is making people seriously ill: there is a colossal accumulation of tasks, the product of a feminism that gave women the freedom to work without reckoning that they would also be expected to be beautiful and slim.” Feminist achievements acquire another meaning in plastic modernity. “The tyranny of beauty ideals was exposed by feminists in the 1970s. But now women’s struggle to improve their appearance is legitimized as a feminist victory, and the healthy egoism of the pleasure of caring for oneself is accepted, a pride in displaying desirable bodies in public. We must avoid reckless optimism. Surgery allows the acquisition of new capacities, but the use of these technologies has a perverse effect on women: concealing the effects of old age promotes the reproduction of inequalities,” says Guita Grin Debert, full professor in the Department of Anthropology at the State University of Campinas (Unicamp) and author of the FAPESP-supported study Velhice e tecnologias de rejuvenescimento.

Among these effects is the “attack” on motherhood. “The industry’s rhetoric is freedom from biological destiny, but the tensions between being a mother and remaining a sexual being persist. Surgery sharpens the conflict, since it would theoretically allow a woman to be a mother and keep her sexual appeal, correcting the ‘defects’ motherhood causes in the postpartum body and in vaginal anatomy,” Edmonds observes. Or, in the words of Diana Zuckerman, of the U.S. National Research Center for Women & Families: “The marketing men’s dream is to make women believe their bodies become repugnant after the birth of a child.” “The medicalization of the body through surgery no longer legitimizes itself through the biological discourse of the past, in which the ideal beauty of the female body derived from motherhood, with a rounded, full body, developed hips and generous breasts. Now everything rests on the ‘psy’ discourse, which brings submission to the medical order by asserting the desire for a ‘perfect body’ in the name of self-esteem. In this discourse, everything is explained by an emphasis on interiority, which leads people to justify the need for everyone to conform to aesthetic models for the sake of self-esteem,” says the anthropologist Liliane Brum Ribeiro, author of the study A medicalização da diferença. This concern begins ever earlier and now reaches adolescents, who “prepare” for the future by correcting “defects” in their young bodies and, above all, by heightening their sexual appeal. Hence the growing share of young patients around age 19 (25% of the total). “Surgery keeps women in competition for longer, and even generational differences disappear, with mothers and daughters ‘fighting’ each other over men, further raising the ‘market value’ of a youthful appearance,” the American notes.

If adolescents have been sexualized, the old also suffer from it. “Surgery means ‘staying competitive’ at any age. In the past, a 40-year-old woman felt old and ugly, ready to be traded for a younger one or condemned to loneliness. Now that woman is in the market competing with the 20-year-old, thanks to plastic surgery,” says Edmonds. Plastic surgery has thus brought intense cultural change. “From the 1960s on, the ugly woman was accused of being so because she did not love herself. Being modern became cultivating a beautiful appearance and bodily well-being. Refusing beauty is a sign of negligence to be combated, a psychic problem solved by surgery,” observes Liliane. The impact on the elderly is strong. “Surgery is a way of fleeing the marks of time, denaturalizing normal processes and preventing nature from following its course. Old age is turned into a matter of bodily negligence, denying the constraints imposed by the body’s biological limits,” says Guita. “Aging is the monster medicine tries to fight. The point is not to ban surgery, but old age should not be reduced to a ‘hormonal imbalance,’ equated with a disease, an aesthetic problem magically solved by an operation, which only repeats the old form of control over women,” says Joana.

After all, as Guita observed, there is a tendency to turn old age into a matter of bodily negligence, with doctors striving to push the elderly toward strategies for fighting the marks of aging, denying the constraints imposed by the body’s biological limits. “The operations show an aversion to difference, and surgery is an attempt to flee the marks of time, denaturalizing natural processes and preventing nature from following its course,” the anthropologist warns. “Aversion to the aged body organizes the technologies of rejuvenation. Ideals of bodily perfection enchant the media, but everyone knows it is an image that can never be attained. It is the materiality of the aged body that becomes the norm by which the lived body is judged and its possibilities restricted.” As the elderly share of the population grows, the market takes pains to show how the young of advanced age should behave to repair the marks of aging. “This projection of the young body onto the materiality of the aged one, and the denial of its natural course, prevent the creation of an aesthetics of old age,” Guita notes. Mirian Goldenberg, in recent research in Germany on views of aging, found symptomatic differences. “Comparing the appearance of German and Brazilian women, the latter look younger and fitter, yet feel subjectively older and less valued than the former. This mistaken self-assessment made me realize that here old age is a big problem, which explains the enormous sacrifices many women make to look younger,” Mirian says. “They build their discourse around what they feel they lack, not their objective achievements. Brazilian women’s freedom appears as a late conquest, won only after they have fulfilled their roles as mother and wife. In our culture, where the body is important capital, aging is experienced as a moment of great losses (of capital), of a lack of men and of social invisibility, the opposite of what older German women feel, who value appearance less than new experiences, professional fulfillment and quality of life,” the anthropologist says.

Not everything about cosmetic surgery is thorns, however. “There is a democratizing element in all this. By emphasizing the naked body, rather than clothes and ornaments, plastic surgery naturalizes and ‘biologizes’ the body, since in that state it is less legible as a ‘social body,’” Edmonds argues. “It encourages a view of beauty as egalitarian, a social capital that does not depend on birth, education or social networks to advance. Where access to education is limited, the body, relative to the mind, becomes an important basis for identity, a source of power.” For the anthropologist, it is this cultural context that makes Brazil unique in its use of plastic surgery. “It is a country remembered for grace and sensuality, and hardly for discipline. Perhaps for that reason plastic surgery in Brazil is linked not to an alienation from the body, a hatred of its forms, but to an ethos better adapted to the beauty industry: compulsory love of the body.”

Climatic fluctuations drove key events in human evolution (University of Liverpool)

21-Sep-2011 – University of Liverpool

Research at the University of Liverpool has found that periods of rapid fluctuation in temperature coincided with the emergence of the first distant relatives of human beings and the appearance and spread of stone tools.

Dr Matt Grove from the School of Archaeology, Classics and Egyptology reconstructed likely responses of human ancestors to the climate of the past five million years using genetic modelling techniques. When results were mapped against the timeline of human evolution, Dr Grove found that key events coincided with periods of high variability in recorded temperatures.

Dr Grove said: “The study confirmed that a major human adaptive radiation – a pattern whereby the number of coexisting species increases rapidly before crashing again to near previous levels – coincided with an extended period of climatic fluctuation. Following the onset of high climatic variability around 2.7 million years ago a number of new species appear in the fossil record, with most disappearing by 1.5 million years ago. The first stone tools appear at around 2.6 million years ago, and doubtless assisted some of these species in responding to the rapidly changing climatic conditions.

“By 1.5 million years ago we are left with a single human ancestor – Homo erectus. The key to the survival of Homo erectus appears to be its behavioural flexibility – it is the most geographically widespread species of the period, and endures for over one and a half million years. Whilst other species may have specialized in environments that subsequently disappeared – causing their extinction – Homo erectus appears to have been a generalist, able to deal with many climatic and environmental contingencies.”

Dr Grove’s research is the first to explicitly model ‘Variability Selection’, an evolutionary process proposed by Professor Rick Potts in the late 1990s, and supports the pervasive influence of this process during human evolution. Variability selection suggests that evolution, when faced with rapid climatic fluctuation, should respond to the range of habitats encountered rather than to each individual habitat in turn; the timeline of variability selection established by Dr Grove suggests that Homo erectus could be a product of exactly this process.
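The logic of variability selection can be sketched with a toy model (a hedged illustration only, not Dr Grove's actual genetic modelling; the Gaussian fitness function and all parameter values here are assumptions): when the environment oscillates rapidly, long-run growth, which is governed by geometric-mean fitness across environments, favours a broad-tolerance generalist, while a stable environment favours a narrowly tuned specialist.

```python
import math

def fitness(env, optimum, tolerance):
    """Gaussian fitness: highest when the environment matches the optimum."""
    return math.exp(-((env - optimum) ** 2) / (2 * tolerance ** 2))

def geometric_mean_fitness(envs, optimum, tolerance):
    """Long-run growth across a sequence of environments."""
    log_sum = sum(math.log(fitness(e, optimum, tolerance)) for e in envs)
    return math.exp(log_sum / len(envs))

fluctuating = [-1.0, 1.0] * 50   # rapid climatic oscillation
stable      = [1.0] * 100        # unchanging environment

specialist = dict(optimum=1.0, tolerance=0.4)   # tuned to one habitat
generalist = dict(optimum=0.0, tolerance=1.5)   # tolerates the whole range

# Under rapid fluctuation the generalist outgrows the specialist
# (the Homo erectus pattern described above)...
assert geometric_mean_fitness(fluctuating, **generalist) > \
       geometric_mean_fitness(fluctuating, **specialist)
# ...while a stable environment favours the specialist.
assert geometric_mean_fitness(stable, **specialist) > \
       geometric_mean_fitness(stable, **generalist)
```

The key design point is using the geometric rather than arithmetic mean: one catastrophic generation dominates long-run growth, so selection under fluctuation rewards tolerance of the whole range of habitats rather than peak performance in any one of them.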

Linking climatic fluctuation to the evolutionary process has implications for the current global climate change debate. Dr Grove said: “Though often discussed under the banner term of ‘global warming’, what we see in many areas of the world today is in fact an increased annual range of temperatures and conditions; this means in particular that third world human populations, many living in what are already marginal environments, will face ever more difficult situations. The current pattern of human-induced climate change is unlike anything we have seen before, and is disproportionately affecting areas whose inhabitants do not have the technology required to deal with it.”

The research is published in The Journal of Human Evolution and The Journal of Archaeological Science.

Science and religion do mix (Rice University)

9/20/2011 – News & Media Relations

Rice University study reveals only 15 percent of scientists at major research universities see religion and science always in conflict

Throughout history, science and religion have often been portrayed as locked in perpetual conflict, but a new study by Rice University suggests that only a minority of scientists at major research universities see religion and science as requiring distinct boundaries.

“When it comes to questions about the meaning of life, ways of understanding reality, origins of Earth and how life developed on it, many have seen religion and science as being at odds and even in irreconcilable conflict,” said Rice sociologist Elaine Howard Ecklund. But a majority of scientists interviewed by Ecklund and colleagues viewed both religion and science as “valid avenues of knowledge” that can bring broader understanding to important questions, she said.

Ecklund summarized her findings in “Scientists Negotiate Boundaries Between Religion and Science,” which appears in the September issue of the Journal for the Scientific Study of Religion. Her co-authors were sociologists Jerry Park of Baylor University and Katherine Sorrell, a former postbaccalaureate fellow at Rice and current Ph.D. student at the University of Notre Dame.

They interviewed a scientifically selected sample of 275 participants, pulled from a survey of 2,198 tenured and tenure-track faculty in the natural and social sciences at 21 elite U.S. research universities. Only 15 percent of those surveyed view religion and science as always in conflict. Another 15 percent say the two are never in conflict, and 70 percent believe religion and science are only sometimes in conflict. Approximately half of the original survey population expressed some form of religious identity, whereas the other half did not.

“Much of the public believes that as science becomes more prominent, secularization increases and religion decreases,” Ecklund said. “Findings like these among elite scientists, who many individuals believe are most likely to be secular in their beliefs, definitely call into question ideas about the relationship between secularization and science.”

Many of those surveyed cited issues in the public realm (the teaching of creationism versus evolution, stem cell research) as reasons for believing there is conflict between the two. The study showed that these individuals generally have a particular kind of religion (and particular religious people and institutions) in mind when they say that religion and science are in conflict.

The study identified three strategies of action that these scientists use to manage the boundaries between religion and science and the circumstances under which the two could overlap.

  • Redefining categories – Scientists manage the science-religion relationship by changing the definition of religion, broadening it to include noninstitutionalized forms of spirituality.
  • Integration models – Scientists deliberately use the views of influential scientists who they believe have successfully integrated their religious and scientific beliefs.
  • Intentional talk – Scientists actively engage in discussions about the boundaries between science and religion.

“The kind of narrow research available on religion and science seems to ask if they are in conflict or not, when it should really ask the conditions under which they are in conflict,” Ecklund said. “Our research has found that even within the same person, there can be differing views. It’s very important to dispel the myth that people believe that religion and science either do or don’t conflict. Our study found that many people have much more nuanced views.”

These nuanced views often find their way into the classroom, according to those interviewed. One biologist, an atheist not part of any religious tradition, admitted that she makes a sincere effort to present science such that “religious students do not need to compromise their own selves.” Although she is not reconsidering her personal views on religion, she seeks out resources to keep her religious students engaged with science.

Other findings:

  • Scientists as a whole are substantially different from the American public in how they view teaching “intelligent design” in public schools. Nearly all of the scientists – religious and nonreligious alike – have a negative impression of the theory of intelligent design.
  • Sixty-eight percent of scientists surveyed consider themselves spiritual to some degree.
  • Scientists who view themselves as spiritual/religious are less likely to see religion and science in conflict.
  • Overall, under some circumstances even the most religious of scientists were described in very positive terms by their nonreligious peers; this suggests that the integration of religion and science is not so distasteful to all scientists.

Ecklund said the study’s findings will go far in improving the public’s perception of science. “I think it would be helpful for the public to see what scientists are actually saying about these topics, rather than just believe stereotypes,” she said. “It would definitely benefit public dialogue about the relationship between science and religion.”

Ecklund is the author of “Science vs. Religion: What Scientists Really Think,” published by Oxford University Press last year.

The study was supported by a grant from the John Templeton Foundation and additional funding from Rice University.

Gamers crack code that could lead to new AIDS treatments (Gizmodo Brasil)

By Kwame Opam – 15:56 – 19-09-2011

 

Scientists spent a decade trying, and failing, to map the structure of an enzyme that could help solve a crucial piece of the AIDS-virus puzzle. A group of gamers needed just three weeks.

The enzyme in question is the protease of the Mason-Pfizer monkey virus, and researchers have been seeking ways to deactivate it in hopes of finding new approaches to developing anti-HIV drugs. Unfortunately, conventional efforts by computers and scientists made little headway for years.

Enter Foldit. Foldit was developed in 2008 as a way to work out the structures of various proteins and amino acids (something computers are not very good at) by turning the problem into a game. After the experimental coordinates of the monkey-virus enzyme were added, the gamers, many of them with no previous background in molecular biology, were able to predict the protein's structure, allowing scientists to pinpoint precise locations and halt the virus's growth.

The study, published in Nature Structural & Molecular Biology, details what a remarkable step this is toward developing more effective therapies for AIDS patients. It also sets an important precedent, laying the groundwork for scientists and ordinary people to work together to solve new problems and save lives. Which is an amazing thing. [Sydney Morning Herald via The Next Web]

Witch tax hits Romanian witches and fortune tellers (The Christian Science Monitor)

Witch tax: Superstitions are no laughing matter in Romania and have been part of its culture for centuries. President Traian Basescu and his aides have been known to wear purple on certain days, supposedly to ward off evil.

By Alison Mutler, Associated Press / January 7, 2011

Romanian witch Mihaela Minca deals cards during an interview with The Associated Press in Mogosoaia, Romania, Wednesday, Jan. 5, 2011. Trouble is brewing for Romania’s witches, whose toil is being taxed for the first time despite their threats of putting curses on the government. Also being taxed for the first time are fortune tellers, who probably saw this coming. Vadim Ghirda/AP

CHITILA, ROMANIA
Everyone curses the tax man, but Romanian witches angry about having to pay up for the first time hurled poisonous mandrake into the Danube River on Thursday to cast spells on the president and government.

Romania’s newest taxpayers also included fortune tellers — but they probably should have seen it coming.

Superstitions are no laughing matter in Romania — the land of the medieval ruler who inspired the “Dracula” tale — and have been part of its culture for centuries. President Traian Basescu and his aides have been known to wear purple on certain days, supposedly to ward off evil.

A witch at the Danube named Alisia called the new tax law “foolish.”

“What is there to tax, when we hardly earn anything?” she said, identifying herself with only one name as many Romanian witches do.

Yet on the Chitila River in southern Romania, other witches gathered around a fire Thursday and threw corn into an icy river to celebrate Epiphany. They praised the new government measure, saying it gives them official recognition.

Witch Melissa Minca told The Associated Press she was “happy that we are legal,” before chanting a spell to call for a good harvest, clutching a jar of charmed river water, a sprig of mistletoe and a candle.

The new tax law is part of the government’s drive to collect more revenue and crack down on tax evasion in a country that is in recession.

In the past, the less mainstream professions of witch, astrologer and fortune teller were not listed in the Romanian labor code, unlike those of embalmer, valet and driving instructor. People who worked those unlisted jobs used their lack of registration to evade paying income tax.

Under the new law, like any self-employed person, they will pay 16 percent income tax and make contributions to health and pension programs.

Some argue the law will be hard to enforce, as the payments to witches and astrologers usually are small cash amounts of 20 to 30 lei ($7-$10) per consultation.

Mircea Geoana, who lost the presidential race to Basescu in 2009, performed poorly during a crucial debate, and his camp blamed attacks of negative energy by their opponent’s aides.

Geoana aide Viorel Hrebenciuc alleged there was a “violet flame” conspiracy during the campaign, saying Basescu and other aides dressed in purple on Thursdays to increase his chances of victory.

Romanian officials still wear purple clothing on important days, because the color supposedly makes the wearer superior and wards off evil.

Such spiritualism has long been tolerated by the Orthodox Church in Romania, and the late Communist dictator Nicolae Ceausescu and his wife, Elena, had their own personal witch.

Queen witch Bratara Buzea, 63, who was imprisoned in 1977 for witchcraft under Ceausescu’s repressive regime, is furious about the new law.

Sitting cross-legged in her villa in the lake resort of Mogosoaia, just north of Bucharest, she said Wednesday she planned to cast a spell using a particularly effective concoction of cat excrement and a dead dog.

“We do harm to those who harm us,” she said. “They want to take the country out of this crisis using us? They should get us out of the crisis because they brought us into it.”

“My curses always work!” she cackled in a smoky voice, sitting next to a wood-burning stove, surrounded by potions, charms, holy water and ceramic pots.

But not every witch threatened fire and brimstone.

“This law is very good,” said Mihaela Minca, sister of Melissa. “It means that our magic gifts are recognized and I can open my own practice.”

Nigerian car thief turns into goat! (The Christian Science Monitor)

In West Africa, widespread belief in witchcraft, black magic, and superstition undermines the fundamentals of journalism.

By Walter Rodgers / July 6, 2009

ABUJA, NIGERIA
In Nigeria recently, an angry mob demanded that police jail a goat. Vigilantes insisted the animal was a human car thief who transmogrified upon being apprehended. Nigerian law doesn’t recognize magic, witchcraft, or voodoo. Yet, faced with an angry mob, police acquiesced, arresting the goat.

This story was my object lesson for a Practical Reporting 101 class I taught to Nigerian journalism students this spring. There was just one problem: Some felt the goat was guilty. “These things actually happen,” one woman protested.

Objective truth is the ideal of journalism. It’s a destination reached through rigorous reporting rooted in skepticism. That’s a tall order in a society that’s so heavily riddled with superstition. In Nigeria, the sharp line between fact and fiction is badly blurred by centuries of animism and occultism that infects contemporary Muslim and Christian thinking as well as secular thought.

Journalistic skepticism is hard to teach where public imagination supersedes rational disbelief. As a result, journalism’s leavening effect on society is diminished. Reporters must always tread lightly in matters of religion, of course. Nearly all faiths hold to beliefs that defy everyday evidence. But, in the West at least, it’s understood that private religious beliefs – along with political beliefs – should be compartmentalized from the practice of journalism. A reporter’s religious beliefs, no matter how odd, don’t necessarily preclude good journalism. But when those beliefs clearly interfere with basic fact-checking and verification, then it’s worth examining how collective belief in magic can impede the civic development that good journalism fosters.

Black magic, malevolent curses, and witch doctors are woven into the fabric of West African society. “I don’t believe in witches, but I know they exist,” one of my students said. Television soap operas feature a villain sprinkling green powder on the doorstep of the woman next door. The following day she is shown writhing in agony. Great swaths of Nigerian society take these curses seriously.

Not infrequently, police hear reports that a man claims someone cast a spell to capture his spirit. Tradition here holds that if you sleep in bed with your feet at the headboard, you are communing with witches. Criminals buy charms from witch doctors to become invisible and escape arrest. A hairdresser tells of a customer who reported a snake in her house that turned into a young woman. When the girl was taken to a Pentecostal church service, she turned back into a snake. The journalistic canon of having two independent sources to confirm a news story becomes irrelevant when an entire congregation insists “it really happened.”

In Nigeria hearsay becomes conviction, then “truth,” and credibility grows in the retelling.

TV coverage lends currency to rumor. Take the story of four thieves apprehended by vigilantes who tied and bound them. According to dozens of village witnesses, there was supposedly a puff of smoke and the bound villains became four tethered crocodiles. One student insisted this was more credible than transubstantiation at Roman Catholic communion – the doctrine that the bread and wine become the body and blood of Jesus Christ – because “the TV news showed video of the four crocodiles.”

“We believe in God,” says Lydia Tolulope Adeleru, an American-educated daughter of a Baptist minister. “We also believe in our cultural gods like Sango, the god of iron, as well as Esu, the devil. We are a deeply religious people but we never left the old ways.” Africans often look for an unknown element to blame for disasters, floods, and crop failures. “If Christians have a God who makes Lucifer fall from heaven,” adds Ms. Adeleru, “what’s so strange about our juju [black magic]?”

The “rules of evidence” are easily contaminated here. Beatrice Funmilayo, a diplomat’s daughter, was a rare skeptic. “Nigerians have rich traditions of storytelling, but as journalists, we have to divorce ourselves from our cultural inclinations.” “Besides,” she said, “if these things really happened, wouldn’t they happen everywhere and not just [in] Nigeria?”

Shebanjo Ola is a university-educated attorney. He told of a woman in his village mixing sand and stones in a bowl and covering it with paper. When she removed the paper, the contents had magically turned into rice and meat. I asked, “Did you see it?” “No, but my mother did, and she never lies,” he replied. So much for the journalistic canon: “When your mother tells you she loves you, check it out.”

In one class I abruptly asked, “Has anyone here actually seen someone magically disappear?” Temple Ojutalayo assured me he had. He said his university professor teaching traditional folk medicine “disappeared in front of the entire class.”

I asked how many of these aspiring journalists believed in ghosts. The hands shot up. “What about UFOs?”

No response. Then a voice from the rear said, “Those only happen in America.”

Walter Rodgers is a former senior international correspondent for CNN. He writes a biweekly column for the Monitor’s weekly edition.

Ghana aims to abolish witches’ camps (The Christian Science Monitor)

For years, Ghanaians have banished women from their villages who were suspected of witchcraft. Now, Ghana is trying to ban this practice.

By Clair MacDougall, Correspondent / September 15, 2011

ACCRA, GHANA
Ghanaian leaders and civil society groups met in the nation’s capital, Accra, earlier this week to develop a plan to abolish the witches’ camps in the northern region, where more than a thousand women and children accused of sorcery are currently living in exile.

Deputy Minister for Women and Children’s Affairs Hajia Hawawu Boya Gariba said the ministry would do everything it could to end the practice of families and neighbors banishing women they suspect of witchcraft from their communities. It plans to do so by developing legislation that would make it illegal to accuse someone of being a witch, and by gradually closing down the camps and reintegrating the women into their communities.

“This practice has become an indictment on the conscience of our society,” Ms. Gariba said at the conference called Towards Banning “Witches” Camps. “The labeling of some of our kinsmen and women as witches and wizards and banishing them into camps where they live in inhuman and deplorable conditions is a violation of their fundamental human rights.”

Supreme Court Justice Rose Owusu also said that the practice violated numerous clauses in section 5 of Ghana’s 1992 Constitution. That section protects human rights and outlaws cultural practices which “dehumanize or are injurious to the physical and mental well-being of a person.” Ms. Owusu also called for the development of new legislation to outlaw the camps and the practice.

The witch camps of Ghana’s north

There are currently around 1,000 women and 700 children living in six of the witches’ camps in Ghana’s northern region.

Many of them are elderly women who have been accused of inflicting death, misfortune, and calamity on their neighbors and villages through sorcery, witchcraft, or “juju,” a term used throughout West Africa.

The women enjoy a certain degree of protection within these camps, which lie some distance from the communities where they could be tortured, beaten to death, or lynched, but conditions in the camps are often poor. The “accused witches,” as they are sometimes called, live in tiny thatched mud huts, have limited access to food, and must fetch water from nearby streams and creeks.

Forced to flee

An elderly woman named Bikamila Bagberi, who has lived in the Nabule witch camp in Gushegu, a district in the Northern Region, for the past 13 years, told the story of how she was forced to leave her village. Dressed in a headscarf, faded T-shirt, and cotton skirt, Ms. Bagberi spoke softly with her head bowed as a district assemblyman translated for the conference delegates.

Bagberi’s nephew, her brother-in-law’s son, had died unexpectedly, and after the village soothsayer said she had caused the child’s death, her family tried to make her confess to murdering him through sorcery. She said that when she refused, she was beaten with an old bicycle chain, and later her nephew’s family members rubbed Ghanaian pepper sauce into her eyes and open wounds.

When asked whether she could return to her village, she said her family could not bring her back into the community for fear that she would harm others. Bagberi said she expected to spend the rest of her life in the camp.

Catalyst for action

Human rights groups have been campaigning for the closure of the witches’ camps since the 1990s, but they have had little success in abolishing the practice of sending women suspected of witchcraft into exile, in part because of a lack of political will and the pervasiveness of the belief in witchcraft throughout Ghana. But the brutal murder of 72-year-old Ama Hemmah in the city of Tema in November of last year caused public outrage and made headlines around the world; six people, among them a Pentecostal pastor and his neighbors, are accused of dousing her with kerosene and setting her alight. Since Hemmah’s death, opinion pieces and articles about the issue have appeared in Ghana’s major newspapers, along with feature stories on local news programs.

Emmanuel Anukun-Dabson from Christian Outreach Fellowship, a group working with the accused witches at the Nabule camp and one of the organizers of the conference, suggested that a broader cultural shift needed to take place if the camps were to be abolished.

“In Ghana, we know that when a calamity happens or something befalls a family or a community the question is not what caused it, but rather who caused it?” Anukun-Dabson said. “We are a people who do not take responsibility for our actions; rather we find scapegoats and women are the targets.”

Chief Psychiatrist of Ghana’s Health Services Dr. Akwesi Osei, who spearheaded the conference, argued that a public awareness campaign on psychological disorders, dementia, and the mental and behavioral changes associated with menopause might help the public understand behaviors and perceived eccentricities that are often associated with witchcraft.

Belief in witchcraft and supernatural powers is common throughout Ghana and other African countries, and is often encouraged by pastors who preach in the nation’s many charismatic churches. Supernatural themes and sorcery also feature strongly in Ghanaian and West African films and television programs.

Deputy Minister Gariba has called for another meeting to develop a more concrete road map and said that the National Disaster Management Organisation would be providing the witches’ camps with water tanks and additional food supplies.

Joojo Eenstua, another conference organizer who works with Christian Outreach Fellowship at Nabule, said the conference marked a new era of activism on the issue and believed that significant changes and improvements to the livelihoods of the women and children living in the witches’ camps would follow.

“There is more public awareness than before and there is more political will and momentum around this issue,” Ms. Eenstua says.

Unshakeable stereotypes of science (New Scientist)

13 September 2011 by Roger Highfield
Magazine issue 2829.

Science has transformed our world, so why does the public have such an old-fashioned view of scientists, asks Quentin Cooper

What is the problem with the public’s image of scientists?
If you ask anyone, they will tell you that science has transformed their world with amazing discoveries. But then if you invite them to draw a scientist, what they depict is precisely what people would have described 50 years ago, back when the anthropologist Margaret Mead came up with what we now call the “draw a scientist” test.

How do people generally depict scientists?
It is uncanny: they draw someone with a hangdog look, frizzy hair and test tube in hand, all in a scene where things are going wrong. There are national variations. In Italy, scientists tend to be scarred and have bolts in their necks, like Frankenstein’s monster. In general, though, they are mostly white, male, bald and wearing a white coat. No wonder we have a problem recruiting scientists.

What do you think of attempts to make scientists cool, like the Studmuffins of Science calendar and GQ’s Rock Stars of Science?
They are doomed because for geek calendars and suchlike to work, they have to bounce off the stereotype. As a result, they reinforce it.

On TV there are plenty of science presenters who defy the stereotype, such as the physicist Brian Cox. Surely that helps?
It is true. They are not all white, male and old. Some have hair. Some, like Brian, arguably have too much! But while people know them and are familiar with their TV programmes, it is surprising what happens when you ask the public about their favourite science presenters. In the UK they usually nominate veterans, such as David Attenborough. In fact, in the last poll I saw, half the people could not name a TV science presenter. They don’t seem to recognise them as scientists because they don’t conform to the stereotype.

And this stereotype also applies to the best known scientist of all time, Einstein?
The image of the old Einstein with tongue out is the one everyone knows – the one taken on his 72nd birthday. But he was a dapper 26-year-old when he had his “annus mirabilis” and wrote the four papers that changed physics.

What do you think about the depiction of scientists in films?
What I find striking is you almost never see scientists on screen unless they are doing science. There are very few characters who happen to be scientists. And those scientists shown tend to be at best eccentric, at worst mad and/or evil.

How can we improve the image of scientists?
Even though the “draw a scientist” test started half a century ago, it was only in the 1980s that someone had the idea of introducing children to a real scientist after they had drawn one, and then asking them to have another go at drawing. One of my favourite examples is of the schoolgirl who initially drew a man with frizzy hair and a white coat, but afterwards depicted a smiling young woman holding a test tube. Above it is the word “me”. I still find myself choking up when I show it.

Profile
Quentin Cooper is a science journalist and presenter of the BBC radio programme Material World. He is hosting the Cabaret of the Elements at the British Science Festival in Bradford on 10 September.

We Need To Do More When It Comes To Having Brief, Panicked Thoughts About Climate Change (The Onion)

COMMENTARY
BY RHETT STEVENSON
SEPTEMBER 6, 2011 | ISSUE 47•36

The 20 hottest years on record have all taken place in the past quarter century. The resulting floods, wildfires, and heat waves have all had deadly consequences, and if we don’t reduce carbon emissions immediately, humanity faces bleak prospects. We can no longer ignore this issue. Beginning today, we must all do more when it comes to our brief and panicked thoughts about climate change.

Indeed, if there was ever a time when a desperate call to take action against global warming should race through our heads as we lie in bed and stare at the ceiling, that time is now.

Many well-intentioned people will take 20 seconds out of their week to consider the consequences of the lifestyle they’ve chosen, perhaps contemplating how their reliance on fossil fuels has contributed to the rapid melting of the Arctic ice cap. But if progress is what we truly want, 20 seconds is simply not enough. Not by a long shot. An issue this critical demands at least 45 seconds to a solid minute of real, concentrated panic.

And I’m not talking about letting the image of a drowning polar bear play out in your mind now and then. If we’re at all serious, we need to let ourselves occasionally be struck with grim visions of coastal cities washing away and people starving as drought-stricken farmlands fail to yield crops—and we need to do this regularly, every couple days or so, before continuing to go about our routines as usual.

This may seem like a lot to ask, but no one ever said making an effort to think about change was easy.

So if you pick up a newspaper and see an article about 10 percent of all living species going extinct by the end of the century, don’t just turn the page. Stop, peruse it for a moment, look at the photos, freak out for a few seconds, and then turn the page.

And the next time you start up your car, stop to think how the exhaust from your vehicle and millions of others like it contributes to air pollution, increasing the likelihood that a child in your neighborhood will develop asthma or other respiratory ailments. Take your time with it. Feel the full, crushing weight of that guilt. Then go ahead and drive wherever it was you wanted to go.

To do anything less is irresponsible.

Suppose you’ve just sat down in a crisply air-conditioned movie theater. Why not take the length of a preview or two to consider the building’s massive carbon footprint? Imagine those greenhouse gases trapped in the atmosphere, disrupting ecosystems and causing infectious diseases to spread rampantly, particularly in regions of the world where the poorest people live. Visualize massive storm systems cutting widespread swaths of destruction. Think of your children’s children dying horrible, unnecessary deaths.

You might even go so far as to experience actual physical symptoms: shaking, hyperventilation, perhaps even a heart palpitation. These are entirely appropriate responses to have, and the kinds of reactions each of us ought to have briefly before casting such worries aside to enjoy Conan The Barbarian.

Ultimately, however, our personal moments of distress won’t matter much unless our government intervenes with occasional mentions of climate change in important speeches, or by passing nonbinding legislation on the subject. I implore you: Spend a couple minutes each year imagining yourself writing impassioned letters to your elected representatives demanding a federal cap on emissions.

Global warming must be met with immediate, short-lasting feelings of overwhelming dread, or else life as we know it will truly cease—oh, God, there’s nothing we can do, is there? Maybe we’re already too late. What am I supposed to do? Unplug my refrigerator? I recycle, I take shorter showers than I used to, doesn’t that count for something? Devastating famines and brutal wars fought over dwindling resources? Is that my fault? Jesus, holy shit, someone do something! Tell me what to do! For the love of God, what can possibly be done?

There you have it. I’ve done my part. Now it’s your turn.

Few insurers planning for climate change (Reuters)

By Ben Berkowitz

NEW YORK, Sept 1 (Reuters) – Only one in eight insurers has a formal policy in place to manage climate risk, despite rising evidence that environmental changes are exacerbating insurers’ disaster losses, according to a coalition of public interest groups.

The coalition, Ceres, looked at 88 filings from six states by insurance companies, using a form developed by the National Association of Insurance Commissioners. Ceres said it was the first-ever effort to quantify how U.S. insurers manage climate risk in their day-to-day operations.

Despite the broad lack of a formal policy, Ceres said insurers generally acknowledge the problem of climate change and the effect it can have on their business.

“Even those insurers with no formal climate policy, no climate risk management structure and a stated belief that the company is not vulnerable to the effects of climate change still name perils that may be affected by climate change 20 percent of the time,” Ceres said in its report.

Of the 11 companies with formal climate policies, two — Prudential Financial (PRU.N) and Genworth Financial (GNW.N) — are life insurers. The rest are mostly multi-line insurers or reinsurers; among them are ACE Ltd (ACE.N) and AIG’s (AIG.N) Chartis unit.

(For an Insider interview with the author of the Ceres report, click here: link.reuters.com/myk53s)

The Ceres report comes as insurers start paying claims for last week’s Hurricane Irene, which broke flood records across the U.S. Northeast, and as they look to the Atlantic for the approach of what may become Hurricane Katia.

Because of the potential for hurricanes to cause sudden and huge losses in the United States, Ceres said the insurance industry is especially focused on how climate change will affect hurricane exposure, potentially at the expense of studying the impact on other common perils.

Some insurance companies have taken a public stand on climate issues, particularly home and auto insurer Allstate (ALL.N), which has warned that recent severe weather is part of a permanent change in the environment, and German reinsurance heavyweight Munich Re (MUVGn.DE).

Ceres recommended that all states make the National Association of Insurance Commissioners disclosure form mandatory and public, and that they adopt the model of California insurance regulators, who put together detailed guidelines on how to fill out the form.

Ceres describes itself as a national coalition of investors, environmental organizations and public interest groups. (Reporting by Ben Berkowitz; editing by John Wallace)

SINCE SEPTEMBER 11, 2001 . . . (SSRC)

10 years after september 11 – A SOCIAL SCIENCE RESEARCH COUNCIL ESSAY FORUM

By Veena Das

A decade of intense theorizing on the forms of violence and human degradation, on global connectivity, on demands that scholarship be done in “real time” . . . a sense of urgency . . . disciplines are aggressively asked to prove their relevance . . . a deep disquiet on the part of many radical scholars and public intellectuals that the American public is increasingly becoming complicit in projects of warfare. We ask, are our senses being so retrained now that we cannot see the suffering of others or hear their cries? We declare with anguish that whole populations are defined as nothing but targets for bombing . . . as those whose deaths do not count, and hence those dead literally need not be counted. There is a desperation to home in on what is new—perhaps, some theorize, what we now have is “horror” and not “terror” . . . perhaps, say others, what is lost is not only meaning but any trust in what might count as real.

Despite repeated calls for invention of new vocabularies, my own sense is that we have yet to come to terms with the violence of the past and that we have allowed our scholarly terms to be defined in a manner that we are becoming trapped in, terms that are already given in the questions that we ask. After all, do we need to be reminded that the single-most important factor in the decline of the total number of wars since 1942 was the end of colonial wars? Or that in the 1990s the region in which the highest death toll occurred was sub-Saharan Africa, and that it was the indirect death through disease and malnutrition that contributed to the enormity of the violence? I use the collective first-person pronoun to include myself within this trap of not being quite able to define what the right questions should be.

Ten years ago, when I contributed a short reflection on September 11 to the SSRC’s forum, something of this disquiet I feel about the mode of theorizing was already present. I argued that in the political rhetoric that circulated right after September 11, with its talk of attacks on the values of civilization, the American nation was seen to embody universal values—hence the talk was not of many terrorisms with which several countries had lived for more than thirty years but of one grand terrorism, Islamic terrorism. If I am allowed to loop back to my words, I asked, “What could this mean except that while terrorist forms of warfare in other spaces in Africa, Asia, or the Middle East were against forms of particularism, the attack on America is seen as an attack on humanity itself?” Perhaps we should ask of ourselves now the permission to be released from the grip of this master trope of September 11 that organizes a whole discourse, both conservative and radical, in terms of terrorism as the gripping drama of our times. We might then ask, what other questions have been under discussion among different communities of scholars and how might debate be widened to take account of these discussions?

One point I might put forward as a candidate for discussion is how affect is invested in some terms that come to be the signifiers of the pressing problems of a particular decade but then are dropped as if their force has been exhausted by new discoveries. When these terms drop out of scholarly circulation, do they still have lives that are lived in other corners of the world or in the lives of individuals who continue to give them expression? Consider the history of the term “ethnic cleansing,” which came to signify and organize much discussion in the nineties as referring to the pathology of what was termed as ethno-nationalism. As is well known, the term emerged in the summer of 1992 during the tragic events of the dissolution of Yugoslavia and the emergence of new nation-states that were making claims for international recognition. Although the composite term “ethnic cleansing” came to be used only then, the idea of “cleaning” a territory by killing the local inhabitants and making it safe for military occupation was known in colonial wars as well as expressed extensively in Latin America with reference to undesirable groups, such as prostitutes, enemy collaborators, and the vagrant poor.

Norman Naimark has made the point that ethnic cleansing happens in the shadow of war. He cites the examples of the Greek expulsion as a result of the Greco-Turkish war, the intensification of ethnic cleansing when NATO bombing started in Kosovo in March 1999, and Stalin’s brutal dealings with the Chechen-Ingush and Crimean Tatars during the Second World War.1 A chilling aspect of ethnic cleansing is its totalistic character. As Naimark puts it:

The goal is to remove every member of the targeted nation; very few exceptions to ethnic cleansing are allowed. In premodern cases of assaults of one people on another, those attacked could give up, change sides, convert, pay tribute, or join the attackers. Ethnic cleansing, driven by the ideology of integral nationalism and the military and technological power of the modern state, rarely forgives, makes exceptions, or allows people to slip through the cracks.

Yet a concept that was said to be central to explaining major mass atrocities is now rarely encountered—except perhaps in international law discussions on the distinction between genocide and ethnic cleansing. Are the kinds of mass atrocities that have occurred since September 11 not amenable to discussion under any of the earlier terms? Do subjectivities shift so quickly? Are issues of intentionality as providing the criteria for distinguishing between genocide and ethnic cleansing already resolved? What is at stake in the fact that ethnic cleansing is a perpetrator’s term while genocide is a term that privileges the experience of the victims? What kind of footing in the world do enunciations made on behalf of all sides in conflicts that draw on such concepts as human rights and human dignity have?

While one can understand why the media might have moved on to other stories, have we as scholars come to terms with why some concepts disappear from our vocabularies so quickly? I want to suggest that a long-term perspective on how we come to speak of violence—the appearance and disappearance of different terms—provides a repertoire of concepts to be mined for understanding how representation of violence in the public sphere was closely tied up with the West’s self-definition that in turn defined the twists and turns in the social sciences. Ethnic cleansing in the nineties was widely understood as the violence of the other just as terrorism now is understood as the violence that the other perpetrates. September 11 and the subsequent wars in Iraq and Afghanistan then become events that need to be placed in the long history of warfare that has generated the concepts of social science—concepts that cannot be divested of their political plenitude even as we recognize that the technologies of war have changed considerably.

Are there other discussions on war that are not quite within the discursive fields that dominate the post–September 11 scenario and the notion of Islamic terrorism? I find it salutary to think that other theoretical discussions are taking place that are outside this frame of reference. For instance, the prolonged civil war in Sri Lanka, in which both Sinhala soldiers and Tamil militants engaged in killing, has led to discussions on the relation between Buddhism and violence and whether there are strains of Buddhism, especially within the Mahayana school, that make room for the exercise of violence. Interestingly, the issues here are not those of justifying warfare but rather of dealing with the anxieties about bad karma generated by the acts of violence.

A sustained analysis of what enabled such developments as samurai Zen, or soldier Zen, to appear in Japan or how it is that Buddhism could find a home within kingdoms as diverse as the Indians, the Mongols, the Chinese, and the Thai deepens our understanding of violence and nonviolence precisely because it has the potential to change the angle of our vision.2 Similar discussions from within other traditions, both religious and secular, would help to break the monopoly of concepts (biopolitics, state of exception, homo sacer) that are now routinely used to understand the world. This hope is not an expression of sheer nostalgia for non-Western concepts but a plea to cultivate some attentiveness to those discourses that are (or could be) part of the history of our disciplines. Scholarly discourse cannot simply mirror the ephemeral character of media stories—even when a particular kind of violence disappears, the institutions that were put in place for dealing with it continue to have lives of their own. The braiding of what is new and what is enduring might then define how we come to pose questions that are not simply corollaries of the common sense of our times.


Veena Das is Krieger-Eisenhower Professor of Anthropology and professor of humanities at the Johns Hopkins University. Her most recent books are Life and Words: Violence and the Descent into the Ordinary and Sociology and Anthropology of Economic Life: The Moral Embedding of Economic Action (ed., with R. K. Das).

  1. Norman M. Naimark, Fires of Hatred: Ethnic Cleansing in Twentieth-Century Europe (Cambridge, MA: Harvard University Press, 2001).
  2. See Michael K. Jerryson and Mark Juergensmeyer, eds., Buddhist Warfare (New York: Oxford University Press, 2010).

Shooting the messenger (The Miami Herald)

Environment
Posted on Monday, 08.29.11
BY ANDREW DESSLER

Texas Gov. Rick Perry stirred up controversy on the campaign trail recently when he dismissed the problem of climate change and accused scientists of basically making up the problem.

As a born-and-bred Texan, it’s especially disturbing to hear this now, when our state is getting absolutely hammered by heat and drought. I’ve got to wonder how any resident of Texas – and particularly the governor who not so long ago was asking us to pray for rain – can be so cavalier about climate change.

As a climate scientist at Texas A&M University, I can also tell you from the data that the current heat wave and drought in Texas are so bad that calling them “extreme weather” does not do them justice. July was the single hottest month in the observational record, and the 12 months that ended in July were drier than any corresponding period in the record. I know that climate change does not cause any specific weather event. But I also know that humans have warmed the climate over the last century, and that this warming has almost certainly made the heat wave and drought more extreme than they would otherwise have been.

I am not alone in these views. There are dozens of atmospheric scientists at Texas institutions like Rice, the University of Texas, and Texas A&M, and none of them dispute the mainstream scientific view of climate change. This is not surprising, since there are only a handful of atmospheric scientists in the entire world who dispute the essential facts – and their ranks are not increasing, as Gov. Perry claimed.

And I can assure Gov. Perry that scientists are not just another special interest looking to line their own pockets. I left a job as an investment banker on Wall Street in 1988 to go to graduate school in chemistry. I certainly didn’t make that choice to get rich, and I didn’t do it to exert influence in the international arena either.

I went into science because I wanted to devote my life to the search for scientific knowledge and to make the world a better place. That’s the same noble goal that motivates most scientists. The ultimate dream is to make a discovery so profound and revolutionary that it catapults one into the pantheon of the greatest scientific minds of history: Newton, Einstein, Maxwell, Planck, etc.

This is just one of the many reasons it is inconceivable for an entire scientific community to conspire en masse to mislead the public. In fact, if climate scientists truly wanted to maximize funding, we would be claiming that we had no idea why the climate is changing – a position that would certainly attract bipartisan support for increased research.

The economic costs of the Texas heat wave and drought are enormous. The cost to Texas alone will be many billions of dollars (hundreds of dollars for every resident), and these costs will ripple through the economy so that everyone will eventually pay for them. Gov. Perry needs to squarely face the choice confronting us: either we pay to reduce emissions of greenhouse gases, or we pay for the impacts of a changing climate. There is no free lunch.

Economists have looked at this problem repeatedly over the last two decades, and virtually every mainstream economist has concluded that the costs of reducing emissions are less than the costs of unchecked climate change. The only disagreement is on the optimal level of emissions reductions.

I suppose it should not be surprising when politicians like Gov. Perry choose to shoot the messenger rather than face this hard choice. He may view this as a legitimate policy on climate change, but it’s not one that the facts support.

Read more here.

A Reality Check on Clouds and Climate (N.Y. Times)

September 6, 2011, 5:44 PM

Dot Earth

By ANDREW C. REVKIN

I am often in awe of clouds, as was the case when I shot this video of a remarkable thunderhead somewhere over the Midwest. But I’m tired of the recent burst of over-interpretation of a couple of papers examining aspects of clouds in the context of a changing climate.

I’ve long pointed out that anyone trumpeting a conclusion about greenhouse-driven climate change on the basis of a single paper should be treated with skepticism or outright suspicion. I trust climate science as an enterprise because — despite its flaws — it is a self-correcting process in which trajectory matters far more than individual steps in the road.

There is always a temptation, particularly for those with an agenda and for media in search of the “front-page thought,” to overemphasize studies that fit some template, no matter how tentative, or flawed.

The flood of celebratory coverage that followed publication of a recent paper by Roy Spencer and Danny Braswell — proposing a big reduction in the sensitivity of the climate to greenhouse gases — was far more about pushing an agenda than providing guidance on the state of climate science. There’s a lot more on this below.

The same goes for the stampede on clouds and climate following publication of an important, but preliminary, laboratory finding from the European Organization for Nuclear Research (better known by its acronym, CERN) about how cosmic rays can stimulate the formation of atmospheric particles (an ingredient in cloud formation). It’s a long road from that conclusion to an argument that variations in cosmic rays can explain a meaningful portion of recent climate change.

There’s a long history of assertions that clouds can be a substantial driver of climate change, distinct from their clear potential to amplify or blunt (depending on the type of cloud) a change set in motion by some other force. But there’s still scant evidence to back up such assertions.

In weighing the new results on cosmic rays and the atmosphere, I find a lot of merit in Hank Campbell’s conclusion at Science 2.0:

[I]t isn’t evidence that the Sun’s magnetic field is controlling cosmic rays and therefore our temperature far more than mankind and pollution are doing.

It is simply science at work – finally, after a decade and a half of circling the wagons, hypotheses that were dismissed as conspiratorial nonsense by zealots get a chance to live or die by the scientific method and not by aggressive posturing.

A new paper by Andrew Dessler of Texas A&M University bolsters the established view of clouds’ role as a feedback mechanism — but not driver — in climate dynamics through a decade of observation and analysis of El Niño and La Niña events (periodic warm and cool phases of the Pacific Ocean).

The paper directly challenges conclusions of Spencer and Braswell and an earlier paper positing a role of clouds in driving climate change.

Dessler, setting his findings and other work on clouds and climate in broader context, offered this observation this morning about the polarized, and distorted, public discourse:

To me, the real story here is that, every month, dozens if not hundreds of papers are published that are in agreement with the mainstream theory of climate science.

[ACR: I did a quick Google Scholar search for “CO2 climate change greenhouse” to put a rough upper bound on this and got ~9,000 papers so far in 2011.]

But, every year, one or two skeptical papers get published, and these are then trumpeted by sympathetic media outlets as if they’d discovered the wheel. It therefore appears to the general public that there’s a debate.

Here’s more from Dessler on his new paper:

A separate question has emerged around the Spencer-Braswell paper. Should it have been published in the first place?

As Retraction Watch (a fascinating and worthwhile blog) chronicled last week, the editor of Remote Sensing, the journal in which the paper appeared, emphatically — if after the fact — said no, emphasizing his view by very publicly resigning.

This move was hailed by defenders of the climate status quo in pieces run in The Daily Climate and Climate Progress. Peter Gleick of the Pacific Institute, remarkably given space in Forbes, called the resignation “staggering news.”

But others, including the folks at Retraction Watch, wondered why the editor at Remote Sensing, Wolfgang Wagner, didn’t simply seek to have the paper retracted.

Roger A. Pielke, Jr., whose focus at the University of Colorado is climate in the context of political science, echoed that question, urging the new team at the journal to initiate retraction proceedings, adding:

If the charges of “error” and “false claims” are upheld the paper should certainly be retracted.  If the charges are not upheld then the authors have every right to have such a judgment announced publicly.

Absent such an adjudication we are left with climate science played out as political theater in the media and on blogs — with each side claiming the righteousness of their views, while everyone else just sees the peer review process in climate science getting another black eye.

Over the weekend, I asked Kerry Emanuel at the Massachusetts Institute of Technology for his thoughts both on the Spencer-Braswell paper and the histrionic resignation by the editor. Here’s Emanuel:

About the paper: I read it when it first came out, and thought that some of their findings were significant and important. Basically, it presented evidence that feedbacks inferred from short-period and/or local climate change observations might not be relevant to long-period global change. I suppose I thought that rather obvious, but not everyone agrees. The one statement in the paper, to the effect that climate models might be overestimating positive feedback, struck me as unsubstantiated, but the authors themselves phrased it as speculative.

But the interesting and unusual thing about this is that that what pundits said about the paper, and indeed what Spencer said about it in press releases, etc., in my view had very little to do with the paper itself. I have seldom seen such a degree of disconnect between the substance of a paper and what has been said about it.

Gavin Schmidt of Real Climate and NASA has posted a thorough and useful dissection of the situation, “Resignations, retractions and the process of science,” that comes to what I see as the right conclusion:

I think (rightly) that people feel that the best way to deal with these papers is within the literature itself, and in this case it is happening this week in GRL (Dessler, 2011) [the Dessler paper discussed above], and in Remote Sensing in a few months. That’s the way it should be, and neither resignations nor retractions are likely to become more dominant – despite the amount of popcorn being passed around.

There’s more useful context and analysis from Keith Kloor, who notes the role played by the Drudge Report in amping up the story (blogging at the Yale Forum on Climate Change and the Media), Mike Lemonick, Judith Curry and many others.

As always happens after such episodes, the one clear finding is that clouds remain a complicating component in efforts to project warming from the building greenhouse effect.

Joni Mitchell’s classic, with a bit of mangling, sums things up well:


They’ve looked at clouds from all sides now, as feedback and forcing, and still somehow, it’s clouds’ illusions most often recalled. More work is needed to know clouds at all.

8:52 p.m. | Postscript |
There’s more coverage of the Spencer-Braswell paper at Knight Science Journalism Tracker and the blogs of Roger Pielke, Sr. and William M. Briggs. Roy Spencer has posted a piece titled “More Thoughts on the War Being Waged Against Us.”

Philosophers Notwithstanding, Kansas School Board Redefines Science (N.Y. Times)

By DENNIS OVERBYE
Published: November 15, 2005

Once it was the left who wanted to redefine science.

In the early 1990’s, writers like the Czech playwright and former president Vaclav Havel and the French philosopher Bruno Latour proclaimed “the end of objectivity.” The laws of science were constructed rather than discovered, some academics said; science was just another way of looking at the world, a servant of corporate and military interests. Everybody had a claim on truth.

The right defended the traditional notion of science back then. Now it is the right that is trying to change it.

On Tuesday, fueled by the popular opposition to the Darwinian theory of evolution, the Kansas State Board of Education stepped into this fraught philosophical territory. In the course of revising the state’s science standards to include criticism of evolution, the board promulgated a new definition of science itself.

The changes in the official state definition are subtle and lawyerly, and involve mainly the removal of two words: “natural explanations.” But they are a red flag to scientists, who say the changes obliterate the distinction between the natural and the supernatural that goes back to Galileo and the foundations of science.

The old definition reads in part, “Science is the human activity of seeking natural explanations for what we observe in the world around us.” The new one calls science “a systematic method of continuing investigation that uses observation, hypothesis testing, measurement, experimentation, logical argument and theory building to lead to more adequate explanations of natural phenomena.”

Adrian Melott, a physics professor at the University of Kansas who has long been fighting Darwin’s opponents, said, “The only reason to take out ‘natural explanations’ is if you want to open the door to supernatural explanations.”

Gerald Holton, a professor of the history of science at Harvard, said removing those two words and the framework they set means “anything goes.”

The authors of these changes say that presuming the laws of science can explain all natural phenomena promotes materialism, secular humanism, atheism and leads to the idea that life is accidental. Indeed, they say in material online at kansasscience2005.com, it may even be unconstitutional to promulgate that attitude in a classroom because it is not ideologically “neutral.”

But many scientists say that characterization is an overstatement of the claims of science. The scientist’s job description, said Steven Weinberg, a physicist and Nobel laureate at the University of Texas, is to search for natural explanations, just as a mechanic looks for mechanical reasons why a car won’t run.

“This doesn’t mean that they commit themselves to the view that this is all there is,” Dr. Weinberg wrote in an e-mail message. “Many scientists (including me) think that this is the case, but other scientists are religious, and believe that what is observed in nature is at least in part a result of God’s will.”

The opposition to evolution, of course, is as old as the theory itself. “This is a very long story,” said Dr. Holton, who attributed its recent prominence to politics and the drive by many religious conservatives to tar science with the brush of materialism.

How long the Kansas changes will last is anyone’s guess. The state board tried to abolish the teaching of evolution and the Big Bang in schools six years ago, only to reverse course in 2001.

As it happened, the Kansas vote last week came on the same day that voters in Dover, Pa., ousted the local school board that had been sued for introducing the teaching of intelligent design.

As Dr. Weinberg noted, scientists and philosophers have been trying to define science, mostly unsuccessfully, for centuries.

When pressed for a definition of what they do, many scientists eventually fall back on the notion of falsifiability propounded by the philosopher Karl Popper. A scientific statement, he said, is one that can be proved wrong, like “the sun always rises in the east” or “light in a vacuum travels 186,000 miles a second.” By Popper’s rules, a law of science can never be proved; it can only be used to make a prediction that can be tested, with the possibility of being proved wrong.

But the rules get fuzzy in practice. For example, what is the role of intuition in analyzing a foggy set of data points? James Robert Brown, a philosopher of science at the University of Toronto, said in an e-mail message: “It’s the widespread belief that so-called scientific method is a clear, well-understood thing. Not so.” It is learned by doing, he added, and for that good examples and teachers are needed.

One thing scientists agree on, though, is that the requirement of testability excludes supernatural explanations. The supernatural, by definition, does not have to follow any rules or regularities, so it cannot be tested. “The only claim regularly made by the pro-science side is that supernatural explanations are empty,” Dr. Brown said.

The redefinition by the Kansas board will have nothing to do with how science is performed, in Kansas or anywhere else. But Dr. Holton said that if more states changed their standards, it could complicate the lives of science teachers and students around the nation.

He added that Galileo – who started it all, and paid the price – had “a wonderful way” of separating the supernatural from the natural. There are two equally worthy ways to understand the divine, Galileo said. “One was reverent contemplation of the Bible, God’s word,” Dr. Holton said. “The other was through scientific contemplation of the world, which is his creation.

“That is the view that I hope the Kansas school board would have adopted.”

In the Land of Denial (N.Y. Times)

NY Times editorial
September 6, 2011

The Republican presidential contenders regard global warming as a hoax or, at best, underplay its importance. The most vocal denier is Rick Perry, the Texas governor and longtime friend of the oil industry, who insists that climate change is an unproven theory created by “a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects.”

Never mind that nearly all the world’s scientists regard global warming as a serious threat to the planet, with human activities like the burning of fossil fuels a major cause. Never mind that multiple investigations have found no evidence of scientific manipulation. Never mind that America needs a national policy. Mr. Perry has a big soapbox, and what he says, however fallacious, reaches a bigger audience than any scientist can command.

With one exception — make that one-and-one-half — the rest of the Republican presidential field also rejects the scientific consensus. The exception is Jon Huntsman Jr., a former ambassador to China and former governor of Utah, who recently wrote on Twitter: “I believe in evolution and trust scientists on global warming. Call me crazy.” The one-half exception is Mitt Romney, who accepted the science when he was governor of Massachusetts and argued for reducing emissions. Lately, he’s retreated into mush: “Do I think the world’s getting hotter? Yeah, I don’t know that, but I think that it is.” As for the human contribution: “It could be a little. It could be a lot.”

The others flatly repudiate the science. Ron Paul of Texas calls global warming “the greatest hoax I think that has been around for many, many years.” Michele Bachmann of Minnesota once said that carbon dioxide was nothing to fear because it is a “natural byproduct of nature” and has complained of “manufactured science.” Rick Santorum, a former senator from Pennsylvania, has called climate change “a beautifully concocted scheme” that is “just an excuse for more government control of your life.”

Newt Gingrich’s full record on climate change has been a series of epic flip-flops. In 2008, he appeared on television with Nancy Pelosi, the former House speaker, to say that “our country must take action to address climate change.” He now says the appearance was a mistake.

None of the candidates endorse a mandatory limit on emissions or, for that matter, a truly robust clean energy program. This includes Mr. Huntsman. In 2007, as Utah governor, he joined with Arnold Schwarzenegger, then the governor of California, in creating the Western Climate Initiative, a market-based cap-and-trade program aimed at reducing emissions in Western states. Cap-and-trade has since acquired a toxic political reputation, especially among Republicans, and Mr. Huntsman has backed away.

The economic downturn has made addressing climate change less urgent for voters. But the issue is not going away. The nation badly needs a candidate with a coherent, disciplined national strategy. So far, there is no Republican who fits that description.

Ceará’s First Planned City, Jaguaribara, Has Its Power Cut Off for Nonpayment (O Globo)

Published on September 7, 2011, at 8:24 a.m.
Globo.com/Portal Verdes Mares

SÃO PAULO – The first fully planned city in the state of Ceará, Jaguaribara has been partially in the dark for a month. There is no light in its squares, its streets, or even its cemetery. Because of nonpayment, Coelce, the power company, won a court ruling allowing it to cut off the municipality’s electricity supply. Homes and sites of public interest, such as hospitals, still receive power normally. The city government admits it has not paid its electricity bill in five years. The mayor says the state government cut off its transfer of funds.

According to Mayor Edvaldo Silveira, the state government stopped transferring about R$96,000 per month, the result of an agreement signed in 2000, when Nova Jaguaribara, the planned city, was inaugurated. The old Jaguaribara was flooded by the waters of the Castanhão reservoir. That money, the mayor says, was earmarked for paying for public lighting.

According to the mayor, the new city was also designed to hold 70,000 inhabitants. This means the infrastructure of the city, which previously housed 9,000 people, was expanded. The municipality got an Olympic village in place of its old sports court. Fourteen public squares were also built, and the cemetery alone received 25 light poles. In practice, the city government’s electricity bill went up. The once-rural city now has nearly 70% of its residents living in urban areas.

– Spending on public lighting has increased – the mayor says.

He says the cost of public lighting is not passed on to the population.

After being left in the dark, the city government this week also had its telephone lines cut off. The reason is the same: nonpayment.

The mayor says he is awaiting funds from the state government to pay the bill and regularize the municipality’s situation. He believes both the electricity and telephone problems should be resolved this week.


Bipolar Flânerie (FSP)

Melancholy, from Romantic eccentricity to pharmaceutical pathology

Folha de S.Paulo, Ilustríssima
São Paulo, Sunday, September 4, 2011
By MARIA RITA KEHL

Described until modernity as a phenomenon of culture, a sign of eccentricity and reclusion, melancholy lost its creative character with the advent of psychoanalysis. In the 21st century it is converted into a "bipolar" pathology. The publication of a 17th-century classic and a film by Lars von Trier bring the melancholic back onto the stage.

THE PLANET MELANCHOLIA is not the black Sun of Nerval's poem. It is a tireless Moon whose runaway orbit draws it toward the defenseless Earth until it causes a devastating collision.

Lars von Trier's film mixes science fiction with moral parable, sophisticated and somewhat naive, as befits the genre. The destruction of the world by melancholy is preceded by a long commentary on the loss of life's meaning, at least among the inhabitants of the society Trier has been criticizing since "Dancer in the Dark" (2000), and whose imaginary the Danish filmmaker, confident in his paranoiac-critical method, knows through cinema without ever having set foot there: the United States.

Throughout the film, Trier scatters signs of his familiarity with the history of melancholy in the West. The director, who made himself "persona non grata" at Cannes with outrageous provocations in defense of Hitler, showed that he understands the melancholic's position as that of a subject at odds with what is considered the Good in the world in which he lives. In "Melancholia," this is the position of Justine (Kirsten Dunst), about to marry a young man so eager to please her that he presents his bride with a photo of the apple trees in whose shade she is supposed to be happy.

Happy? The prospect of a future frozen into a perpetual image also freezes Justine's desire; she falls out of her role and ruins the extravagantly expensive party organized by her sister, a party full of rituals designed to produce the effects of "happiness" demanded of the children of the society of abundance.

SOCIAL SYMPTOM Even if it had no merit beyond exposing the stupidity of the contemporary faith in "happiness effects" as the measure of all things, Trier's film would be worthwhile for rehabilitating the figure of melancholy as an indicator of the social symptom.

For more than two millennia, the oscillations of the melancholic sensibility questioned Western culture about the border separating the madman from the genius. Since classical Antiquity, the melancholic, unable to answer the "demand of the Other," denounced what was amiss in the social bond.

The crisis that leads Justine to wreck her romantic commitment, her wedding party, and her job in a single night is conducted by the director with didactic precision. A cruel remark from her mother (a perfect representation of the Freudian melancholic's mother), followed by her father's indifference, triggers in Justine a true crisis of faith. Suddenly the bride excludes herself from the scene in which she was meant to be the protagonist. She no longer believes. She falls from the imaginary net that sustains what we usually call reality, a collective fiction capable of endowing life with meaning and value.

Justine, unable to look at the world through the veil of fantasy that comforts others, "the so-called sane" (as in Pessoa's verse), sees what the scene conceals. She does not fear the arrival of Melancholia because she was never able to deceive herself about the finitude of everything that exists. Justine "sees things." Arid is the life of one who sees too much because she does not know how to fantasize.

EXCEPTION Since Antiquity, the melancholic has been understood in the West as one who occupies a place of exception in culture. The melancholic pathos was explained by Hippocrates and Galen on the basis of the theory of the four humors that regulate the workings of body and soul. The oscillations of black bile would make the melancholic an inconstant being, at once sickly and brilliant, driven to create in order to appease the swings of his temperament.

At the heart of his reflection "The Man of Genius and Melancholy" (Problem XXX), Aristotle had already discerned an ethical question concerning the melancholic's emotional excesses and an aesthetic question concerning the creative genius. Hence the uncomfortable role that fell to the melancholic: to question the signifiers that sustain the imaginary of his era.

19TH CENTURY The tradition inaugurated by Aristotle ends with Baudelaire in the 19th century: the last of the Romantics, the first of the moderns, according to another brilliant melancholic, Walter Benjamin. To endure the highs and lows of their temperament and give some destination to their eccentricity, some melancholics devoted themselves to trying to understand their own malady.

English classicism produced the most complete compendium on melancholy ever known, the life's work of the Oxford librarian Robert Burton (1577-1640).

His "The Anatomy of Melancholy," published in 1621 and reissued several times in the following decades, is a compendium of more than 1,400 pages containing everything that could then be known about its author's "disease." The press of the Universidade Federal do Paraná has just released in Brazil the first volume of "A Anatomia da Melancolia" [trans. Guilherme Gontijo Flores, 265 pp., price not yet set].

It is a pity that this first volume is limited to the author's long introduction to his readers. We hope the Editora UFPR will soon publish a selection of the book's chapters, which begin with the causes of melancholy ("Delirium, frenzy, madness" [...] "Solitude and idleness" [...] "The force of imagination"...), continue with a description of palliatives to relieve the suffering ("mirth, good company, beautiful objects..."), and finally address love melancholy and religious melancholy.

The author signed the work as Democritus Junior, affirming his identification with the philosopher who, according to Hippocrates's description, withdrew from the company of men and, faced with the emptiness of the world, used to laugh at everything. The melancholic's laughter expresses scorn for the illusions of others.

Burton's undertaking was possible only in an age in which melancholy was understood not merely as a disease but as a phenomenon of culture. Aristotle's seminal text already contained a reflection on the melancholic's creative capacity, attributed to the instability that drives him to expand his soul in every direction of the universe.

FREUD This process of disidentification is also found in the Freudian diagnosis, which lacks, however, the counterpart of mimesis. Loosed from the imaginary net that binds him to himself and to the world, the contemporary melancholic is left to face the Real with nothing but the aridity of the symbolic.

Something happened in modernity so that the melancholic's imaginary inconsistency ceased to spur him to reinvent the world's representations and left him at the mercy of the Thing. The recipe prepared for Justine tastes of ashes; invisible threads of wool keep her legs from walking. Faced with this horror, she prefers the collision with Melancholia.

Melancholy ceased to be understood as a maladjustment with respect to the norms of public life when Freud wrested the signifier from its traditional meaning in order to bring into the field of psychoanalysis the psychiatric diagnosis of what was then called manic-depressive psychosis, which medicine has now taken up again under the name of bipolar disorder.

Freud did not privatize melancholy by accident: psychoanalysis itself owes its existence to the emergence of the neurotic subject engendered within the web of the bourgeois family, closed in on itself and founded on commitments of love. Freudian psychoanalysis is contemporary with the completion of the individual as subjective form and with the privatization of the tasks of socializing children.

Hence the Freudian melancholic bears no resemblance to his pre-modern colleagues: the valiant warrior exposed to shame before his peers (Ajax), the anchorite in a crisis of faith (Saint Anthony), the Renaissance thinker occupied with restoring the order of a world in constant transformation (as in Dürer's engraving). Nor does he recall, at the dawn of modernity, the "flâneur" gathering the remains of a world in ruins through the streets of a great city (Baudelaire) so as to compose a poetic monument in the face of barbarism.

The Freudian melancholic is the baby repudiated by its mother, a poor ego turned into refuse upon which the shadow of a bad object has fallen. What was lost in the transition effected by psychoanalysis was the creative value attributed to the melancholic from Antiquity to Romanticism. Lost, too, was the value of the manic pole of what medicine today calls bipolar disorder.

Where the pre-modern melancholic, in his moments of euphoria, was given to expansions of the poetic imagination, today mania drives "bipolar" patients to burn through money on their credit cards. Consumption is the act that expresses the current clients of psychopharmacology, cut off from the creative power that their maladjustment to the world might have conferred on them.

DEPRESSION Are there no longer melancholics like those of old? Let the neuroscientists say. Psychiatry and the pharmaceutical industry have already chosen melancholy's 21st-century substitute: in place of the signifier melancholy, depression installs itself as the great symptom of civilization's discontent in the third millennium. The more sophisticated the supply of antidepressants becomes, the more depression looms on the horizon as the privileged expression of that discontent, threatening societies devoted to ignoring the knowledge it contains.

This active production of ignorance about the meaning of melancholy is at the center of Lars von Trier's parable. John, Justine's brother-in-law, affirms his faith in the world of commodities. He stocks the house with food, fuel, and power generators. He trusts the scientific information circulated on the internet. He checks the approach of the threatening planet through his telescope.

His defense is so fragile that, faced with the inevitable, he kills himself with an overdose of his wife's pills. Claire, for her part, has great faith in the staging of life. The failure of her sister's spectacular wedding does not keep her from planning another small ritual, on the house's beautiful veranda, with music and wine, to await the arrival of Melancholia. An excellent ending for a Hollywood melodrama, which Justine dismisses with contempt.

Justine has no illusions about the end. Even so, to shield her nephew from the final horror, she proves capable of creating the most omnipotent of fantasies. With him she builds a fragile "magic" tent under which they take shelter to await the blast of light brought by the collision with Melancholia.

The triangle formed by three branches tied at the top does not quite create an illusion: the branches are like the strokes of a writing, like a signifier marking out, "in extremis," a human territory in the face of the Real.

The Responsibility of Intellectuals, Redux (Boston Review)

Boston Review – SEPTEMBER/OCTOBER 2011

Using Privilege to Challenge the State

Noam Chomsky

A San Francisco mural depicting Archbishop Óscar Romero / Photograph: Franco Folini

Since we often cannot see what is happening before our eyes, it is perhaps not too surprising that what is at a slight distance removed is utterly invisible. We have just witnessed an instructive example: President Obama’s dispatch of 79 commandos into Pakistan on May 1 to carry out what was evidently a planned assassination of the prime suspect in the terrorist atrocities of 9/11, Osama bin Laden. Though the target of the operation, unarmed and with no protection, could easily have been apprehended, he was simply murdered, his body dumped at sea without autopsy. The action was deemed “just and necessary” in the liberal press. There will be no trial, as there was in the case of Nazi criminals—a fact not overlooked by legal authorities abroad who approve of the operation but object to the procedure. As Elaine Scarry reminds us, the prohibition of assassination in international law traces back to a forceful denunciation of the practice by Abraham Lincoln, who condemned the call for assassination as “international outlawry” in 1863, an “outrage,” which “civilized nations” view with “horror” and merits the “sternest retaliation.”

In 1967, writing about the deceit and distortion surrounding the American invasion of Vietnam, I discussed the responsibility of intellectuals, borrowing the phrase from an important essay of Dwight Macdonald’s after World War II. With the tenth anniversary of 9/11 arriving, and widespread approval in the United States of the assassination of the chief suspect, it seems a fitting time to revisit that issue. But before thinking about the responsibility of intellectuals, it is worth clarifying to whom we are referring.

The concept of intellectuals in the modern sense gained prominence with the 1898 “Manifesto of the Intellectuals” produced by the Dreyfusards who, inspired by Emile Zola’s open letter of protest to France’s president, condemned both the framing of French artillery officer Alfred Dreyfus on charges of treason and the subsequent military cover-up. The Dreyfusards’ stance conveys the image of intellectuals as defenders of justice, confronting power with courage and integrity. But they were hardly seen that way at the time. A minority of the educated classes, the Dreyfusards were bitterly condemned in the mainstream of intellectual life, in particular by prominent figures among “the immortals of the strongly anti-Dreyfusard Académie Française,” Steven Lukes writes. To the novelist, politician, and anti-Dreyfusard leader Maurice Barrès, Dreyfusards were “anarchists of the lecture-platform.” To another of these immortals, Ferdinand Brunetière, the very word “intellectual” signified “one of the most ridiculous eccentricities of our time—I mean the pretension of raising writers, scientists, professors and philologists to the rank of supermen,” who dare to “treat our generals as idiots, our social institutions as absurd and our traditions as unhealthy.”

Who then were the intellectuals? The minority inspired by Zola (who was sentenced to jail for libel, and fled the country)? Or the immortals of the academy? The question resonates through the ages, in one or another form, and today offers a framework for determining the “responsibility of intellectuals.” The phrase is ambiguous: does it refer to intellectuals’ moral responsibility as decent human beings in a position to use their privilege and status to advance the causes of freedom, justice, mercy, peace, and other such sentimental concerns? Or does it refer to the role they are expected to play, serving, not derogating, leadership and established institutions?

• • •

One answer came during World War I, when prominent intellectuals on all sides lined up enthusiastically in support of their own states.

In their “Manifesto of 93 German Intellectuals,” leading figures in one of the world’s most enlightened states called on the West to “have faith in us! Believe, that we shall carry on this war to the end as a civilized nation, to whom the legacy of a Goethe, a Beethoven, and a Kant, is just as sacred as its own hearths and homes.” Their counterparts on the other side of the intellectual trenches matched them in enthusiasm for the noble cause, but went beyond in self-adulation. In The New Republic they proclaimed, “The effective and decisive work on behalf of the war has been accomplished by . . . a class which must be comprehensively but loosely described as the ‘intellectuals.’” These progressives believed they were ensuring that the United States entered the war “under the influence of a moral verdict reached, after the utmost deliberation by the more thoughtful members of the community.” They were, in fact, the victims of concoctions of the British Ministry of Information, which secretly sought “to direct the thought of most of the world,” but particularly the thought of American progressive intellectuals who might help to whip a pacifist country into war fever.

John Dewey was impressed by the great “psychological and educational lesson” of the war, which proved that human beings—more precisely, “the intelligent men of the community”—can “take hold of human affairs and manage them . . . deliberately and intelligently” to achieve the ends sought, admirable by definition.

Not everyone toed the line so obediently, of course. Notable figures such as Bertrand Russell, Eugene Debs, Rosa Luxemburg, and Karl Liebknecht were, like Zola, sentenced to prison. Debs was punished with particular severity—a ten-year prison term for raising questions about President Wilson’s “war for democracy and human rights.” Wilson refused him amnesty after the war ended, though Harding finally relented. Some, such as Thorstein Veblen, were chastised but treated less harshly; Veblen was fired from his position in the Food Administration after preparing a report showing that the shortage of farm labor could be overcome by ending Wilson’s brutal persecution of labor, specifically the Industrial Workers of the World. Randolph Bourne was dropped by the progressive journals after criticizing the “league of benevolently imperialistic nations” and their exalted endeavors.

The pattern of praise and punishment is a familiar one throughout history: those who line up in the service of the state are typically praised by the general intellectual community, and those who refuse to line up in service of the state are punished. Thus in retrospect Wilson and the progressive intellectuals who offered him their services are greatly honored, but not Debs. Luxemburg and Liebknecht were murdered and have hardly been heroes of the intellectual mainstream. Russell continued to be bitterly condemned until after his death—and in current biographies still is.


In the 1970s prominent scholars distinguished the two categories of intellectuals more explicitly. A 1975 study, The Crisis of Democracy, labeled Brunetière’s ridiculous eccentrics “value-oriented intellectuals” who pose a “challenge to democratic government which is, potentially at least, as serious as those posed in the past by aristocratic cliques, fascist movements, and communist parties.” Among other misdeeds, these dangerous creatures “devote themselves to the derogation of leadership, the challenging of authority,” and they challenge the institutions responsible for “the indoctrination of the young.” Some even sink to the depths of questioning the nobility of war aims, as Bourne had. This castigation of the miscreants who question authority and the established order was delivered by the scholars of the liberal internationalist Trilateral Commission; the Carter administration was largely drawn from their ranks.

Like The New Republic progressives during World War I, the authors of The Crisis of Democracy extend the concept of the “intellectual” beyond Brunetière’s ridiculous eccentrics to include the better sort as well: the “technocratic and policy-oriented intellectuals,” responsible and serious thinkers who devote themselves to the constructive work of shaping policy within established institutions and to ensuring that indoctrination of the young proceeds on course.

It took Dewey only a few years to shift from the responsible technocratic and policy-oriented intellectual of World War I to an anarchist of the lecture-platform, as he denounced the “un-free press” and questioned “how far genuine intellectual freedom and social responsibility are possible on any large scale under the existing economic regime.”

What particularly troubled the Trilateral scholars was the “excess of democracy” during the time of troubles, the 1960s, when normally passive and apathetic parts of the population entered the political arena to advance their concerns: minorities, women, the young, the old, working people . . . in short, the population, sometimes called the “special interests.” They are to be distinguished from those whom Adam Smith called the “masters of mankind,” who are “the principal architects” of government policy and pursue their “vile maxim”: “All for ourselves and nothing for other people.” The role of the masters in the political arena is not deplored, or discussed, in the Trilateral volume, presumably because the masters represent “the national interest,” like those who applauded themselves for leading the country to war “after the utmost deliberation by the more thoughtful members of the community” had reached its “moral verdict.”

To overcome the excessive burden imposed on the state by the special interests, the Trilateralists called for more “moderation in democracy,” a return to passivity on the part of the less deserving, perhaps even a return to the happy days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers,” and democracy therefore flourished.

The Trilateralists could well have claimed to be adhering to the original intent of the Constitution, “intrinsically an aristocratic document designed to check the democratic tendencies of the period” by delivering power to a “better sort” of people and barring “those who were not rich, well born, or prominent from exercising political power,” in the accurate words of the historian Gordon Wood. In Madison’s defense, however, we should recognize that his mentality was pre-capitalist. In determining that power should be in the hands of “the wealth of the nation,” “the more capable set of men,” he envisioned those men on the model of the “enlightened Statesmen” and “benevolent philosopher” of the imagined Roman world. They would be “pure and noble,” “men of intelligence, patriotism, property, and independent circumstances” “whose wisdom may best discern the true interest of their country, and whose patriotism and love of justice will be least likely to sacrifice it to temporary or partial considerations.” So endowed, these men would “refine and enlarge the public views,” guarding the public interest against the “mischiefs” of democratic majorities. In a similar vein, the progressive Wilsonian intellectuals might have taken comfort in the discoveries of the behavioral sciences, explained in 1939 by the psychologist and education theorist Edward Thorndike:

It is the great good fortune of mankind that there is a substantial correlation between intelligence and morality including good will toward one’s fellows . . . . Consequently our superiors in ability are on the average our benefactors, and it is often safer to trust our interests to them than to ourselves.

A comforting doctrine, though some might feel that Adam Smith had the sharper eye.

• • •

Since power tends to prevail, intellectuals who serve their governments are considered responsible, and value-oriented intellectuals are dismissed or denigrated. At home, that is.

With regard to enemies, the distinction between the two categories of intellectuals is retained, but with values reversed. In the old Soviet Union, the value-oriented intellectuals were the honored dissidents, while we had only contempt for the apparatchiks and commissars, the technocratic and policy-oriented intellectuals. Similarly in Iran we honor the courageous dissidents and condemn those who defend the clerical establishment. And elsewhere generally.

The honorable term “dissident” is used selectively. It does not, of course, apply, with its favorable connotations, to value-oriented intellectuals at home or to those who combat U.S.-supported tyranny abroad. Take the interesting case of Nelson Mandela, who was removed from the official terrorist list in 2008, and can now travel to the United States without special authorization.

Father Ignacio Ellacuría / Photograph: Gervasio Sánchez

Twenty years earlier, he was the criminal leader of one of the world’s “more notorious terrorist groups,” according to a Pentagon report. That is why President Reagan had to support the apartheid regime, increasing trade with South Africa in violation of congressional sanctions and supporting South Africa’s depredations in neighboring countries, which led, according to a UN study, to 1.5 million deaths. That was only one episode in the war on terrorism that Reagan declared to combat “the plague of the modern age,” or, as Secretary of State George Shultz had it, “a return to barbarism in the modern age.” We may add hundreds of thousands of corpses in Central America and tens of thousands more in the Middle East, among other achievements. Small wonder that the Great Communicator is worshipped by Hoover Institution scholars as a colossus whose “spirit seems to stride the country, watching us like a warm and friendly ghost,” recently honored further by a statue that defaces the American Embassy in London.


The Latin American case is revealing. Those who called for freedom and justice in Latin America are not admitted to the pantheon of honored dissidents. For example, a week after the fall of the Berlin Wall, six leading Latin American intellectuals, all Jesuit priests, had their heads blown off on the direct orders of the Salvadoran high command. The perpetrators were from an elite battalion armed and trained by Washington that had already left a gruesome trail of blood and terror, and had just returned from renewed training at the John F. Kennedy Special Warfare Center and School at Fort Bragg, North Carolina. The murdered priests are not commemorated as honored dissidents, nor are others like them throughout the hemisphere. Honored dissidents are those who called for freedom in enemy domains in Eastern Europe, who certainly suffered, but not remotely like their counterparts in Latin America.

The distinction is worth examination, and tells us a lot about the two senses of the phrase “responsibility of intellectuals,” and about ourselves. It is not seriously in question, as John Coatsworth writes in the recently published Cambridge University History of the Cold War, that from 1960 to “the Soviet collapse in 1990, the numbers of political prisoners, torture victims, and executions of nonviolent political dissenters in Latin America vastly exceeded those in the Soviet Union and its East European satellites.” Among the executed were many religious martyrs, and there were mass slaughters as well, consistently supported or initiated by Washington.

Why then the distinction? It might be argued that what happened in Eastern Europe is far more momentous than the fate of the South at our hands. It would be interesting to see the argument spelled out. And also to see the argument explaining why we should disregard elementary moral principles, among them that if we are serious about suffering and atrocities, about justice and rights, we will focus our efforts on where we can do the most good—typically, where we share responsibility for what is being done. We have no difficulty demanding that our enemies follow such principles.

Few of us care, or should, what Andrei Sakharov or Shirin Ebadi say about U.S. or Israeli crimes; we admire them for what they say and do about those of their own states, and the conclusion holds far more strongly for those who live in more free and democratic societies, and therefore have far greater opportunities to act effectively. It is of some interest that in the most respected circles, practice is virtually the opposite of what elementary moral values dictate.

But let us conform and keep only to the matter of historical import.

The U.S. wars in Latin America from 1960 to 1990, quite apart from their horrors, have long-term historical significance. To consider just one important aspect, in no small measure they were wars against the Church, undertaken to crush a terrible heresy proclaimed at Vatican II in 1962, which, under the leadership of Pope John XXIII, “ushered in a new era in the history of the Catholic Church,” in the words of the distinguished theologian Hans Küng, restoring the teachings of the gospels that had been put to rest in the fourth century when the Emperor Constantine established Christianity as the religion of the Roman Empire, instituting “a revolution” that converted “the persecuted church” to a “persecuting church.” The heresy of Vatican II was taken up by Latin American bishops who adopted the “preferential option for the poor.” Priests, nuns, and laypersons then brought the radical pacifist message of the gospels to the poor, helping them organize to ameliorate their bitter fate in the domains of U.S. power.

That same year, 1962, President Kennedy made several critical decisions. One was to shift the mission of the militaries of Latin America from “hemispheric defense”—an anachronism from World War II—to “internal security,” in effect, war against the domestic population, if they raise their heads. Charles Maechling, who led U.S. counterinsurgency and internal defense planning from 1961 to 1966, describes the unsurprising consequences of the 1962 decision as a shift from toleration “of the rapacity and cruelty of the Latin American military” to “direct complicity” in their crimes, to U.S. support for “the methods of Heinrich Himmler’s extermination squads.” One major initiative was a military coup in Brazil, planned in Washington and implemented shortly after Kennedy’s assassination, instituting a murderous and brutal national security state. The plague of repression then spread through the hemisphere, including the 1973 coup installing the Pinochet dictatorship, and later the most vicious of all, the Argentine dictatorship, Reagan’s favorite. Central America’s turn—not for the first time—came in the 1980s under the leadership of the “warm and friendly ghost” who is now revered for his achievements.

The murder of the Jesuit intellectuals as the Berlin wall fell was a final blow in defeating the heresy, culminating a decade of horror in El Salvador that opened with the assassination, by much the same hands, of Archbishop Óscar Romero, the “voice for the voiceless.” The victors in the war against the Church declare their responsibility with pride. The School of the Americas (since renamed), famous for its training of Latin American killers, announces as one of its “talking points” that the liberation theology that was initiated at Vatican II was “defeated with the assistance of the US army.”

Actually, the November 1989 assassinations were almost a final blow. More was needed.

A year later Haiti had its first free election, and to the surprise and shock of Washington, which like others had anticipated the easy victory of its own candidate from the privileged elite, the organized public in the slums and hills elected Jean-Bertrand Aristide, a popular priest committed to liberation theology. The United States at once moved to undermine the elected government, and after the military coup that overthrew it a few months later, lent substantial support to the vicious military junta and its elite supporters. Trade was increased in violation of international sanctions and increased further under Clinton, who also authorized the Texaco oil company to supply the murderous rulers, in defiance of his own directives.

I will skip the disgraceful aftermath, amply reviewed elsewhere, except to point out that in 2004, the two traditional torturers of Haiti, France and the United States, joined by Canada, forcefully intervened, kidnapped President Aristide (who had been elected again), and shipped him off to central Africa. He and his party were effectively barred from the farcical 2010–11 elections, the most recent episode in a horrendous history that goes back hundreds of years and is barely known among the perpetrators of the crimes, who prefer tales of dedicated efforts to save the suffering people from their grim fate.


Another fateful Kennedy decision in 1962 was to send a special forces mission to Colombia, led by General William Yarborough, who advised the Colombian security forces to undertake “paramilitary, sabotage and/or terrorist activities against known communist proponents,” activities that “should be backed by the United States.” The meaning of the phrase “communist proponents” was spelled out by the respected president of the Colombian Permanent Committee for Human Rights, former Minister of Foreign Affairs Alfredo Vázquez Carrizosa, who wrote that the Kennedy administration “took great pains to transform our regular armies into counterinsurgency brigades, accepting the new strategy of the death squads,” ushering in

what is known in Latin America as the National Security Doctrine. . . . [not] defense against an external enemy, but a way to make the military establishment the masters of the game . . . [with] the right to combat the internal enemy, as set forth in the Brazilian doctrine, the Argentine doctrine, the Uruguayan doctrine, and the Colombian doctrine: it is the right to fight and to exterminate social workers, trade unionists, men and women who are not supportive of the establishment, and who are assumed to be communist extremists. And this could mean anyone, including human rights activists such as myself.

In a 1980 study, Lars Schoultz, the leading U.S. academic specialist on human rights in Latin America, found that U.S. aid “has tended to flow disproportionately to Latin American governments which torture their citizens . . . to the hemisphere’s relatively egregious violators of fundamental human rights.” That included military aid, was independent of need, and continued through the Carter years. Ever since the Reagan administration, it has been superfluous to carry out such a study. In the 1980s one of the most notorious violators was El Salvador, which accordingly became the leading recipient of U.S. military aid, to be replaced by Colombia when it took the lead as the worst violator of human rights in the hemisphere. Vázquez Carrizosa himself was living under heavy guard in his Bogotá residence when I visited him there in 2002 as part of a mission of Amnesty International, which was opening its year-long campaign to protect human rights defenders in Colombia because of the country’s horrifying record of attacks against human rights and labor activists, and mostly the usual victims of state terror: the poor and defenseless. Terror and torture in Colombia were supplemented by chemical warfare (“fumigation”), under the pretext of the war on drugs, leading to huge flight to urban slums and misery for the survivors. Colombia’s attorney general’s office now estimates that more than 140,000 people have been killed by paramilitaries, often acting in close collaboration with the U.S.-funded military.

Signs of the slaughter are everywhere. On a nearly impassable dirt road to a remote village in southern Colombia a year ago, my companions and I passed a small clearing with many simple crosses marking the graves of victims of a paramilitary attack on a local bus. Reports of the killings are graphic enough; spending a little time with the survivors, who are among the kindest and most compassionate people I have ever had the privilege of meeting, makes the picture more vivid, and only more painful.

This is the briefest sketch of terrible crimes for which Americans bear substantial culpability, and that we could easily ameliorate, at the very least.

But it is more gratifying to bask in praise for courageously protesting the abuses of official enemies, a fine activity, but not the priority of a value-oriented intellectual who takes the responsibilities of that stance seriously.

The victims within our domains, unlike those in enemy states, are not merely ignored and quickly forgotten, but are also cynically insulted. One striking illustration came a few weeks after the murder of the Latin American intellectuals in El Salvador. Vaclav Havel visited Washington and addressed a joint session of Congress. Before his enraptured audience, Havel lauded the “defenders of freedom” in Washington who “understood the responsibility that flowed from” being “the most powerful nation on earth”—crucially, their responsibility for the brutal assassination of his Salvadoran counterparts shortly before.

The liberal intellectual class was enthralled by his presentation. Havel reminds us that “we live in a romantic age,” Anthony Lewis gushed. Other prominent liberal commentators reveled in Havel’s “idealism, his irony, his humanity,” as he “preached a difficult doctrine of individual responsibility” while Congress “obviously ached with respect” for his genius and integrity; and asked why America lacks intellectuals so profound, who “elevate morality over self-interest” in this way, praising us for the tortured and mutilated corpses that litter the countries that we have left in misery. We need not tarry on what the reaction would have been had Father Ellacuría, the most prominent of the murdered Jesuit intellectuals, spoken such words at the Duma after elite forces armed and trained by the Soviet Union assassinated Havel and half a dozen of his associates—a performance that is inconceivable.

[Photograph: John Dewey. New York Public Library / Photoresearchers, Inc.]

The assassination of bin Laden, too, directs our attention to our insulted victims. There is much more to say about the operation—including Washington’s willingness to face a serious risk of major war and even leakage of fissile materials to jihadis, as I have discussed elsewhere—but let us keep to the choice of name: Operation Geronimo. The name caused outrage in Mexico and was protested by indigenous groups in the United States, but there seems to have been no further notice of the fact that Obama was identifying bin Laden with the Apache Indian chief. Geronimo led the courageous resistance to invaders who sought to consign his people to the fate of “that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty, among the heinous sins of this nation, for which I believe God will one day bring [it] to judgement,” in the words of the grand strategist John Quincy Adams, the intellectual architect of manifest destiny, uttered long after his own contributions to these sins. The casual choice of the name is reminiscent of the ease with which we name our murder weapons after victims of our crimes: Apache, Blackhawk, Cheyenne . . . We might react differently if the Luftwaffe were to call its fighter planes “Jew” and “Gypsy.”

Denial of these “heinous sins” is sometimes explicit. To mention a few recent cases, two years ago in one of the world’s leading left-liberal intellectual journals, The New York Review of Books, Russell Baker outlined what he learned from the work of the “heroic historian” Edmund Morgan: namely, that when Columbus and the early explorers arrived they “found a continental vastness sparsely populated by farming and hunting people . . . . In the limitless and unspoiled world stretching from tropical jungle to the frozen north, there may have been scarcely more than a million inhabitants.” The calculation is off by many tens of millions, and the “vastness” included advanced civilizations throughout the continent. No reactions appeared, though four months later the editors issued a correction, noting that in North America there may have been as many as 18 million people—and, unmentioned, tens of millions more “from tropical jungle to the frozen north.” This was all well known decades ago—including the advanced civilizations and the “merciless and perfidious cruelty” of the “extermination”—but not important enough even for a casual phrase. In the London Review of Books a year later, the noted historian Mark Mazower mentioned American “mistreatment of the Native Americans,” again eliciting no comment. Would we accept the word “mistreatment” for comparable crimes committed by enemies?

• • •

If the responsibility of intellectuals refers to their moral responsibility as decent human beings in a position to use their privilege and status to advance the cause of freedom, justice, mercy, and peace—and to speak out not simply about the abuses of our enemies, but, far more significantly, about the crimes in which we are implicated and can ameliorate or terminate if we choose—how should we think of 9/11?

The notion that 9/11 “changed the world” is widely held, understandably. The events of that day certainly had major consequences, domestic and international. One was to lead President Bush to re-declare Ronald Reagan’s war on terrorism—the first one has been effectively “disappeared,” to borrow the phrase of our favorite Latin American killers and torturers, presumably because the consequences do not fit well with preferred self images. Another consequence was the invasion of Afghanistan, then Iraq, and more recently military interventions in several other countries in the region and regular threats of an attack on Iran (“all options are open,” in the standard phrase). The costs, in every dimension, have been enormous. That suggests a rather obvious question, not asked for the first time: was there an alternative?

A number of analysts have observed that bin Laden won major successes in his war against the United States. “He repeatedly asserted that the only way to drive the U.S. from the Muslim world and defeat its satraps was by drawing Americans into a series of small but expensive wars that would ultimately bankrupt them,” the journalist Eric Margolis writes.

The United States, first under George W. Bush and then Barack Obama, rushed right into bin Laden’s trap. . . . Grotesquely overblown military outlays and debt addiction . . . . may be the most pernicious legacy of the man who thought he could defeat the United States.

A report from the Costs of War project at Brown University’s Watson Institute for International Studies estimates that the final bill will be $3.2–4 trillion. Quite an impressive achievement by bin Laden.

That Washington was intent on rushing into bin Laden’s trap was evident at once. Michael Scheuer, the senior CIA analyst responsible for tracking bin Laden from 1996 to 1999, writes, “Bin Laden has been precise in telling America the reasons he is waging war on us.” The al Qaeda leader, Scheuer continues, “is out to drastically alter U.S. and Western policies toward the Islamic world.”

And, as Scheuer explains, bin Laden largely succeeded: “U.S. forces and policies are completing the radicalization of the Islamic world, something Osama bin Laden has been trying to do with substantial but incomplete success since the early 1990s. As a result, I think it is fair to conclude that the United States of America remains bin Laden’s only indispensable ally.” And arguably remains so, even after his death.

There is good reason to believe that the jihadi movement could have been split and undermined after the 9/11 attack, which was criticized harshly within the movement. Furthermore, the “crime against humanity,” as it was rightly called, could have been approached as a crime, with an international operation to apprehend the likely suspects. That was recognized in the immediate aftermath of the attack, but no such idea was even considered by decision-makers in government. It seems no thought was given to the Taliban’s tentative offer—how serious an offer, we cannot know—to present the al Qaeda leaders for a judicial proceeding.

At the time, I quoted Robert Fisk’s conclusion that the horrendous crime of 9/11 was committed with “wickedness and awesome cruelty”—an accurate judgment. The crimes could have been even worse. Suppose that Flight 93, downed by courageous passengers in Pennsylvania, had bombed the White House, killing the president. Suppose that the perpetrators of the crime planned to, and did, impose a military dictatorship that killed thousands and tortured tens of thousands. Suppose the new dictatorship established, with the support of the criminals, an international terror center that helped impose similar torture-and-terror states elsewhere, and, as icing on the cake, brought in a team of economists—call them “the Kandahar boys”—who quickly drove the economy into one of the worst depressions in its history. That, plainly, would have been a lot worse than 9/11.

As we all should know, this is not a thought experiment. It happened. I am, of course, referring to what in Latin America is often called “the first 9/11”: September 11, 1973, when the United States succeeded in its intensive efforts to overthrow the democratic government of Salvador Allende in Chile with a military coup that placed General Pinochet’s ghastly regime in office. The dictatorship then installed the Chicago Boys—economists trained at the University of Chicago—to reshape Chile’s economy. Consider the economic destruction, the torture and kidnappings, and multiply the numbers killed by 25 to yield per capita equivalents, and you will see just how much more devastating the first 9/11 was.

The goal of the overthrow, in the words of the Nixon administration, was to kill the “virus” that might encourage all those “foreigners [who] are out to screw us”—screw us by trying to take over their own resources and more generally to pursue a policy of independent development along lines disliked by Washington. In the background was the conclusion of Nixon’s National Security Council that if the United States could not control Latin America, it could not expect “to achieve a successful order elsewhere in the world.” Washington’s “credibility” would be undermined, as Henry Kissinger put it.

The first 9/11, unlike the second, did not change the world. It was “nothing of very great consequence,” Kissinger assured his boss a few days later. And judging by how it figures in conventional history, his words can hardly be faulted, though the survivors may see the matter differently.

These events of little consequence were not limited to the military coup that destroyed Chilean democracy and set in motion the horror story that followed. As already discussed, the first 9/11 was just one act in the drama that began in 1962 when Kennedy shifted the mission of the Latin American militaries to “internal security.” The shattering aftermath is also of little consequence, the familiar pattern when history is guarded by responsible intellectuals.

• • •

It seems to be close to a historical universal that conformist intellectuals, the ones who support official aims and ignore or rationalize official crimes, are honored and privileged in their own societies, and the value-oriented punished in one or another way. The pattern goes back to the earliest records. It was the man accused of corrupting the youth of Athens who drank the hemlock, much as Dreyfusards were accused of “corrupting souls, and, in due course, society as a whole” and the value-oriented intellectuals of the 1960s were charged with interference with “indoctrination of the young.”

In the Hebrew scriptures there are figures who by contemporary standards are dissident intellectuals, called “prophets” in the English translation. They bitterly angered the establishment with their critical geopolitical analysis, their condemnation of the crimes of the powerful, their calls for justice and concern for the poor and suffering. King Ahab, the most evil of the kings, denounced the Prophet Elijah as a hater of Israel, the first “self-hating Jew” or “anti-American” in the modern counterparts. The prophets were treated harshly, unlike the flatterers at the court, who were later condemned as false prophets. The pattern is understandable. It would be surprising if it were otherwise.

As for the responsibility of intellectuals, there does not seem to me to be much to say beyond some simple truths. Intellectuals are typically privileged—merely an observation about usage of the term. Privilege yields opportunity, and opportunity confers responsibilities. An individual then has choices.