Monthly archive: September 2011

In the Land of Denial (N.Y. Times)

NY Times editorial
September 6, 2011

The Republican presidential contenders regard global warming as a hoax or, at best, underplay its importance. The most vocal denier is Rick Perry, the Texas governor and longtime friend of the oil industry, who insists that climate change is an unproven theory created by “a substantial number of scientists who have manipulated data so that they will have dollars rolling into their projects.”

Never mind that nearly all the world’s scientists regard global warming as a serious threat to the planet, with human activities like the burning of fossil fuels a major cause. Never mind that multiple investigations have found no evidence of scientific manipulation. Never mind that America needs a national policy. Mr. Perry has a big soapbox, and what he says, however fallacious, reaches a bigger audience than any scientist can command.

With one exception — make that one-and-one-half — the rest of the Republican presidential field also rejects the scientific consensus. The exception is Jon Huntsman Jr., a former ambassador to China and former governor of Utah, who recently wrote on Twitter: “I believe in evolution and trust scientists on global warming. Call me crazy.” The one-half exception is Mitt Romney, who accepted the science when he was governor of Massachusetts and argued for reducing emissions. Lately, he’s retreated into mush: “Do I think the world’s getting hotter? Yeah, I don’t know that, but I think that it is.” As for the human contribution: “It could be a little. It could be a lot.”

The others flatly repudiate the science. Ron Paul of Texas calls global warming “the greatest hoax I think that has been around for many, many years.” Michele Bachmann of Minnesota once said that carbon dioxide was nothing to fear because it is a “natural byproduct of nature” and has complained of “manufactured science.” Rick Santorum, a former senator from Pennsylvania, has called climate change “a beautifully concocted scheme” that is “just an excuse for more government control of your life.”

Newt Gingrich’s full record on climate change has been a series of epic flip-flops. In 2008, he appeared on television with Nancy Pelosi, the former House speaker, to say that “our country must take action to address climate change.” He now says the appearance was a mistake.

None of the candidates endorse a mandatory limit on emissions or, for that matter, a truly robust clean energy program. This includes Mr. Huntsman. In 2007, as Utah governor, he joined with Arnold Schwarzenegger, then the governor of California, in creating the Western Climate Initiative, a market-based cap-and-trade program aimed at reducing emissions in Western states. Cap-and-trade has since acquired a toxic political reputation, especially among Republicans, and Mr. Huntsman has backed away.

The economic downturn has made addressing climate change less urgent for voters. But the issue is not going away. The nation badly needs a candidate with a coherent, disciplined national strategy. So far, there is no Republican who fits that description.

Ceará's first planned city, Jaguaribara has its power cut off for non-payment (O Globo)

Published September 7, 2011, at 8:24 a.m.
Globo.com/Portal Verdes Mares

SÃO PAULO – The first fully planned city in Ceará, Jaguaribara has been partly in the dark for a month. There is no light in squares, streets, and even in the cemetery. For non-payment, Coelce, the power utility, won in court the right to cut off the electricity supply to the municipality. In homes and in places of public interest, such as hospitals, power is delivered normally. The city government admits it has not paid its electricity bills for five years. The mayor says that transfers of funds from the state government were cut.

According to Mayor Edvaldo Silveira, the state government stopped transferring roughly R$ 96,000 a month, the fruit of an agreement signed in 2000, when Nova Jaguaribara, the planned city, was inaugurated. Old Jaguaribara was flooded by the waters of the Castanhão reservoir. That money, according to the mayor, was earmarked for paying for public lighting.

According to the mayor, the new city was also designed to hold 70,000 inhabitants. This means that the infrastructure of the city, which previously housed 9,000 people, was expanded. The municipality gained an olympic sports complex in place of the old sports court. Fourteen public squares were also built. The cemetery alone received 25 light poles. In practice, the city government's electricity bill went up. The once rural city today has almost 70% of its residents living in urban areas.

– Spending on public lighting has increased – the mayor says.

He says the cost of public lighting is not passed on to the population.

After going dark, this week the city government also had its telephone lines cut. The reason is the same: non-payment.

The mayor says he is waiting for funds from the state government to pay the bill and regularize the municipality's situation. He believes that both the electricity and the telephone problems should be resolved this week.


Bipolar flânerie (FSP)

Melancholy, from Romantic eccentricity to pharmaceutical pathology

Folha de S.Paulo, Ilustríssima
São Paulo, Sunday, September 4, 2011
By MARIA RITA KEHL

Described until modernity as a phenomenon of culture, a sign of eccentricity and reclusion, melancholy lost its creative character with the advent of psychoanalysis. In the 21st century it is converted into a "bipolar" pathology. The publication of a 17th-century classic and a film by Lars von Trier bring the melancholic back onto the scene.

THE PLANET MELANCHOLIA is not the black Sun of Nerval's poem. It is a tireless Moon whose runaway orbit draws it toward a defenseless Earth until it provokes a devastating collision.

Lars von Trier's film mixes science fiction with moral parable, sophisticated and somewhat naïve, as befits the genre. The destruction of the world by melancholy is preceded by a long commentary on the loss of life's meaning, at least among the inhabitants of the society Trier has been criticizing since "Dancer in the Dark" (2000) and whose imaginary the Danish filmmaker, confident in his paranoiac-critical method, knows through cinema without ever having set foot there: the United States.

Throughout the film, Trier scatters signs of his familiarity with the history of melancholy in the West. The filmmaker, who made himself "persona non grata" at Cannes with outlandish provocations in defense of Hitler, showed that he understands the melancholic's position as that of a subject at odds with what is held to be the Good in the world in which he lives. In "Melancholia," this is the position of Justine (Kirsten Dunst), about to marry a young man so obliging in his eagerness to please her that he presents his bride with a photograph of the apple trees in whose shade she is supposed to be happy.

Happy? The prospect of a future frozen in a perpetual image also freezes Justine's desire; she falls out of her role and ruins the wildly expensive party organized by her sister, full of rituals designed to produce the effects of "happiness" demanded of the children of the society of abundance.

SOCIAL SYMPTOM Even if it did not have the merit of exposing the stupidity of the contemporary faith in "happiness effects" as the measure of all things, Trier's film would still be worthwhile for rehabilitating the figure of melancholy as an indicator of the social symptom.

For more than two millennia, the oscillations of melancholic sensibility questioned Western culture about the border that separates the madman from the genius. Since classical Antiquity, the melancholic, incapable of answering the "demand of the Other," denounced what was not going well in the social bond.

The crisis that leads Justine to wreck her romantic commitment, her wedding party, and her job in a single night is conducted by the director with didactic precision. A cruel remark from her mother (a perfect representation of the Freudian melancholic's mother), followed by her father's indifference, triggers in Justine a genuine crisis of faith. Suddenly the bride excludes herself from the scene in which she was supposed to be the main protagonist. She no longer believes. She falls from the imaginary net that sustains what we usually call reality, the collective fiction capable of endowing life with meaning and value.

Justine, incapable of looking at the world through the veil of fantasy that comforts others, "the so-called sane" (as in Pessoa's verse), sees what the scene conceals. She does not fear the arrival of Melancholia because she was never able to delude herself about the finitude of everything that exists. Justine "sees things." Arid is the life of one who sees too much because she does not know how to fantasize.

EXCEPTION Since Antiquity the melancholic has been understood, in the West, as one who occupies a place of exception in culture. Melancholic pathos was explained by Hippocrates and Galen on the basis of the theory of the four humors that regulate the workings of body and soul. The oscillations of black bile were said to make the melancholic an inconstant being, at once sickly and brilliant, driven to create in order to appease the swings of his temperament.

At the core of his reflection "The Man of Genius and Melancholy" (Problem XXX), Aristotle had already discerned an ethical question concerning the melancholic's emotional excesses and an aesthetic question concerning the creative genius. Hence the uncomfortable role that fell to him: to question the signifiers that sustain the imaginary of his age.

19TH CENTURY The tradition inaugurated by Aristotle ends with Baudelaire in the 19th century, the last of the Romantics and the first of the moderns, according to another brilliant melancholic, Walter Benjamin. To bear the highs and lows of their temperament and give some destination to their eccentricity, some melancholics devoted themselves to trying to understand their malady.

English classicism produced the most complete compendium on melancholy ever known, the life's work of the Oxford librarian Robert Burton (1577-1640).

His "The Anatomy of Melancholy," published in 1621 and reissued several times in the following decades, is a compendium of more than 1,400 pages containing everything that could then be known about its author's "disease." The press of the Universidade Federal do Paraná has just released in Brazil the first volume of "A Anatomia da Melancolia" [trans. Guilherme Gontijo Flores, 265 pages, price not yet set].

It is a pity that the first volume is limited to the author's long introduction to his readers. We hope that the Editora UFPR will soon publish a selection of the book's chapters, which begins with the causes of melancholy ("Delirium, frenzy, madness" [...] "Solitude and idleness" [...] "The force of the imagination"...), continues with a description of the palliatives meant to relieve the suffering ("joy, good company, beautiful objects...") and ends by addressing love melancholy and religious melancholy.

The author signed the work as Democritus Junior, affirming his identification with the philosopher who, according to Hippocrates' account, withdrew from the company of men and, faced with the vacuity of the world, used to laugh at everything. The melancholic's laughter is an expression of scorn at the illusions of others.

Burton's undertaking was only possible in an age when melancholy was understood not merely as a disease but as a phenomenon of culture. Aristotle's seminal text already contained a reflection on the melancholic's creative capacity, attributed to the instability that drives him to expand his soul in every direction of the universe.

FREUD Such a process of disidentification is also found in the Freudian diagnosis, which lacks, however, the counterpart of mimesis. Loosened from the imaginary net that binds him to himself and to the world, the contemporary melancholic can only face the Real with the aridity of the symbolic.

Something happened in modernity so that the melancholic's imaginary inconsistency no longer spurred him to reinvent the representations of the world and instead left him at the mercy of the Thing. The dish prepared for Justine tastes of ashes; invisible threads of wool keep her legs from walking. Faced with this horror, she prefers the collision with Melancholia.

Melancholy ceased to be understood as a maladjustment with respect to the norms of public life when Freud wrested the signifier from its traditional meaning in order to bring into the field of psychoanalysis the psychiatric diagnosis of what was then called manic-depressive psychosis, which medicine has today taken up again under the name of bipolar disorder.

Freud did not privatize melancholy by chance: psychoanalysis itself owes its existence to the emergence of the neurotic subject engendered within the web of the bourgeois family, closed in on itself and founded on commitments of love. Freudian psychoanalysis is contemporary with the completion of the individual as a subjective form and with the privatization of the tasks of socializing children.

Hence the Freudian melancholic bears no resemblance to his pre-modern colleagues: the valiant warrior exposed to shame before his peers (Ajax), the anchorite in a crisis of faith (Saint Anthony), the Renaissance thinker busy restoring order to a world in constant transformation (as in Dürer's engraving). Nor does he recall, at the dawn of modernity, the "flâneur" gathering the remains of a world in ruins through the streets of a great city (Baudelaire) so as to compose a poetic monument in the face of barbarism.

The Freudian melancholic is the baby repudiated by its mother, a poor ego turned into refuse upon which the shadow of a bad object has fallen. What was lost in the transition effected by psychoanalysis was the creative value attributed to the melancholic from Antiquity to Romanticism. Lost, too, was the value of the manic pole of what medicine today calls bipolar disorder.

Where the pre-modern melancholic, in his moments of euphoria, was given to expansions of the poetic imagination, today mania drives "bipolar" patients to burn through money on the credit card. Consumption is the act that expresses the current clients of psychopharmacology, cut off from the creative power that their maladjustment to the world might have conferred on them.

DEPRESSION Are there no longer melancholics like those of old? Let the neuroscientists say. Psychiatry and the pharmaceutical industry have already chosen its 21st-century substitute: in place of the signifier melancholy, depression installs itself as the great symptom of the malaise of third-millennium civilization. The more sophisticated the supply of antidepressants becomes, the more depression looms on the horizon as the privileged expression of that malaise, threatening societies devoted to ignoring the knowledge it contains.

Such an active production of ignorance about the meaning of melancholy lies at the center of Lars von Trier's parable. John, Justine's brother-in-law, affirms his faith in the world of commodities. He stocks the house with food, fuel, power generators. He trusts the scientific information circulated on the internet. He checks through his telescope the approach of the threatening planet.

His defense is so fragile that, faced with the inevitable, he kills himself with an overdose of his wife's pills. Claire, for her part, has great faith in the staging of life. The failure of her sister's spectacular wedding does not keep her from planning another small ritual, on the house's beautiful terrace, with music and wine, to await the arrival of Melancholia. An excellent ending for a Hollywood melodrama, which Justine dismisses with contempt.

Justine has no illusions about the end. Even so, to protect her nephew from the final horror, she proves capable of creating the most omnipotent of fantasies. With him she builds a fragile "magic" tent under which they shelter to await the explosion of light brought by the collision with Melancholia.

The triangle formed by three branches tied at the top does not quite create an illusion: they are like the strokes of a writing, like a signifier marking out, "in extremis," a human territory in the face of the Real.

The Responsibility of Intellectuals, Redux (Boston Review)

Boston Review – SEPTEMBER/OCTOBER 2011

Using Privilege to Challenge the State

Noam Chomsky

A San Francisco mural depicting Archbishop Óscar Romero / Photograph: Franco Folini

Since we often cannot see what is happening before our eyes, it is perhaps not too surprising that what is at a slight distance removed is utterly invisible. We have just witnessed an instructive example: President Obama’s dispatch of 79 commandos into Pakistan on May 1 to carry out what was evidently a planned assassination of the prime suspect in the terrorist atrocities of 9/11, Osama bin Laden. Though the target of the operation, unarmed and with no protection, could easily have been apprehended, he was simply murdered, his body dumped at sea without autopsy. The action was deemed “just and necessary” in the liberal press. There will be no trial, as there was in the case of Nazi criminals—a fact not overlooked by legal authorities abroad who approve of the operation but object to the procedure. As Elaine Scarry reminds us, the prohibition of assassination in international law traces back to a forceful denunciation of the practice by Abraham Lincoln, who condemned the call for assassination as “international outlawry” in 1863, an “outrage,” which “civilized nations” view with “horror” and merits the “sternest retaliation.”

In 1967, writing about the deceit and distortion surrounding the American invasion of Vietnam, I discussed the responsibility of intellectuals, borrowing the phrase from an important essay of Dwight Macdonald’s after World War II. With the tenth anniversary of 9/11 arriving, and widespread approval in the United States of the assassination of the chief suspect, it seems a fitting time to revisit that issue. But before thinking about the responsibility of intellectuals, it is worth clarifying to whom we are referring.

The concept of intellectuals in the modern sense gained prominence with the 1898 “Manifesto of the Intellectuals” produced by the Dreyfusards who, inspired by Emile Zola’s open letter of protest to France’s president, condemned both the framing of French artillery officer Alfred Dreyfus on charges of treason and the subsequent military cover-up. The Dreyfusards’ stance conveys the image of intellectuals as defenders of justice, confronting power with courage and integrity. But they were hardly seen that way at the time. A minority of the educated classes, the Dreyfusards were bitterly condemned in the mainstream of intellectual life, in particular by prominent figures among “the immortals of the strongly anti-Dreyfusard Académie Française,” Steven Lukes writes. To the novelist, politician, and anti-Dreyfusard leader Maurice Barrès, Dreyfusards were “anarchists of the lecture-platform.” To another of these immortals, Ferdinand Brunetière, the very word “intellectual” signified “one of the most ridiculous eccentricities of our time—I mean the pretension of raising writers, scientists, professors and philologists to the rank of supermen,” who dare to “treat our generals as idiots, our social institutions as absurd and our traditions as unhealthy.”

Who then were the intellectuals? The minority inspired by Zola (who was sentenced to jail for libel, and fled the country)? Or the immortals of the academy? The question resonates through the ages, in one or another form, and today offers a framework for determining the “responsibility of intellectuals.” The phrase is ambiguous: does it refer to intellectuals’ moral responsibility as decent human beings in a position to use their privilege and status to advance the causes of freedom, justice, mercy, peace, and other such sentimental concerns? Or does it refer to the role they are expected to play, serving, not derogating, leadership and established institutions?

• • •

One answer came during World War I, when prominent intellectuals on all sides lined up enthusiastically in support of their own states.

In their “Manifesto of 93 German Intellectuals,” leading figures in one of the world’s most enlightened states called on the West to “have faith in us! Believe, that we shall carry on this war to the end as a civilized nation, to whom the legacy of a Goethe, a Beethoven, and a Kant, is just as sacred as its own hearths and homes.” Their counterparts on the other side of the intellectual trenches matched them in enthusiasm for the noble cause, but went beyond in self-adulation. In The New Republic they proclaimed, “The effective and decisive work on behalf of the war has been accomplished by . . . a class which must be comprehensively but loosely described as the ‘intellectuals.’” These progressives believed they were ensuring that the United States entered the war “under the influence of a moral verdict reached, after the utmost deliberation by the more thoughtful members of the community.” They were, in fact, the victims of concoctions of the British Ministry of Information, which secretly sought “to direct the thought of most of the world,” but particularly the thought of American progressive intellectuals who might help to whip a pacifist country into war fever.

John Dewey was impressed by the great “psychological and educational lesson” of the war, which proved that human beings—more precisely, “the intelligent men of the community”—can “take hold of human affairs and manage them . . . deliberately and intelligently” to achieve the ends sought, admirable by definition.

Not everyone toed the line so obediently, of course. Notable figures such as Bertrand Russell, Eugene Debs, Rosa Luxemburg, and Karl Liebknecht were, like Zola, sentenced to prison. Debs was punished with particular severity—a ten-year prison term for raising questions about President Wilson’s “war for democracy and human rights.” Wilson refused him amnesty after the war ended, though Harding finally relented. Some, such as Thorstein Veblen, were chastised but treated less harshly; Veblen was fired from his position in the Food Administration after preparing a report showing that the shortage of farm labor could be overcome by ending Wilson’s brutal persecution of labor, specifically the Industrial Workers of the World. Randolph Bourne was dropped by the progressive journals after criticizing the “league of benevolently imperialistic nations” and their exalted endeavors.

The pattern of praise and punishment is a familiar one throughout history: those who line up in the service of the state are typically praised by the general intellectual community, and those who refuse to line up in service of the state are punished. Thus in retrospect Wilson and the progressive intellectuals who offered him their services are greatly honored, but not Debs. Luxemburg and Liebknecht were murdered and have hardly been heroes of the intellectual mainstream. Russell continued to be bitterly condemned until after his death—and in current biographies still is.

Since power tends to prevail, intellectuals who serve their governments are considered the responsible ones.

In the 1970s prominent scholars distinguished the two categories of intellectuals more explicitly. A 1975 study, The Crisis of Democracy, labeled Brunetière’s ridiculous eccentrics “value-oriented intellectuals” who pose a “challenge to democratic government which is, potentially at least, as serious as those posed in the past by aristocratic cliques, fascist movements, and communist parties.” Among other misdeeds, these dangerous creatures “devote themselves to the derogation of leadership, the challenging of authority,” and they challenge the institutions responsible for “the indoctrination of the young.” Some even sink to the depths of questioning the nobility of war aims, as Bourne had. This castigation of the miscreants who question authority and the established order was delivered by the scholars of the liberal internationalist Trilateral Commission; the Carter administration was largely drawn from their ranks.

Like The New Republic progressives during World War I, the authors of The Crisis of Democracy extend the concept of the “intellectual” beyond Brunetière’s ridiculous eccentrics to include the better sort as well: the “technocratic and policy-oriented intellectuals,” responsible and serious thinkers who devote themselves to the constructive work of shaping policy within established institutions and to ensuring that indoctrination of the young proceeds on course.

It took Dewey only a few years to shift from the responsible technocratic and policy-oriented intellectual of World War I to an anarchist of the lecture-platform, as he denounced the “un-free press” and questioned “how far genuine intellectual freedom and social responsibility are possible on any large scale under the existing economic regime.”

What particularly troubled the Trilateral scholars was the “excess of democracy” during the time of troubles, the 1960s, when normally passive and apathetic parts of the population entered the political arena to advance their concerns: minorities, women, the young, the old, working people . . . in short, the population, sometimes called the “special interests.” They are to be distinguished from those whom Adam Smith called the “masters of mankind,” who are “the principal architects” of government policy and pursue their “vile maxim”: “All for ourselves and nothing for other people.” The role of the masters in the political arena is not deplored, or discussed, in the Trilateral volume, presumably because the masters represent “the national interest,” like those who applauded themselves for leading the country to war “after the utmost deliberation by the more thoughtful members of the community” had reached its “moral verdict.”

To overcome the excessive burden imposed on the state by the special interests, the Trilateralists called for more “moderation in democracy,” a return to passivity on the part of the less deserving, perhaps even a return to the happy days when “Truman had been able to govern the country with the cooperation of a relatively small number of Wall Street lawyers and bankers,” and democracy therefore flourished.

The Trilateralists could well have claimed to be adhering to the original intent of the Constitution, “intrinsically an aristocratic document designed to check the democratic tendencies of the period” by delivering power to a “better sort” of people and barring “those who were not rich, well born, or prominent from exercising political power,” in the accurate words of the historian Gordon Wood. In Madison’s defense, however, we should recognize that his mentality was pre-capitalist. In determining that power should be in the hands of “the wealth of the nation,” “the more capable set of men,” he envisioned those men on the model of the “enlightened Statesmen” and “benevolent philosopher” of the imagined Roman world. They would be “pure and noble,” “men of intelligence, patriotism, property, and independent circumstances” “whose wisdom may best discern the true interest of their country, and whose patriotism and love of justice will be least likely to sacrifice it to temporary or partial considerations.” So endowed, these men would “refine and enlarge the public views,” guarding the public interest against the “mischiefs” of democratic majorities. In a similar vein, the progressive Wilsonian intellectuals might have taken comfort in the discoveries of the behavioral sciences, explained in 1939 by the psychologist and education theorist Edward Thorndike:

It is the great good fortune of mankind that there is a substantial correlation between intelligence and morality including good will toward one’s fellows . . . . Consequently our superiors in ability are on the average our benefactors, and it is often safer to trust our interests to them than to ourselves.

A comforting doctrine, though some might feel that Adam Smith had the sharper eye.

• • •

Since power tends to prevail, intellectuals who serve their governments are considered responsible, and value-oriented intellectuals are dismissed or denigrated. At home, that is.

With regard to enemies, the distinction between the two categories of intellectuals is retained, but with values reversed. In the old Soviet Union, the value-oriented intellectuals were the honored dissidents, while we had only contempt for the apparatchiks and commissars, the technocratic and policy-oriented intellectuals. Similarly in Iran we honor the courageous dissidents and condemn those who defend the clerical establishment. And elsewhere generally.

The honorable term “dissident” is used selectively. It does not, of course, apply, with its favorable connotations, to value-oriented intellectuals at home or to those who combat U.S.-supported tyranny abroad. Take the interesting case of Nelson Mandela, who was removed from the official terrorist list in 2008, and can now travel to the United States without special authorization.

Father Ignacio Ellacuría / Photograph: Gervasio Sánchez

Twenty years earlier, he was the criminal leader of one of the world’s “more notorious terrorist groups,” according to a Pentagon report. That is why President Reagan had to support the apartheid regime, increasing trade with South Africa in violation of congressional sanctions and supporting South Africa’s depredations in neighboring countries, which led, according to a UN study, to 1.5 million deaths. That was only one episode in the war on terrorism that Reagan declared to combat “the plague of the modern age,” or, as Secretary of State George Shultz had it, “a return to barbarism in the modern age.” We may add hundreds of thousands of corpses in Central America and tens of thousands more in the Middle East, among other achievements. Small wonder that the Great Communicator is worshipped by Hoover Institution scholars as a colossus whose “spirit seems to stride the country, watching us like a warm and friendly ghost,” recently honored further by a statue that defaces the American Embassy in London.

What particularly troubled the Trilateral scholars was the ‘excess of democracy’ in the 1960s.

The Latin American case is revealing. Those who called for freedom and justice in Latin America are not admitted to the pantheon of honored dissidents. For example, a week after the fall of the Berlin Wall, six leading Latin American intellectuals, all Jesuit priests, had their heads blown off on the direct orders of the Salvadoran high command. The perpetrators were from an elite battalion armed and trained by Washington that had already left a gruesome trail of blood and terror, and had just returned from renewed training at the John F. Kennedy Special Warfare Center and School at Fort Bragg, North Carolina. The murdered priests are not commemorated as honored dissidents, nor are others like them throughout the hemisphere. Honored dissidents are those who called for freedom in enemy domains in Eastern Europe, who certainly suffered, but not remotely like their counterparts in Latin America.

The distinction is worth examination, and tells us a lot about the two senses of the phrase “responsibility of intellectuals,” and about ourselves. It is not seriously in question, as John Coatsworth writes in the recently published Cambridge University History of the Cold War, that from 1960 to “the Soviet collapse in 1990, the numbers of political prisoners, torture victims, and executions of nonviolent political dissenters in Latin America vastly exceeded those in the Soviet Union and its East European satellites.” Among the executed were many religious martyrs, and there were mass slaughters as well, consistently supported or initiated by Washington.

Why then the distinction? It might be argued that what happened in Eastern Europe is far more momentous than the fate of the South at our hands. It would be interesting to see the argument spelled out. And also to see the argument explaining why we should disregard elementary moral principles, among them that if we are serious about suffering and atrocities, about justice and rights, we will focus our efforts on where we can do the most good—typically, where we share responsibility for what is being done. We have no difficulty demanding that our enemies follow such principles.

Few of us care, or should, what Andrei Sakharov or Shirin Ebadi say about U.S. or Israeli crimes; we admire them for what they say and do about those of their own states, and the conclusion holds far more strongly for those who live in more free and democratic societies, and therefore have far greater opportunities to act effectively. It is of some interest that in the most respected circles, practice is virtually the opposite of what elementary moral values dictate.

But let us conform and keep only to the matter of historical import.

The U.S. wars in Latin America from 1960 to 1990, quite apart from their horrors, have long-term historical significance. To consider just one important aspect, in no small measure they were wars against the Church, undertaken to crush a terrible heresy proclaimed at Vatican II in 1962, which, under the leadership of Pope John XXIII, “ushered in a new era in the history of the Catholic Church,” in the words of the distinguished theologian Hans Küng, restoring the teachings of the gospels that had been put to rest in the fourth century when the Emperor Constantine established Christianity as the religion of the Roman Empire, instituting “a revolution” that converted “the persecuted church” to a “persecuting church.” The heresy of Vatican II was taken up by Latin American bishops who adopted the “preferential option for the poor.” Priests, nuns, and laypersons then brought the radical pacifist message of the gospels to the poor, helping them organize to ameliorate their bitter fate in the domains of U.S. power.

That same year, 1962, President Kennedy made several critical decisions. One was to shift the mission of the militaries of Latin America from “hemispheric defense”—an anachronism from World War II—to “internal security,” in effect, war against the domestic population, if they raise their heads. Charles Maechling, who led U.S. counterinsurgency and internal defense planning from 1961 to 1966, describes the unsurprising consequences of the 1962 decision as a shift from toleration “of the rapacity and cruelty of the Latin American military” to “direct complicity” in their crimes to U.S. support for “the methods of Heinrich Himmler’s extermination squads.” One major initiative was a military coup in Brazil, planned in Washington and implemented shortly after Kennedy’s assassination, instituting a murderous and brutal national security state. The plague of repression then spread through the hemisphere, including the 1973 coup installing the Pinochet dictatorship, and later the most vicious of all, the Argentine dictatorship, Reagan’s favorite. Central America’s turn—not for the first time—came in the 1980s under the leadership of the “warm and friendly ghost” who is now revered for his achievements.

The murder of the Jesuit intellectuals as the Berlin wall fell was a final blow in defeating the heresy, culminating a decade of horror in El Salvador that opened with the assassination, by much the same hands, of Archbishop Óscar Romero, the “voice for the voiceless.” The victors in the war against the Church declare their responsibility with pride. The School of the Americas (since renamed), famous for its training of Latin American killers, announces as one of its “talking points” that the liberation theology that was initiated at Vatican II was “defeated with the assistance of the US army.”

Actually, the November 1989 assassinations were almost a final blow. More was needed.

A year later Haiti had its first free election, and to the surprise and shock of Washington, which like others had anticipated the easy victory of its own candidate from the privileged elite, the organized public in the slums and hills elected Jean-Bertrand Aristide, a popular priest committed to liberation theology. The United States at once moved to undermine the elected government, and after the military coup that overthrew it a few months later, lent substantial support to the vicious military junta and its elite supporters. Trade was increased in violation of international sanctions and increased further under Clinton, who also authorized the Texaco oil company to supply the murderous rulers, in defiance of his own directives.

I will skip the disgraceful aftermath, amply reviewed elsewhere, except to point out that in 2004, the two traditional torturers of Haiti, France and the United States, joined by Canada, forcefully intervened, kidnapped President Aristide (who had been elected again), and shipped him off to central Africa. He and his party were effectively barred from the farcical 2010–11 elections, the most recent episode in a horrendous history that goes back hundreds of years and is barely known among the perpetrators of the crimes, who prefer tales of dedicated efforts to save the suffering people from their grim fate.

If we are serious about justice, we will focus our efforts where we share responsibility for what is being done.

Another fateful Kennedy decision in 1962 was to send a special forces mission to Colombia, led by General William Yarborough, who advised the Colombian security forces to undertake “paramilitary, sabotage and/or terrorist activities against known communist proponents,” activities that “should be backed by the United States.” The meaning of the phrase “communist proponents” was spelled out by the respected president of the Colombian Permanent Committee for Human Rights, former Minister of Foreign Affairs Alfredo Vázquez Carrizosa, who wrote that the Kennedy administration “took great pains to transform our regular armies into counterinsurgency brigades, accepting the new strategy of the death squads,” ushering in

what is known in Latin America as the National Security Doctrine. . . . [not] defense against an external enemy, but a way to make the military establishment the masters of the game . . . [with] the right to combat the internal enemy, as set forth in the Brazilian doctrine, the Argentine doctrine, the Uruguayan doctrine, and the Colombian doctrine: it is the right to fight and to exterminate social workers, trade unionists, men and women who are not supportive of the establishment, and who are assumed to be communist extremists. And this could mean anyone, including human rights activists such as myself.

In a 1980 study, Lars Schoultz, the leading U.S. academic specialist on human rights in Latin America, found that U.S. aid “has tended to flow disproportionately to Latin American governments which torture their citizens . . . to the hemisphere’s relatively egregious violators of fundamental human rights.” That included military aid, was independent of need, and continued through the Carter years. Ever since the Reagan administration, it has been superfluous to carry out such a study. In the 1980s one of the most notorious violators was El Salvador, which accordingly became the leading recipient of U.S. military aid, to be replaced by Colombia when it took the lead as the worst violator of human rights in the hemisphere. Vázquez Carrizosa himself was living under heavy guard in his Bogotá residence when I visited him there in 2002 as part of a mission of Amnesty International, which was opening its year-long campaign to protect human rights defenders in Colombia because of the country’s horrifying record of attacks against human rights and labor activists, and mostly the usual victims of state terror: the poor and defenseless. Terror and torture in Colombia were supplemented by chemical warfare (“fumigation”), under the pretext of the war on drugs, leading to huge flight to urban slums and misery for the survivors. Colombia’s attorney general’s office now estimates that more than 140,000 people have been killed by paramilitaries, often acting in close collaboration with the U.S.-funded military.

Signs of the slaughter are everywhere. On a nearly impassible dirt road to a remote village in southern Colombia a year ago, my companions and I passed a small clearing with many simple crosses marking the graves of victims of a paramilitary attack on a local bus. Reports of the killings are graphic enough; spending a little time with the survivors, who are among the kindest and most compassionate people I have ever had the privilege of meeting, makes the picture more vivid, and only more painful.

This is the briefest sketch of terrible crimes for which Americans bear substantial culpability, and that we could easily ameliorate, at the very least.

But it is more gratifying to bask in praise for courageously protesting the abuses of official enemies, a fine activity, but not the priority of a value-oriented intellectual who takes the responsibilities of that stance seriously.

The victims within our domains, unlike those in enemy states, are not merely ignored and quickly forgotten, but are also cynically insulted. One striking illustration came a few weeks after the murder of the Latin American intellectuals in El Salvador. Vaclav Havel visited Washington and addressed a joint session of Congress. Before his enraptured audience, Havel lauded the “defenders of freedom” in Washington who “understood the responsibility that flowed from” being “the most powerful nation on earth”—crucially, their responsibility for the brutal assassination of his Salvadoran counterparts shortly before.

The liberal intellectual class was enthralled by his presentation. Havel reminds us that “we live in a romantic age,” Anthony Lewis gushed. Other prominent liberal commentators reveled in Havel’s “idealism, his irony, his humanity,” as he “preached a difficult doctrine of individual responsibility” while Congress “obviously ached with respect” for his genius and integrity; and asked why America lacks intellectuals so profound, who “elevate morality over self-interest” in this way, praising us for the tortured and mutilated corpses that litter the countries that we have left in misery. We need not tarry on what the reaction would have been had Father Ellacuría, the most prominent of the murdered Jesuit intellectuals, spoken such words at the Duma after elite forces armed and trained by the Soviet Union assassinated Havel and half a dozen of his associates—a performance that is inconceivable.

John Dewey / Photograph: New York Public Library / Photoresearchers, Inc.

The assassination of bin Laden, too, directs our attention to our insulted victims. There is much more to say about the operation—including Washington’s willingness to face a serious risk of major war and even leakage of fissile materials to jihadis, as I have discussed elsewhere—but let us keep to the choice of name: Operation Geronimo. The name caused outrage in Mexico and was protested by indigenous groups in the United States, but there seems to have been no further notice of the fact that Obama was identifying bin Laden with the Apache Indian chief. Geronimo led the courageous resistance to invaders who sought to consign his people to the fate of “that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty, among the heinous sins of this nation, for which I believe God will one day bring [it] to judgement,” in the words of the grand strategist John Quincy Adams, the intellectual architect of manifest destiny, uttered long after his own contributions to these sins. The casual choice of the name is reminiscent of the ease with which we name our murder weapons after victims of our crimes: Apache, Blackhawk, Cheyenne . . . We might react differently if the Luftwaffe were to call its fighter planes “Jew” and “Gypsy.”

The first 9/11, unlike the second, did not change the world. It was ‘nothing of very great consequence,’ Kissinger said.

Denial of these “heinous sins” is sometimes explicit. To mention a few recent cases, two years ago in one of the world’s leading left-liberal intellectual journals, The New York Review of Books, Russell Baker outlined what he learned from the work of the “heroic historian” Edmund Morgan: namely, that when Columbus and the early explorers arrived they “found a continental vastness sparsely populated by farming and hunting people . . . . In the limitless and unspoiled world stretching from tropical jungle to the frozen north, there may have been scarcely more than a million inhabitants.” The calculation is off by many tens of millions, and the “vastness” included advanced civilizations throughout the continent. No reactions appeared, though four months later the editors issued a correction, noting that in North America there may have been as many as 18 million people—and, unmentioned, tens of millions more “from tropical jungle to the frozen north.” This was all well known decades ago—including the advanced civilizations and the “merciless and perfidious cruelty” of the “extermination”—but not important enough even for a casual phrase. In London Review of Books a year later, the noted historian Mark Mazower mentioned American “mistreatment of the Native Americans,” again eliciting no comment. Would we accept the word “mistreatment” for comparable crimes committed by enemies?

• • •

If the responsibility of intellectuals refers to their moral responsibility as decent human beings in a position to use their privilege and status to advance the cause of freedom, justice, mercy, and peace—and to speak out not simply about the abuses of our enemies, but, far more significantly, about the crimes in which we are implicated and can ameliorate or terminate if we choose—how should we think of 9/11?

The notion that 9/11 “changed the world” is widely held, understandably. The events of that day certainly had major consequences, domestic and international. One was to lead President Bush to re-declare Ronald Reagan’s war on terrorism—the first one has been effectively “disappeared,” to borrow the phrase of our favorite Latin American killers and torturers, presumably because the consequences do not fit well with preferred self images. Another consequence was the invasion of Afghanistan, then Iraq, and more recently military interventions in several other countries in the region and regular threats of an attack on Iran (“all options are open,” in the standard phrase). The costs, in every dimension, have been enormous. That suggests a rather obvious question, not asked for the first time: was there an alternative?

A number of analysts have observed that bin Laden won major successes in his war against the United States. “He repeatedly asserted that the only way to drive the U.S. from the Muslim world and defeat its satraps was by drawing Americans into a series of small but expensive wars that would ultimately bankrupt them,” the journalist Eric Margolis writes.

The United States, first under George W. Bush and then Barack Obama, rushed right into bin Laden’s trap. . . . Grotesquely overblown military outlays and debt addiction . . . . may be the most pernicious legacy of the man who thought he could defeat the United States.

A report from the Costs of War project at Brown University’s Watson Institute for International Studies estimates that the final bill will be $3.2–4 trillion. Quite an impressive achievement by bin Laden.

That Washington was intent on rushing into bin Laden’s trap was evident at once. Michael Scheuer, the senior CIA analyst responsible for tracking bin Laden from 1996 to 1999, writes, “Bin Laden has been precise in telling America the reasons he is waging war on us.” The al Qaeda leader, Scheuer continues, “is out to drastically alter U.S. and Western policies toward the Islamic world.”

And, as Scheuer explains, bin Laden largely succeeded: “U.S. forces and policies are completing the radicalization of the Islamic world, something Osama bin Laden has been trying to do with substantial but incomplete success since the early 1990s. As a result, I think it is fair to conclude that the United States of America remains bin Laden’s only indispensable ally.” And arguably remains so, even after his death.

There is good reason to believe that the jihadi movement could have been split and undermined after the 9/11 attack, which was criticized harshly within the movement. Furthermore, the “crime against humanity,” as it was rightly called, could have been approached as a crime, with an international operation to apprehend the likely suspects. That was recognized in the immediate aftermath of the attack, but no such idea was even considered by decision-makers in government. It seems no thought was given to the Taliban’s tentative offer—how serious an offer, we cannot know—to present the al Qaeda leaders for a judicial proceeding.

At the time, I quoted Robert Fisk’s conclusion that the horrendous crime of 9/11 was committed with “wickedness and awesome cruelty”—an accurate judgment. The crimes could have been even worse. Suppose that Flight 93, downed by courageous passengers in Pennsylvania, had bombed the White House, killing the president. Suppose that the perpetrators of the crime planned to, and did, impose a military dictatorship that killed thousands and tortured tens of thousands. Suppose the new dictatorship established, with the support of the criminals, an international terror center that helped impose similar torture-and-terror states elsewhere, and, as icing on the cake, brought in a team of economists—call them “the Kandahar boys”—who quickly drove the economy into one of the worst depressions in its history. That, plainly, would have been a lot worse than 9/11.

As we all should know, this is not a thought experiment. It happened. I am, of course, referring to what in Latin America is often called “the first 9/11”: September 11, 1973, when the United States succeeded in its intensive efforts to overthrow the democratic government of Salvador Allende in Chile with a military coup that placed General Pinochet’s ghastly regime in office. The dictatorship then installed the Chicago Boys—economists trained at the University of Chicago—to reshape Chile’s economy. Consider the economic destruction, the torture and kidnappings, and multiply the numbers killed by 25 to yield per capita equivalents, and you will see just how much more devastating the first 9/11 was.

Privilege yields opportunity, and opportunity confers responsibilities.

The goal of the overthrow, in the words of the Nixon administration, was to kill the “virus” that might encourage all those “foreigners [who] are out to screw us”—screw us by trying to take over their own resources and more generally to pursue a policy of independent development along lines disliked by Washington. In the background was the conclusion of Nixon’s National Security Council that if the United States could not control Latin America, it could not expect “to achieve a successful order elsewhere in the world.” Washington’s “credibility” would be undermined, as Henry Kissinger put it.

The first 9/11, unlike the second, did not change the world. It was “nothing of very great consequence,” Kissinger assured his boss a few days later. And judging by how it figures in conventional history, his words can hardly be faulted, though the survivors may see the matter differently.

These events of little consequence were not limited to the military coup that destroyed Chilean democracy and set in motion the horror story that followed. As already discussed, the first 9/11 was just one act in the drama that began in 1962 when Kennedy shifted the mission of the Latin American militaries to “internal security.” The shattering aftermath is also of little consequence, the familiar pattern when history is guarded by responsible intellectuals.

• • •

It seems to be close to a historical universal that conformist intellectuals, the ones who support official aims and ignore or rationalize official crimes, are honored and privileged in their own societies, and the value-oriented punished in one or another way. The pattern goes back to the earliest records. It was the man accused of corrupting the youth of Athens who drank the hemlock, much as Dreyfusards were accused of “corrupting souls, and, in due course, society as a whole” and the value-oriented intellectuals of the 1960s were charged with interference with “indoctrination of the young.”

In the Hebrew scriptures there are figures who by contemporary standards are dissident intellectuals, called “prophets” in the English translation. They bitterly angered the establishment with their critical geopolitical analysis, their condemnation of the crimes of the powerful, their calls for justice and concern for the poor and suffering. King Ahab, the most evil of the kings, denounced the Prophet Elijah as a hater of Israel, the first “self-hating Jew” or “anti-American” in the modern counterparts. The prophets were treated harshly, unlike the flatterers at the court, who were later condemned as false prophets. The pattern is understandable. It would be surprising if it were otherwise.

As for the responsibility of intellectuals, there does not seem to me to be much to say beyond some simple truths. Intellectuals are typically privileged—merely an observation about usage of the term. Privilege yields opportunity, and opportunity confers responsibilities. An individual then has choices.

The telenovela has missed the boat of history (Fapesp)

HUMANITIES | COMMUNICATION

The genre's status as a privileged forum for debating national questions is in decline
Carlos Haag
Print edition – August 2011
© PHOTOGRAPHS TADEU VILANI

In 1981, during a serious political crisis in the Figueiredo government, the all-powerful Golbery do Couto e Silva resigned from his post. To journalists he explained: "Don't ask me anything. I have just come out of Sucupira." The reference to the fictional town of the telenovela O bem-amado (1973) and of the Dias Gomes miniseries of the same name (1980-1984), at such a delicate moment, reveals the power telenovelas then held as a representation of national reality, and how Brazilians recognized themselves in those representations. "Working from conflicts of gender, generation, class, and religion, the telenovela produced chronicles of everyday life that turned it into a privileged stage for interpreting Brazil. The country, modernizing in a context centered on consumption rather than on the affirmation of citizenship, recognized itself on the TV screen in a white, glamorous universe," explains Esther Hamburger, professor in the Department of Film, Radio and Television at the Universidade de São Paulo (USP) and author of the study O Brasil antenado (Jorge Zahar Editor). She analyzed the genre's new directions in the research project Formação do campo intelectual e da indústria cultural no Brasil contemporâneo, supported by FAPESP and coordinated by the USP sociologist Sérgio Miceli. Besides Esther, the project brings together other researchers from a range of fields and topics.

"In the Brazil that was democratizing, the telenovela was the first to take up issues that would set the political agenda in the following decade. Today, however, it has lost its privileged status as a place for problematizing national questions. It cannot mobilize public opinion, it is no longer fully national, nor is it the country's showcase. It is probably no longer capable of synthesizing the country," the researcher warns. "After all, that centralized country, open to a hegemonic representation, no longer exists. New media such as cable TV and the internet have stripped the telenovela of its character as an arena of debate. Society has changed and there is great diversification. Literacy has risen and TV is no longer the only place to find information," she observes. For Esther, in today's country it is no longer possible for a telenovela to speak to the entire nation. "There is no longer one Brazil on TV, but several," she concludes.

Decline – "The telenovela remains strategic for revenue and for competition among the TV networks, but its capacity to polarize national audiences is falling. The genre overuses messages of social content while losing its aesthetic distinctiveness and its polemical force. The nation is no longer the central theme, because themes now cross borders. There are fewer and fewer references to current and controversial issues. The choice is for politically correct campaigns, often to the detriment of the dramaturgy, tying down the authors' creativity," says Esther. According to the researcher, the structure of melodramatic conflicts that sustains the narrative still holds, but in stories that once again confine themselves to spaces imagined as feminine, the original audience of the national telenovela's early days, and of lesser cultural value. The genre also no longer attracts as much creative talent, with weak scripts and repetitive plots that insist on old clichés and conventions that worked in the past. "Even so, it cannot be denied that the telenovela may yet regain the political and cultural impact it once had, influencing behavior and fashion. It is still a place where something can be learned, especially by the new predominant audience, below classes A and B," she says.

From its heyday to the recent crisis of falling ratings was a long road. In the beginning the "fantasy" style reigned, full of sentimentality, in 1960s productions such as the exotic Sheik de Agadir, a paradigm broken by the realism of Beto Rockfeller, a representation of the contemporary life of the emerging middle classes. In the 1970s the limits of heavy melodrama were broken, but the telenovelas became showcases of modern life: fashion and behavior. "During the dictatorship, Globo adopted the official discourse, but understood that in the telenovelas, instead of hiding the country's problems, it was better to incorporate them into the plots, as it did in O bem-amado. That was the beginning of a growing critique of the modernization process," recalls Mauro Porto, professor at Tulane University and author of the study Telenovelas and national identity in Brazil. Realism took over the genre: a 1988 survey found that 58% of respondents wanted to see "reality" in the telenovelas and 60% wanted the plots to deal with politics. "The authors, from a left-wing generation, saw themselves as responsible for a national project and for raising popular consciousness," notes Porto. "The telenovelas recorded the dramas of urbanization, of social differences, of the fragmentation of the family, of the liberalization of marital relations and of consumption patterns. They reached their peak when they spoke of the problems of modernization, as in Vale tudo (1988) and Roque Santeiro (1985)," says Esther. But TV Manchete brought an alternative reading of the country with Pantanal, full of the exotic and the erotic, which broke the political cycle of the telenovelas, including at Globo, which found itself forced to emulate the new concept. "The 'Pantanal effect,' however, left no heirs and is forgotten today."

Intimacy – "Along that path, the telenovela created a shared repertoire through which people of different social classes, generations, sexes, races, and regions recognized one another, an 'imagined community' for problematizing Brazil, for intimacy with social problems, an ideal vehicle for building citizenship, a narrative of the nation," analyzes Maria Immacolata Lopes, professor at the Escola de Comunicações e Artes (ECA-USP) and coordinator of the Núcleo de Pesquisa de Telenovelas. The model wore out and the country changed. "Between 1970 and 1980 there was a kind of magic between audience and telenovela. In Vale tudo, corruption was shown for the first time in a non-political public space, and the telenovelas were in the vanguard," notes Esther. "Today corruption is banal, no longer controversial; it brings only the tedium of repetition. In 1988 it was a novelty; in 2011 it is worn out." The telenovelas are no longer in tune with the country. "Even contemporary foreign academic literature on television no longer discusses the Brazilian telenovela, and the Brazilian 'case' has lost ground at home and abroad in the face of a renewal of international television fiction, especially the American series, which are gaining space on national channels, a new flow of imported programming that the telenovelas had replaced in earlier decades," she explains. Today's sitcoms, unlike those of the past, which were "closed works" without improvisation, are open to success indicators and can change course while on the air, alluding to political and cultural elements of American reality and problematizing the United States.

"We no longer have the same national audience across all classes and places. Everything has become more popular, and the telenovelas cater to this viewing public with social merchandising, sex, plot dynamics that change all the time, action, murders," she says. For the researcher, this break in the dramaturgy narrows the audience even further by driving away the interest of a large part of it. Esther cites new alternatives such as Cordel encantado, which harks back to the fantasy telenovelas. There is also the search for new authors and directors, or the remake of old hits, such as O astro, to recover successful formulas of the past, but even adapted they retain the flavor of "something old." "We do not know whether Brazilians still want realism, but they have certainly tired of urban telenovelas set on the Rio-São Paulo axis. They would like to get to know new realities and the regional dimension that used to be scorned or caricatured." Renewal is not easy, as the failure of experiments such as Cidade de Deus or Antonia shows. "One solution would be to show the violence of the cities, of drug trafficking, but that is still taboo in the telenovelas. Cinema proved more attuned by showing the parallel powers of the urban peripheries, as in Tropa de elite. Or Dois filhos de Francisco, a film that presents a Brazil in which the humble get ahead." The telenovela, for the first time, has missed the boat of history. In a recent scandal, a political columnist reached not for a telenovela quotation, as Golbery did, but for the catchphrase of the film Tropa de elite: "Palocci, pede pra sair!" ("Palocci, ask to leave!").