Tag archive: History

Opinion | Why We Are Not Facing the Prospect of a Second Civil War (New York Times)

nytimes.com


Jamelle Bouie

Feb. 15, 2022

At the Georgia State Capitol, demonstrating against the inauguration of President Biden on Jan. 20, 2021.
Credit: Joshua Rashaad McFadden for The New York Times

It has not been uncommon, in recent years, to hear Americans worry about the advent of a new civil war.

“Is a Civil War Ahead?” The New Yorker asked last month. “Is America heading to civil war or secession?” CNN wondered on the anniversary of the Jan. 6 attack on the Capitol. Last week, Representative Adam Kinzinger of Illinois told “The View” that “we have to recognize” the possibility of a civil war. “I don’t think it’s too far of a bridge to think that’s a possibility,” he said.

This isn’t just the media or the political class; it’s public opinion too. In a 2019 survey for Georgetown University’s Institute of Politics and Public Service, the average respondent said that the United States was two-thirds of the way toward the “edge of a civil war.” In a recent poll conducted by the Institute of Politics at Harvard, 35 percent of voting-age Americans under 30 placed the odds of a second civil war at 50 percent or higher.

And in a result that says something about the divisions at hand, 52 percent of Trump voters and 41 percent of Biden voters said that they at least “somewhat agree” that it’s time to split the country, with either red or blue states leaving the union and forming their own country, according to a survey conducted by the Center for Politics at the University of Virginia (where I am a visiting scholar).

Several related forces are fueling this anxiety, from deepening partisan polarization and our winner-take-all politics to our sharp division across lines of identity, culture and geography. There is the fact that this country is saturated with guns, as well as the reality that many Americans fear demographic change to the point that they’re willing to do pretty much anything to stop it. There is also the issue of Donald Trump, his strongest supporters and their effort to overturn the results of the 2020 presidential election. Americans feel farther apart than at any point in recent memory, and as a result, many Americans fear the prospect of organized political violence well beyond what we saw on Jan. 6, 2021.

There is, however, a serious problem with this narrative: The Civil War we fought in the 19th century was not sparked by division qua division.

White Americans had been divided over slavery for 50 years before the crisis that led to war in 1861. The Missouri crisis of 1820, the nullification crisis of 1832, the conflict over the 1846 war with Mexico and the Compromise of 1850 all reflect the degree to which American politics rested on a sectional divide over the future of the slave system.

What made the 1850s different was the extent to which that division threatened the political economy of slavery. At the start of the decade, the historian Matthew Karp writes in “This Vast Southern Empire: Slaveholders at the Helm of American Foreign Policy,” “slaveholding power and slaveholding confidence seemed at their zenith,” the result of “spiking global demand for cotton” and the “dependence of the entire industrial world on a commodity that only American slaves could produce with profit.”

But with power came backlash. “Over the course of the decade,” Karp notes, “slavery was prohibited in the Pacific states, came under attack in Kansas and appeared unable to attach itself to any of the great open spaces of the new Southwest.” The growth of an avowedly antislavery public in the North wasn’t just a challenge to the political influence of the slaveholding South; it also threatened to undermine the slave economy itself and thus the economic basis for Southern power.

Plantation agriculture rapidly exhausted the soil. The sectional balance of Congress aside, planters needed new land to grow the cotton that secured their influence on the national (and international) stage. As Karp explains, “Slaveholders in the 1850s seldom passed up an opportunity to sketch the inexorable syllogism of King Cotton: The American South produced nearly all the world’s usable raw cotton; this cotton fueled the industrial development of the North Atlantic; therefore, the advanced economies of France, the northern United States and Great Britain were ruled, in effect, by Southern planters.” The backlash to slavery — the effort to restrain its growth and contain its spread — was an existential threat to the Southern elite.

It was the realization of that threat with the election of Abraham Lincoln — whose Republican Party was founded to stop the spread of slavery and who inherited a federal state with the power to do so — that pushed Southern elites to gamble their future on secession. They would leave the union and attempt to forge a slave empire on their own.

The point of this compact history, with regard to the present, is that it is irresponsible to talk about civil war as a function of polarization or division or rival ideologies. If those things matter, and they do, it is in how they both reflect and shape the objective interests of the people and factions in dispute.

Which is to say that if you’re worried about a second civil war, the question to ask isn’t whether people hate each other — they always have, and we tend to grossly exaggerate the extent of this country’s political and cultural unity over time — but whether that hate results from the irreconcilable social and economic interests of opposing groups within the society. If it must be one way or the other, then you might have a conflict on your hands.

That’s where America was with slavery. That’s why the run-up to our actual Civil War has been called “the impending crisis.” I’m not sure there’s anything in American society right now that plays the same role that the conflict over slavery did. Whatever our current challenges, whatever our current divisions, I do not think the United States is where it was in 1860. We have enough problems ahead of us already without having to worry about war breaking out here.

Cherry blossoms bloom earliest in 1,200 years in Japan (Folha de S.Paulo)

f5.folha.uol.com.br

Kazuhiro Nogi – Mar. 24, 2021/AFP


São Paulo

The blossoming of the famous white and pink cherry trees draws thousands into Japan’s streets and parks to watch a phenomenon that lasts only a few days and has been revered for more than a thousand years. This year, however, the early bloom has scientists worried, because it points to the impact of climate change.

According to records kept by Osaka Prefecture University, in 2021 the famous white and pink cherry trees reached full bloom in Kyoto on March 26, the earliest date in 12 centuries. The earliest blooms previously on record fell on March 27, in the years 1612, 1409 and 1236.

The institution was able to identify how unusually early this year’s bloom was because it maintains a complete database of flowering records going back centuries. The records begin in the year 812 and include court documents from imperial Kyoto, Japan’s ancient capital, as well as medieval diaries.

Yasuyuki Aono, the professor of environmental science at Osaka Prefecture University responsible for compiling the database, told Reuters that the phenomenon usually occurs in April, but that as temperatures rise, flowering begins earlier and earlier.

“Cherry blossoms are very sensitive to temperature. Flowering and full bloom can come earlier or later depending solely on the temperature. Temperatures were low in the 1820s, but they have risen by about 3.5 degrees Celsius since then,” he said.

According to Aono, this year’s seasons in particular influenced the flowering dates. The winter was very cold, but spring came fast and exceptionally warm, so “the buds are completely awake after a sufficient rest.”

In the capital, Tokyo, the cherry trees reached peak bloom on March 22, the second-earliest date ever recorded. “As global temperatures rise, the last spring frosts are occurring earlier and flowering is occurring earlier,” Lewis Ziska of Columbia University told CNN.

The Japan Meteorological Agency also tracks 58 “benchmark” cherry trees across the country. This year, 40 of them have already reached peak bloom, and 14 did so in record-early time. The trees normally bloom for about two weeks each year. “We can say it is more likely because of the impact of global warming,” said Shunji Anbe, an official in the agency’s observations division.

Data released in January by the World Meteorological Organization show that global temperatures in 2020 were among the highest ever recorded, rivaling 2016 as the hottest year on record.

Cherry blossoms have deep historical and cultural roots in Japan, heralding spring and inspiring artists and poets through the centuries. Their fragility is seen as a symbol of life, death and rebirth.

Today, people gather under the cherry blossoms every spring for hanami (flower-viewing) parties, strolling through parks, picnicking beneath the branches and taking plenty of selfies. This year, though, the cherry blossoms came and went in the blink of an eye.

With the end of the Covid-19 state of emergency in all regions of Japan, many people crowded popular viewing spots over the weekend, although in smaller numbers than in normal years.

He Wants to Save Classics From Whiteness. Can the Field Survive? (The New York Times Magazine)

Dan-el Padilla Peralta thinks classicists should knock ancient Greece and Rome off their pedestal — even if that means destroying their discipline.

Padilla at Princeton in January. Credit: D’Angelo Lovell Williams for The New York Times

By Rachel Poser

Feb. 2, 2021

In the world of classics, the exchange between Dan-el Padilla Peralta and Mary Frances Williams has become known simply as “the incident.” Their back-and-forth took place at a Society of Classical Studies conference in January 2019 — the sort of academic gathering at which nothing tends to happen that would seem controversial or even interesting to those outside the discipline. But that year, the conference featured a panel on “The Future of Classics,” which, the participants agreed, was far from secure. On top of the problems facing the humanities as a whole — vanishing class sizes caused by disinvestment, declining prominence and student debt — classics was also experiencing a crisis of identity. Long revered as the foundation of “Western civilization,” the field was trying to shed its self-imposed reputation as an elitist subject overwhelmingly taught and studied by white men. Recently the effort had gained a new sense of urgency: Classics had been embraced by the far right, whose members held up the ancient Greeks and Romans as the originators of so-called white culture. Marchers in Charlottesville, Va., carried flags bearing a symbol of the Roman state; online reactionaries adopted classical pseudonyms; the white-supremacist website Stormfront displayed an image of the Parthenon alongside the tagline “Every month is white history month.”

Padilla, a leading historian of Rome who teaches at Princeton and was born in the Dominican Republic, was one of the panelists that day. For several years, he has been speaking openly about the harm caused by practitioners of classics in the two millenniums since antiquity: the classical justifications of slavery, race science, colonialism, Nazism and other 20th-century fascisms. Classics was a discipline around which the modern Western university grew, and Padilla believes that it has sown racism through the entirety of higher education. Last summer, after Princeton decided to remove Woodrow Wilson’s name from its School of Public and International Affairs, Padilla was a co-author of an open letter that pushed the university to do more. “We call upon the university to amplify its commitment to Black people,” it read, “and to become, for the first time in its history, an anti-racist institution.” Surveying the damage done by people who lay claim to the classical tradition, Padilla argues, one can only conclude that classics has been instrumental to the invention of “whiteness” and its continued domination.

In recent years, like-minded classicists have come together to dispel harmful myths about antiquity. On social media and in journal articles and blog posts, they have clarified that contrary to right-wing propaganda, the Greeks and Romans did not consider themselves “white,” and their marble sculptures, whose pale flesh has been fetishized since the 18th century, would often have been painted in antiquity. They have noted that in fifth-century-B.C. Athens, which has been celebrated as the birthplace of democracy, participation in politics was restricted to male citizens; thousands of enslaved people worked and died in silver mines south of the city, and custom dictated that upper-class women could not leave the house unless they were veiled and accompanied by a male relative. They have shown that the concept of Western civilization emerged as a euphemism for “white civilization” in the writing of men like Lothrop Stoddard, a Klansman and eugenicist. Some classicists have come around to the idea that their discipline forms part of the scaffold of white supremacy — a traumatic process one described to me as “reverse red-pilling” — but they are also starting to see an opportunity in their position. Because classics played a role in constructing whiteness, they believed, perhaps the field also had a role to play in its dismantling.

On the morning of the panel, Padilla stood out among his colleagues, as he always did. He sat in a crisp white shirt at the front of a large conference hall at a San Diego Marriott, where most of the attendees wore muted shades of gray. Over the course of 10 minutes, Padilla laid out an indictment of his field. “If one were intentionally to design a discipline whose institutional organs and gatekeeping protocols were explicitly aimed at disavowing the legitimate status of scholars of color,” he said, “one could not do better than what classics has done.” Padilla’s vision of classics’ complicity in systemic injustice is uncompromising, even by the standards of some of his allies. He has condemned the field as “equal parts vampire and cannibal” — a dangerous force that has been used to murder, enslave and subjugate. “He’s on record as saying that he’s not sure the discipline deserves a future,” Denis Feeney, a Latinist at Princeton, told me. Padilla believes that classics is so entangled with white supremacy as to be inseparable from it. “Far from being extrinsic to the study of Greco-Roman antiquity,” he has written, “the production of whiteness turns on closer examination to reside in the very marrows of classics.”

When Padilla ended his talk, the audience was invited to ask questions. Williams, an independent scholar from California, was one of the first to speak. She rose from her seat in the front row and adjusted a standing microphone that had been placed in the center of the room. “I’ll probably offend all of you,” she began. Rather than kowtowing to criticism, Williams said, “maybe we should start defending our discipline.” She protested that it was imperative to stand up for the classics as the political, literary and philosophical foundation of European and American culture: “It’s Western civilization. It matters because it’s the West.” Hadn’t classics given us the concepts of liberty, equality and democracy?

‘There are some in the field who say: “Yes, we agree with your critique. Now let us go back to doing exactly what we’ve been doing.” ’

One panelist tried to interject, but Williams pressed on, her voice becoming harsh and staccato as the tide in the room moved against her. “I believe in merit. I don’t look at the color of the author.” She pointed a finger in Padilla’s direction. “You may have got your job because you’re Black,” Williams said, “but I would prefer to think you got your job because of merit.”

Discordant sounds went up from the crowd. Several people stood up from their seats and hovered around Williams at the microphone, seemingly unsure of whether or how to intervene. Padilla was smiling; it was the grimace of someone who, as he told me later, had been expecting something like this all along. At last, Williams ceded the microphone, and Padilla was able to speak. “Here’s what I have to say about the vision of classics that you outlined,” he said. “I want nothing to do with it. I hope the field dies that you’ve outlined, and that it dies as swiftly as possible.”

When Padilla was a child, his parents proudly referred to Santo Domingo, the capital of the Dominican Republic, as the “Athens of the New World” — a center of culture and learning. That idea had been fostered by Rafael Trujillo, the dictator who ruled the country from 1930 until his assassination in 1961. Like other 20th-century fascists, Trujillo saw himself, and his people, as the inheritors of a grand European tradition that originated in Greece and Rome. In a 1932 speech, he praised ancient Greece as the “mistress of beauty, rendered eternal in the impeccable whiteness of its marbles.” Trujillo’s veneration of whiteness was central to his message. By invoking the classical legacy, he could portray the residents of neighboring Haiti as darker and inferior, a campaign that reached its murderous peak in 1937 with the Parsley Massacre, or El Corte (“the Cutting”) in Spanish, in which Dominican troops killed as many as 30,000 Haitians and Black Dominicans, according to some estimates.

Padilla’s family didn’t talk much about their lives under the dictatorship, but he knew that his mother’s father had been beaten after arguing with some drunken Trujillistas. That grandfather and the rest of his mother’s relatives were fishermen and sailors in Puerto Plata, a city on the coast; they lived in what Padilla describes as “immiserating poverty” but benefited from a degree of privilege in Dominican society because of their lighter skin. His father’s people, on the other hand, often joked that they were “black as night.” They had lived for generations in Pimentel, a city near the mountainous northeast where enslaved Africans had set up Maroon communities in the 1600s and 1700s, counting on the difficult terrain to give them a measure of safety. Like their counterparts in the United States, slavers in the Dominican Republic sometimes bestowed classical names on their charges as a mark of their civilizing mission, so the legacy of slavery — and its entanglement with classics — remains legible in the names of many Dominicans today. “Why are there Dominicans named Temístocles?” Padilla used to wonder as a kid. “Why is Manny Ramirez’s middle name Aristides?” Trujillo’s own middle name was Leónidas, after the Spartan king who martyred himself with 300 of his soldiers at Thermopylae, and who has become an icon of the far right. But in his early life, Padilla was aware of none of this. He only knew that he was Black like his father.

When Padilla was 4, he and his parents flew to the United States so that his mother, María Elena, could receive care for pregnancy complications at a New York City hospital. But after his brother, Yando, was born, the family decided to stay; they moved into an apartment in the Bronx and quietly tried to normalize their immigration status, spending their savings in the process. Without papers, it was hard to find steady work. Some time later, Padilla’s father returned to the Dominican Republic; he had been an accountant in Santo Domingo, and he was weary of poverty in the United States, where he had been driving a cab and selling fruit in the summers. That left María Elena with the two boys in New York. Because Yando was a U.S. citizen, she received $120 in food stamps and $85 in cash each month, but it was barely enough to feed one child, let alone a family of three. Over the next few months, María Elena and her sons moved between apartments in Manhattan, the Bronx and Queens, packing up and finding a new place each time they couldn’t make rent. For about three weeks, the landlord of a building in Queens let them stay in the basement as a favor, but when a sewage pipe burst over them as they were sleeping, María Elena found her way to a homeless shelter in Chinatown.

At the shelter, “the food tasted nasty,” and “pools of urine” marred the bathroom floor, Padilla wrote in his 2015 memoir, “Undocumented.” His one place of respite was the tiny library on the shelter’s top floor. Since leaving the Dominican Republic, Padilla had grown curious about Dominican history, but he couldn’t find any books about the Caribbean on the library’s shelves. What he did find was a slim blue-and-white textbook titled “How People Lived in Ancient Greece and Rome.” “Western civilization was formed from the union of early Greek wisdom and the highly organized legal minds of early Rome,” the book began. “The Greek belief in a person’s ability to use his powers of reason, coupled with Roman faith in military strength, produced a result that has come to us as a legacy, or gift from the past.” Thirty years later, Padilla can still recite those opening lines. “How many times have I taken an ax to this over the last decade of my career?” he said to me. “But at the moment of the initial encounter, there was something energizing about it.” Padilla took the textbook back to the room he shared with his mother and brother and never returned it to the library.

One day in the summer of 1994, a photographer named Jeff Cowen, who was teaching art at a shelter in Bushwick, where María Elena and the boys had been transferred, noticed 9-year-old Padilla tucked away by himself, reading a biography of Napoleon Bonaparte. “The kids were running around like crazy on their after-lunch sugar high, and there was a boy sitting in the corner with this enormous tome,” Cowen told me. “He stood up and shook my hand like a little gentleman, speaking like he’s some kind of Ivy League professor.” Cowen was taken aback. “I was really struggling at the time. I was living in an illegal building without a toilet, so I wasn’t really looking to be a do-gooder,” he said. “But within five minutes, it was obvious that this kid deserved the best education he could get. It was a responsibility.”

Dan-el Padilla Peralta in 1994 at the Bushwick shelter where he lived with his mother and younger brother.
Credit: Jeff Cowen

Cowen became a mentor to Padilla, and then his godfather. He visited the shelter with books and brain teasers, took Padilla and Yando roller-skating in Central Park and eventually helped Padilla apply to Collegiate, one of New York City’s elite prep schools, where he was admitted with a full scholarship. María Elena, elated, photocopied his acceptance letter and passed it around to her friends at church. At Collegiate, Padilla began taking Latin and Greek and found himself overwhelmed by the emotive power of classical texts; he was captivated by the sting of Greek philosophy, the heat and action of epic. Padilla told none of his new friends that he was undocumented. “There were some conversations I simply wasn’t ready to have,” he has said in an interview. When his classmates joked about immigrants, Padilla sometimes thought of a poem he had read by the Greek lyricist Archilochus, about a soldier who throws his shield in a bush and flees the battlefield. “At least I got myself safely out,” the soldier says. “Why should I care for that shield? Let it go. Some other time I’ll find another no worse.” Don’t expose yourself, he thought. There would be other battles.

Years passed before Padilla started to question the way the textbook had presented the classical world to him. He was accepted on a full scholarship to Princeton, where he was often the only Black person in his Latin and Greek courses. “The hardest thing for me as I was making my way into the discipline as a college student was appreciating how lonely I might be,” Padilla told me. In his sophomore year, when it came time to select a major, the most forceful resistance to his choice came from his close friends, many of whom were also immigrants or the children of immigrants. They asked Padilla questions he felt unprepared to answer. What are you doing with this blanquito stuff? How is this going to help us? Padilla argued that he and others shouldn’t shun certain pursuits just because the world said they weren’t for Black and brown people. There was a special joy and vindication in upending their expectations, but he found he wasn’t completely satisfied by his own arguments. The question of classics’ utility was not a trivial one. How could he take his education in Latin and Greek and make it into something liberatory? “That became the most urgent question that guided me through my undergraduate years and beyond,” Padilla said.

After graduating as Princeton’s 2006 salutatorian, Padilla earned a master’s degree from Oxford and a doctorate from Stanford. By then, more scholars than ever were seeking to understand not only the elite men who had written the surviving works of Greek and Latin literature, but also the ancient people whose voices were mostly silent in the written record: women, the lower classes, enslaved people and immigrants. Courses on gender and race in antiquity were becoming common and proving popular with students, but it wasn’t yet clear whether their imprint on the discipline would last. “There are some in the field,” Ian Morris, an adviser of Padilla’s at Stanford, told me, “who say: ‘Yes, we agree with your critique. Now let us go back to doing exactly what we’ve been doing.’” Reformers had learned from the old debates around “Black Athena” — Martin Bernal’s trilogy positing African and Semitic influence on ancient Greek culture — just how resistant some of their colleagues were to acknowledging the field’s role in whitewashing antiquity. “Classicists generally identify as liberal,” Joel Christensen, a professor of Greek literature at Brandeis University, told me. “But we are able to do that because most of the time we’re not in spaces or with people who push us about our liberalism and what that means.”

Thinking of his family’s own history, Padilla became interested in Roman slavery. Decades of research had focused on the ability of enslaved people to transcend their status through manumission, celebrating the fact that the buying and granting of freedom was much more common in Rome than in other slaveholding societies. But there were many who stood no chance of being freed, particularly those who worked in the fields or the mines, far from centers of power. “We have so many testimonies for how profoundly degrading enslavement was,” Padilla told me. Enslaved people in ancient Rome could be tortured and crucified; forced into marriage; chained together in work gangs; made to fight gladiators or wild animals; and displayed naked in marketplaces with signs around their necks advertising their age, character and health to prospective buyers. Owners could tattoo their foreheads so they could be recognized and captured if they tried to flee. Temple excavations have uncovered clay dedications from escapees, praying for the gods to remove the disfiguring marks from their faces. Archaeologists have also found metal collars riveted around the necks of skeletons in burials of enslaved people, among them an iron ring with a bronze tag preserved in the Museo Nazionale in Rome that reads: “I have run away; hold me. When you have brought me back to my master Zoninus, you will receive a gold coin.”

By 2015, when Padilla arrived at the Columbia Society of Fellows as a postdoctoral researcher, classicists were no longer apologists for ancient slavery, but many doubted that the inner worlds of enslaved people were recoverable, because no firsthand account of slavery had survived the centuries. That answer did not satisfy Padilla. He had begun to study the trans-Atlantic slave trade, which had shaped his mother’s mystical brand of Catholicism. María Elena moved through a world that was haunted by spirits, numinous presences who could give comfort and advice or demand sacrifice and appeasement. For a while, when Padilla was in high school, his mother invited a santero and his family to live with them at their Section 8 apartment in Harlem, where the man would conjure spirits that seethed at Padilla for his bad behavior. Padilla realized that his mother’s conception of the dead reminded him of the Romans’, which gave him an idea. In 2017, he published a paper in the journal Classical Antiquity that compared evidence from antiquity and the Black Atlantic to draw a more coherent picture of the religious life of the Roman enslaved. “It will not do merely to adopt a pose of ‘righteous indignation’ at the distortions and gaps in the archive,” he wrote. “There are tools available for the effective recovery of the religious experiences of the enslaved, provided we work with these tools carefully and honestly.”

Padilla began to feel that he had lost something in devoting himself to the classical tradition. As James Baldwin observed 35 years before, there was a price to the ticket. His earlier work on the Roman senatorial classes, which earned him a reputation as one of the best Roman historians of his generation, no longer moved him in the same way. Padilla sensed that his pursuit of classics had displaced other parts of his identity, just as classics and “Western civilization” had displaced other cultures and forms of knowledge. Recovering them would be essential to dismantling the white-supremacist framework in which both he and classics had become trapped. “I had to actively engage in the decolonization of my mind,” he told me. He revisited books by Frantz Fanon, Orlando Patterson and others working in the traditions of Afro-pessimism and psychoanalysis, Caribbean and Black studies. He also gravitated toward contemporary scholars like José Esteban Muñoz, Lorgia García Peña and Saidiya Hartman, who speak of race not as a physical fact but as a ghostly system of power relations that produces certain gestures, moods, emotions and states of being. They helped him think in more sophisticated terms about the workings of power in the ancient world, and in his own life.

Around the time that Padilla began working on the paper, Donald Trump made his first comments on the presidential campaign trail about Mexican “criminals, drug dealers, rapists” coming into the country. Padilla, who had spent the previous 20 years dealing with an uncertain immigration status, had just applied for a green card after celebrating his marriage to a social worker named Missy from Sparta, N.J. Now he watched as alt-right figures like Richard Spencer, who had fantasized about creating a “white ethno-state on the North American continent” that would be “a reconstitution of the Roman Empire,” rose to national prominence. In response to rising anti-immigrant sentiment in Europe and the United States, Mary Beard, perhaps the most famous classicist alive, wrote in The Wall Street Journal that the Romans “would have been puzzled by our modern problems with migration and asylum,” because the empire was founded on the “principles of incorporation and of the free movement of people.”

‘I’m not interested in demolition for demolition’s sake. I want to build something.’

Padilla found himself frustrated by the manner in which scholars were trying to combat Trumpian rhetoric. In November 2015, he wrote an essay for Eidolon, an online classics journal, clarifying that in Rome, as in the United States, paeans to multiculturalism coexisted with hatred of foreigners. Defending a client in court, Cicero argued that “denying foreigners access to our city is patently inhumane,” but ancient authors also recount the expulsions of whole “suspect” populations, including a roundup of Jews in 139 B.C., who were not considered “suitable enough to live alongside Romans.” Padilla argues that exposing untruths about antiquity, while important, is not enough: Explaining that an almighty, lily-white Roman Empire never existed will not stop white nationalists from pining for its return. The job of classicists is not to “point out the howlers,” he said on a 2017 panel. “To simply take the position of the teacher, the qualified classicist who knows things and can point to these mistakes, is not sufficient.” Dismantling structures of power that have been shored up by the classical tradition will require more than fact-checking; it will require writing an entirely new story about antiquity, and about who we are today.

To find that story, Padilla is advocating reforms that would “explode the canon” and “overhaul the discipline from nuts to bolts,” including doing away with the label “classics” altogether. Classics was happy to embrace him when he was changing the face of the discipline, but how would the field react when he asked it to change its very being? The way it breathed and moved? “Some students and some colleagues have told me this is either too depressing or it’s sort of menacing in a way,” he said. “My only rejoinder is that I’m not interested in demolition for demolition’s sake. I want to build something.”

One day last February, shortly before the pandemic ended in-person teaching, I visited Padilla at Princeton. Campus was quiet and morose, the silences quivering with early-term nerves. A storm had swept the leaves from the trees and the color from the sky, which was now the milky gray of laundry water, and the air was so heavy with mist that it seemed to be blurring the outlines of the buildings. That afternoon, Padilla was teaching a Roman-history course in one of the oldest lecture halls at the university, a grand, vaulted room with creaking floorboards and mullioned windows. The space was not designed for innovative pedagogy. Each wooden chair was bolted to the floor with a paddle-shaped extension that served as a desk but was barely big enough to hold a notebook, let alone a laptop. “This was definitely back in the day when the students didn’t even take notes,” one student said as she sat down. “Like, ‘My dad’s going to give me a job.’”

Since returning to campus as a professor in 2016, Padilla has been working to make Princeton’s classics department a more welcoming place for students like him — first-generation students and students of color. In 2018, the department secured funding for a predoctoral fellowship to help a student with less exposure to Latin and Greek enter the Ph.D. program. That initiative, and the draw of Padilla as a mentor, has contributed to making Princeton’s graduate cohort one of the most diverse in the country. Pria Jackson, a Black predoctoral fellow who is the daughter of a mortician from New Mexico, told me that before she came to Princeton, she doubted that she could square her interest in classics with her commitment to social justice. “I didn’t think that I could do classics and make a difference in the world the way that I wanted to,” she said. “My perception of what it could do has changed.”

Padilla’s Roman-history course was a standard introductory survey, something the university had been offering for decades, if not centuries, but he was not teaching it in the standard way. He was experimenting with role play in order to prompt his students to imagine what it was like to be subjects of an imperial system. The previous week, he asked them to recreate a debate that took place in the Roman Senate in A.D. 15 about a proposed waterworks project that communities in central Italy feared would change the flow of the Tiber River, destroying animal habitats and flooding old shrines. (Unlike the Senate, the Princeton undergraduates decided to let the project go ahead as planned.) Today’s situation was inspired by the crises of succession that threatened to tear the early empire apart. Out of the 80 students in the lecture, Padilla had assigned four to be young military commanders — claimants vying for the throne — and four to be wealthy Roman senators; the rest were split between the Praetorian Guard and marauding legionaries whose swords could be bought in exchange for money, land and honors. It was designed to help his students “think as capaciously as possible about the many lives, human and nonhuman, that are touched by the shift from republic to empire.”

Padilla stood calmly behind the lectern as students filed into the room, wearing rectangular-framed glasses low on his nose and a maroon sweater over a collared shirt. The stillness of his body only heightened the sense of his mind churning. “He carries a big stick without having to show it off,” Cowen, Padilla’s childhood mentor, told me. “He’s kind of soft on the outside but very hard on the inside.” Padilla speaks in the highly baroque language of the academy — a style that can seem so deliberate as to function as a kind of protective armor. It is the flinty, guarded manner of someone who has learned to code-switch, someone who has always been aware that it is not only what he says but also how he says it that carries meaning. Perhaps it is for that reason that Padilla seems most at ease while speaking to students, when his phrasing loses some of its formality and his voice takes on the incantatory cadence of poetry. “Silence,” he said once the room had quieted, “my favorite sound.”

Padilla called the claimants up to the front of the room. At first, they stood uncertainly on the dais, like adolescents auditioning for a school play. Then, slowly, they moved into the rows of wooden desks. I watched as one of them, a young man wearing an Army-green football T-shirt that said “Support Our Troops,” propositioned a group of legionaries. “I’ll take land from non-Romans and give it to you, grant you citizenship,” he promised them. As more students left their seats and began negotiating, bids and counterbids reverberated against the stone walls. Not everyone was taking it seriously. At one point, another claimant approached a blue-eyed legionary in a lacrosse sweatshirt to ask what it would take to gain his support. “I just want to defend my right to party,” he responded. “Can I get a statue erected to my mother?” someone else asked. A stocky blond student kept charging to the front of the room and proposing that they simply “kill everybody.” But Padilla seemed energized by the chaos. He moved from group to group, sowing discord. “Why let someone else take over?” he asked one student. If you are a soldier or a peasant who is unhappy with imperial governance, he told another, how do you resist? “What kinds of alliances can you broker?”

Padilla teaching Roman history at Princeton in 2016.
Credit: Princeton University/Office of Communications/Denise Applewhite

Over the next 40 minutes, there were speeches, votes, broken promises and bloody conflicts. Several people were assassinated. Eventually it seemed as though two factions were coalescing, and a count was called. The young man in the football shirt won the empire by seven votes, and Padilla returned to the lectern. “What I want to be thinking about in the next few weeks,” he told them, “is how we can be telling the story of the early Roman Empire not just through a variety of sources but through a variety of persons.” He asked the students to consider the lives behind the identities he had assigned them, and the way those lives had been shaped by the machinery of empire, which, through military conquest, enslavement and trade, creates the conditions for the large-scale movement of human beings.

Once the students had left the room, accompanied by the swish of umbrellas and waterproof synthetics, I asked Padilla why he hadn’t assigned any slave roles. Tracing his fingers along the crown of his head, he told me he had thought about it. It troubled him that he might be “re-enacting a form of silencing” by avoiding enslaved characters, given the fact that slavery was “arguably the most ubiquitous feature of the Roman imperial system.” As a historian, he knew that the assets at the disposal of the four wealthy senators — the 100 million sesterces he had given them to back one claimant over another — would have been made up in large part of the enslaved who worked in their mines and plowed the fields of their country estates. Was it harmful to encourage students to imagine themselves in roles of such comfort, status and influence, when a vast majority of people in the Roman world would never have been in a position to be a senator? But ultimately, he decided that leaving enslaved characters out of the role play was an act of care. “I’m not yet ready to turn to a student and say, ‘You are going to be a slave.’”

Even before “the incident,” Padilla was a target of right-wing anger because of the blistering language he uses and, many would say, because of the body he inhabits. In the aftermath of his exchange with Williams, which was covered in the conservative media, Padilla received a series of racist emails. “Maybe African studies would suit you better if you can’t hope with the reality of how advanced Europeans were,” one read. “You could figure out why the wheel had never made it sub-Saharan African you meathead. Lucky for you, your black, because you have little else on offer.” Breitbart ran a story accusing Padilla of “killing” classics. “If there was one area of learning guaranteed never to be hijacked by the forces of ignorance, political correctness, identity politics, social justice and dumbing down, you might have thought it would be classics,” it read. “Welcome, barbarians! The gates of Rome are wide open!”

Privately, even some sympathetic classicists worry that Padilla’s approach will only hasten the field’s decline. “I’ve spoken to undergrad majors who say that they feel ashamed to tell their friends they’re studying classics,” Denis Feeney, Padilla’s colleague at Princeton, told me. “I think it’s sad.” He noted that the classical tradition has often been put to radical and disruptive uses. Civil rights movements and marginalized groups across the world have drawn inspiration from ancient texts in their fights for equality, from African-Americans to Irish Republicans to Haitian revolutionaries, who viewed their leader, Toussaint L’Ouverture, as a Black Spartacus. The heroines of Greek tragedy — untamed, righteous, destructive women like Euripides’ Medea — became symbols of patriarchal resistance for feminists like Simone de Beauvoir, and the descriptions of same-sex love in the poetry of Sappho and in the Platonic dialogues gave hope and solace to gay writers like Oscar Wilde.

“I very much admire Dan-el’s work, and like him, I deplore the lack of diversity in the classical profession,” Mary Beard told me via email. But “to ‘condemn’ classical culture would be as simplistic as to offer it unconditional admiration.” She went on: “My line has always been that the duty of the academic is to make things seem more complicated.” In a 2019 talk, Beard argued that “although classics may become politicized, it doesn’t actually have a politics,” meaning that, like the Bible, the classical tradition is a language of authority — a vocabulary that can be used for good or ill by would-be emancipators and oppressors alike. Over the centuries, classical civilization has acted as a model for people of many backgrounds, who turned it into a matrix through which they formed and debated ideas about beauty, ethics, power, nature, selfhood, citizenship and, of course, race. Anthony Grafton, the great Renaissance scholar, put it this way in his preface to “The Classical Tradition”: “An exhaustive exposition of the ways in which the world has defined itself with regard to Greco-Roman antiquity would be nothing less than a comprehensive history of the world.”

How these two old civilizations became central to American intellectual life is a story that begins not in antiquity, and not even in the Renaissance, but in the Enlightenment. Classics as we know it today is a creation of the 18th and 19th centuries. During that period, as European universities emancipated themselves from the control of the church, the study of Greece and Rome gave the Continent its new, secular origin story. Greek and Latin writings emerged as a competitor to the Bible’s moral authority, which lent them a liberatory power. Figures like Diderot and Hume derived some of their ideas on liberty from classical texts, where they found declarations of political and personal freedoms. One of the most influential was Pericles’ funeral oration over the graves of the Athenian war dead in 431 B.C., recorded by Thucydides, in which the statesman praises his “glorious” city for ensuring “equal justice to all.” “Our government does not copy our neighbors’,” he says, “but is an example to them. It is true that we are called a democracy, for the administration is in the hands of the many and not of the few.”

Admiration for the ancients took on a fantastical, unhinged quality, like a strange sort of mania. Men draped themselves in Roman togas to proclaim in public, signed their letters with the names of famous Romans and filled etiquette manuals, sermons and schoolbooks with lessons from the classical past. Johann Joachim Winckelmann, a German antiquarian of the 18th century, assured his countrymen that “the only way for us to become great, or even inimitable if possible, is to imitate the Greeks.” Winckelmann, who is sometimes called the “father of art history,” judged Greek marble sculpture to be the summit of human achievement — unsurpassed by any other society, ancient or modern. He wrote that the “noble simplicity and quiet grandeur” of Athenian art reflected the “freedom” of the culture that produced it, an entanglement of artistic and moral value that would influence Hegel’s “Aesthetics” and appear again in the poetry of the Romantics. “Beauty is truth, truth beauty,” Keats wrote in “Ode on a Grecian Urn,” “that is all/Ye know on earth, and all ye need to know.”

‘I think that the politics of the living are what constitute classics as a site for productive inquiry. When folks think of classics, I would want them to think about folks of color.’

Historians stress that such ideas cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions. “The whiter the body is, the more beautiful it is,” Winckelmann wrote. While Renaissance scholars were fascinated by the multiplicity of cultures in the ancient world, Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below. “That exclusion was at the heart of classics as a project,” Paul Kosmin, a professor of ancient history at Harvard, told me. Among those Enlightenment thinkers were many of America’s founding fathers. Aristotle’s belief that some people were “slaves by nature” was welcomed with special zeal in the antebellum American South, which sought to defend slavery in the face of abolitionist critique. In “Notes on the State of Virginia,” Thomas Jefferson wrote that despite their condition in life, Rome’s enslaved showed themselves to be the “rarest artists” who “excelled too at science, insomuch as to be usually employed as tutors to their master’s children.” The fact that Africans had not done the same, he argued, proved that the problem was their race.

Jefferson, along with most wealthy young men of his time, studied classics at college, where students often spent half their time reading and translating Greek and Roman texts. “Next to Christianity,” writes Caroline Winterer, a historian at Stanford, “the central intellectual project in America before the late 19th century was classicism.” Of the 2.5 million people living in America in 1776, perhaps only 3,000 had gone to college, but that number included many of the founders. They saw classical civilization as uniquely educative — a “lamp of experience,” in the words of Patrick Henry, that could light the path to a more perfect union. However true that may have been, subsequent generations would come to believe, as Hannah Arendt wrote in “On Revolution,” that “without the classical example … none of the men of the Revolution on either side of the Atlantic would have possessed the courage for what then turned out to be unprecedented action.”

While the founding fathers chose to emulate the Roman republic, fearful of the tyranny of the majority, later generations of Americans drew inspiration from Athenian democracy, particularly after the franchise was extended to nearly all white men regardless of property ownership in the early decades of the 1800s. Comparisons between the United States and the Roman Empire became popular as the country emerged as a global power. Even after Latin and Greek were struck from college-entrance exams, the proliferation of courses on “great books” and Western civilization, in which classical texts were read in translation, helped create a coherent national story after the shocks of industrialization and global warfare. The project of much 20th-century art and literature was to forge a more complicated relationship with Greece and Rome, but even as the classics were pulled apart, laughed at and transformed, they continued to form the raw material with which many artists shaped their visions of modernity.

Over the centuries, thinkers as disparate as John Adams and Simone Weil have likened classical antiquity to a mirror. Generations of intellectuals, among them feminist, queer and Black scholars, have seen something of themselves in classical texts, flashes of recognition that held a kind of liberatory promise. Daniel Mendelsohn, a gay classicist and critic, discovered his sexuality at 12 while reading historical fiction about the life of Alexander the Great. “Until that moment,” he wrote in The New Yorker in 2013, “I had never seen my secret feelings reflected anywhere.” But the idea of classics as a mirror may be as dangerous as it is seductive. The language that is used to describe the presence of classical antiquity in the world today — the classical tradition, legacy or heritage — contains within it the idea of a special, quasi-genetic relationship. In his lecture “There Is No Such Thing as Western Civilization,” Kwame Anthony Appiah (this magazine’s Ethicist columnist) mockingly describes the belief in such a kinship as the belief in a “golden nugget” of insight — a precious birthright and shimmering sign of greatness — that white Americans and Europeans imagine has been passed down to them from the ancients. That belief has been so deeply held that the philosopher John Stuart Mill could talk about the Battle of Marathon, in which the Greeks defeated the first Persian invasion in 490 B.C., as one of the most important events in “English history.”

To see classics the way Padilla sees it means breaking the mirror; it means condemning the classical legacy as one of the most harmful stories we’ve told ourselves. Padilla is wary of colleagues who cite the radical uses of classics as a way to forestall change; he believes that such examples have been outmatched by the field’s long alliance with the forces of dominance and oppression. Classics and whiteness are the bones and sinew of the same body; they grew strong together, and they may have to die together. Classics deserves to survive only if it can become “a site of contestation” for the communities who have been denigrated by it in the past. This past semester, he co-taught a course, with the Activist Graduate School, called “Rupturing Tradition,” which pairs ancient texts with critical race theory and strategies for organizing. “I think that the politics of the living are what constitute classics as a site for productive inquiry,” he told me. “When folks think of classics, I would want them to think about folks of color.” But if classics fails his test, Padilla and others are ready to give it up. “I would get rid of classics altogether,” Walter Scheidel, another of Padilla’s former advisers at Stanford, told me. “I don’t think it should exist as an academic field.”

One way to get rid of classics would be to dissolve its faculties and reassign their members to history, archaeology and language departments. But many classicists are advocating softer approaches to reforming the discipline, placing the emphasis on expanding its borders. Schools including Howard and Emory have integrated classics with Ancient Mediterranean studies, turning to look across the sea at Egypt, Anatolia, the Levant and North Africa. The change is a declaration of purpose: to leave behind the hierarchies of the Enlightenment and to move back toward the Renaissance model of the ancient world as a place of diversity and mixture. “There’s a more interesting story to be told about the history of what we call the West, the history of humanity, without valorizing particular cultures in it,” said Josephine Quinn, a professor of ancient history at Oxford. “It seems to me the really crucial mover in history is always the relationship between people, between cultures.” Ian Morris put it more bluntly. “Classics is a Euro-American foundation myth,” Morris said to me. “Do we really want that sort of thing?”

For many, inside the academy and out, the answer to that question is yes. Denis Feeney, Padilla’s colleague at Princeton, believes that society would “lose a great deal” if classics was abandoned. Feeney is 65, and after he retires this year, he says, his first desire is to sit down with Homer again. “In some moods, I feel that this is just a moment of despair, and people are trying to find significance even if it only comes from self-accusation,” he told me. “I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.” Amy Richlin, a feminist scholar at the University of California, Los Angeles, who helped lead the turn toward the study of women in the Roman world, laughed when I mentioned the idea of breaking up classics departments in the Ivy League. “Good luck getting rid of them,” she said. “These departments have endowments, and they’re not going to voluntarily dissolve themselves.” But when I pressed her on whether it was desirable, if not achievable, she became contemplative. Some in the discipline, particularly graduate students and untenured faculty members, worry that administrators at small colleges and public universities will simply use the changes as an excuse to cut programs. “One of the dubious successes of my generation is that it did break the canon,” Richlin told me. “I don’t think we could believe at the time that we would be putting ourselves out of business, but we did.” She added: “If they blew up the classics departments, that would really be the end.”

‘I’m not sure that there is a discipline that is exempt from the fact that it is part of the history of this country. How distinctly wicked is classics? I don’t know that it is.’

Padilla has said that he “cringes” when he remembers his youthful desire to be transformed by the classical tradition. Today he describes his discovery of the textbook at the Chinatown shelter as a sinister encounter, as though the book had been lying in wait for him. He compares the experience to a scene in one of Frederick Douglass’s autobiographies, when Mr. Auld, Douglass’s owner in Baltimore, chastises his wife for helping Douglass learn to read: “ ‘Now,’ said he, ‘if you teach that nigger (speaking of myself) how to read, there would be no keeping him. It would forever unfit him to be a slave.’” In that moment, Douglass says he understood that literacy was what separated white men from Black — “a new and special revelation, explaining dark and mysterious things.” “I would at times feel that learning to read had been a curse rather than a blessing,” Douglass writes. “It had given me a view of my wretched condition, without the remedy.” Learning the secret only deepened his sense of exclusion.

Padilla, like Douglass, now sees the moment of absorption into the classical, literary tradition as simultaneous with his apprehension of racial difference; he can no longer find pride or comfort in having used it to bring himself out of poverty. He permits himself no such relief. “Claiming dignity within this system of structural oppression,” Padilla has said, “requires full buy-in into its logic of valuation.” He refuses to “praise the architects of that trauma as having done right by you at the end.”

Last June, as racial-justice protests unfolded across the nation, Padilla turned his attention to arenas beyond classics. He and his co-authors — the astrophysicist Jenny Greene, the literary theorist Andrew Cole and the poet Tracy K. Smith — began writing their open letter to Princeton with 48 proposals for reform. “Anti-Blackness is foundational to America,” the letter began. “Indifference to the effects of racism on this campus has allowed legitimate demands for institutional support and redress in the face of microaggression and outright racist incidents to go long unmet.” Signed by more than 300 members of the faculty, the letter was released publicly on the Fourth of July. In response, Joshua Katz, a prominent Princeton classicist, published an op-ed in the online magazine Quillette in which he referred to the Black Justice League, a student group, as a “terrorist organization” and warned that certain proposals in the faculty letter would “lead to civil war on campus.”

Few in the academy cared to defend Katz’s choice of words, but he was far from the only person who worried that some of the proposals were unwise, if not dangerous. Most controversial was the idea of establishing a committee that would “oversee the investigation and discipline of racist behaviors, incidents, research and publication” — a body that many viewed as a threat to free academic discourse. “I’m concerned about how you define what racist research is,” one professor told me. “That’s a line that’s constantly moving. Punishing people for doing research that other people think is racist just does not seem like the right response.” But Padilla believes that the uproar over free speech is misguided. “I don’t see things like free speech or the exchange of ideas as ends in themselves,” he told me. “I have to be honest about that. I see them as a means to the end of human flourishing.”

On Jan. 6, Padilla turned on the television minutes after the windows of the Capitol were broken. In the crowd, he saw a man in a Greek helmet with TRUMP 2020 painted in white. He saw a man in a T-shirt bearing a golden eagle on a fasces — symbols of Roman law and governance — below the logo 6MWE, which stands for “Six Million Wasn’t Enough,” a reference to the number of Jews murdered in the Holocaust. He saw flags embroidered with the phrase that Leonidas is said to have uttered when the Persian king ordered him to lay down his arms: Molon labe, classical Greek for “Come and take them,” which has become a slogan of American gun rights activists. A week after the riot, Representative Marjorie Taylor Greene, a newly elected Republican from Georgia who has liked posts on social media that call for killing Democrats, wore a mask stitched with the phrase when she voted against impeachment on the House floor.

“There is a certain kind of classicist who will look on what transpired and say, ‘Oh, that’s not us,’” Padilla said when we spoke recently. “What is of interest to me is why is it so imperative for classicists of a certain stripe to make this discursive move? ‘This is not us.’ Systemic racism is foundational to those institutions that incubate classics and classics as a field itself. Can you take stock, can you practice the recognition of the manifold ways in which racism is a part of what you do? What the demands of the current political moment mean?”

Padilla suspects that he will one day need to leave classics and the academy in order to push harder for the changes he wants to see in the world. He has even considered entering politics. “I would never have thought the position I hold now to be attainable to me as a kid,” he said. “But the fact that this is a minor miracle does not displace my deep sense that this is temporary too.” His influence on the field may be more permanent than his presence in it. “Dan-el has galvanized a lot of people,” Rebecca Futo Kennedy, a professor at Denison University, told me. Joel Christensen, the Brandeis professor, now feels that it is his “moral and ethical and intellectual responsibility” to teach classics in a way that exposes its racist history. “Otherwise we’re just participating in propaganda,” he said. Christensen, who is 42, was in graduate school before he had his “crisis of faith,” and he understands the fear that many classicists may experience at being asked to rewrite the narrative of their life’s work. But, he warned, “that future is coming, with or without Dan-el.”

Rachel Poser is the deputy editor of Harper’s Magazine. Her writing, which often focuses on the relationship between past and present, has appeared in Harper’s, The New York Times, Mother Jones and elsewhere. A version of this article appears in print on Feb. 7, 2021, Page 38 of the Sunday Magazine with the headline: The Iconoclast.

How the History of Brazil’s Oil Industry Can Inform Our Understanding of the Anthropocene (Past & Present Blog)

Josh Allen | January 25, 2021

by Dr. Antoine Acker (University of Zurich)

Between August 2019 and July 2020, a forest area roughly the size of Belgium was destroyed in the Brazilian Amazon. According to climatologists, the Amazon’s transformation into a savanna is one of the main tipping points toward “hothouse Earth,” the most extreme global warming scenario. Tropical rainforests are not only endangered carbon sinks; their burning is also a major source of greenhouse gas (GHG) emissions, making a place like the Amazon decisive in the current epoch, which geologists have named the Anthropocene. The latter, marked by the human transformation of the Earth system, invites historians to reassess the human past in light of its impact on the planet’s ecology.

Although GHG particles disregard national borders as they spread through the atmosphere, the rise in their emissions over time is the product of institutions, systems and patterns that humans have constructed. For example, in my book on the history of the Volkswagen Company in the Amazon, I studied the tight articulation between global capitalism and Brazilian state-led development in setting in motion the first wave of massive tropical deforestation in the region in the early 1970s. But Brazil also matters in the history of the Anthropocene for a different reason: its contribution to the fossil fuel economy, hydrocarbons in particular.

Graphic presentation of the “Hino do Petróleo” (“Petroleum Anthem”), a march by Sylvio Theodosio de Mello (1949), which Brazilian congressmen proposed to make a national anthem in 1955. Source: Arquivo (digital) da Câmara dos Deputados, Lote 33, Cx 28, PL N 508/1955 74.

In 1953, Brazil became the first country in the world to enter the oil market with a state monopoly, Petrobras, initiating a history of technological breakthroughs whose most spectacular manifestation was the country’s leading role in the global development of offshore platforms. Over the past thirty years, Petrobras has been a major award winner and patent holder in the field of oil exploration, making technological leaps that have intensified the world’s energy dependence. With the progressive exhaustion of conventional oil sources, offshore oil is poised to become the leading fossil fuel on a planet marked by climate instability, and Brazil to become one of the world’s largest producers, possibly “fuelling” the economic growth of two giant CO2 emitters, China and India.

Cover of the first edition of José Bento Monteiro Lobato’s children’s best seller “O Poço do Visconde” (São Paulo, 1937). While the image clearly conveys Lobato’s racist worldview, in my article I discuss the ambiguous racial message that Brazilian political elites sought to associate with the promotion of petroleum production.

In view of this and other scenarios, the history of Brazil’s oil urgently needs to be explored. In my article “A Different Story in the Anthropocene: Brazil’s Post-Colonial Quest for Oil (1930–1975)”, in the current issue of Past & Present (#249), I argue that this relatively recent development does not result from a simple technology transfer from older industrialized countries but belongs to a national project of economic emancipation that started around 1930. Seen as an opportunity to regain national sovereignty over natural resources and to reorient their use toward domestic industrial development, oil became a post-colonial symbol expected to free Brazil from its peripheral position in the global economy. The energy reform entailed in the rise of national petroleum raised hopes of ridding the country of the heritage of colonial exploitation, slavery and the squandering of soils and forests, which lingered long after national independence was won in 1822.

Journalist and songwriter Petronilha Pimentel at the Candéias oil gusher near Salvador da Bahia (1948). Source: Arquivo CPDOC, EG foto 0068

Through this example, I believe it is possible to recontextualize the Anthropocene from three angles in particular, which previous literature on the topic has underplayed:

– Histories of the Anthropocene have so far relied mainly on a Western progress storyline, strangely reviving the Eurocentric and teleological narratives that the social sciences had spent the previous forty years deconstructing. The history of Brazilian petroleum opens new theoretical perspectives by shedding light on the specific causalities that explain the attraction of fossil fuels in formerly colonized societies.

– In terms of methodology, I suggest paying more attention to the political, social and cultural dynamics that co-shaped energy dependency, alongside the evolving technological supply and economic feedback loops. Anthropocene historians’ overwhelming focus on science and technology (which are surely part, but not all, of the story) tends to reproduce the essentialist narrative that we are a homo technologicus species whose thirst for energy is unappeasable. A multidimensional analysis of historical processes, in contrast, can help us understand how the modern world’s demand for energy was stimulated by public and private discourses, which cultural production mirrored and fed. In the article, I explore pictorial representations, fictional literature and music that conveyed images of national unity and freedom: for example, a march celebrating an oil-fueled society “with no shackles to enslave”; a children’s book telling the utopian story of a grandmother who cures Brazil of poverty by redistributing the profits of an oil well found in her backyard; and a samba describing oil towers growing out of the old colonial cane fields. In 1948 the samba’s author, Petronilha Pimentel, posed for photographs while smearing her hands with oil, as Brazil’s president Getúlio Vargas would do in 1952 to symbolize the integration of petroleum into the nation’s body.

– Finally, I believe that a fair assessment of past energy transitions is only possible if historians recontextualize the meaning of environmental thought in past societies that had no cognizance of the atmospheric impact of fossil fuel consumption. In this sense, it should be possible to research the agency of Global South societies in the Anthropocene without shifting the blame for climate change onto them. In the article, for example, I show how oil production projects in Brazil were enmeshed with concerns for forest protection and a more cautious use of resources.

Ironically, the exact opposite happened. Today’s scale of deforestation is deeply tied to a fossil-fuel model of development that transformed the country into a top global producer (and consumer) of primary products such as soy, beef and steel. Brazil’s heavily mechanized agriculture, farming chemicals, road networks and motor vehicle industry are all intertwined with its trajectory as a petroleum producer. Not least, gasoline is commonly used as an accelerant to set the forest ablaze.

Petronilha Pimentel’s application picture for the beauty contest Rainha do Petróleo (“Queen of Petroleum”), which she eventually won. The contest was organized in 1949 by an important leftist weekly publication as part of a national campaign in favor of petroleum nationalization. Source: Petronilha Pimentel, Afinal, quem descobriu o petróleo do Brasil? Das tentativas de Allport no século passado às convicções científicas de Ignácio Bastos (Rio de Janeiro, 1984).

Yet the history of Brazilian oil was driven by discourses of rational use, fair distribution, conservation of, and sovereignty over, natural resources, in the service of a project of collective emancipation. In view of current ecological crises, it is tempting to dismiss this historical experience as misguided. But we could also see it as a history to learn from, because it shows how unifying values could be mobilized and shared to serve a project of rapid energy transition. It remains to be seen whether, in Brazil or elsewhere, similar national mobilizations could take place in favor of renewable energies and forest preservation, but historians can at least contribute a better understanding of the dynamics that drove energy revolutions in the past.

President Getulio Vargas visiting the oil-producing site of Mataripe near Salvador da Bahia (1952).

History Teaching in Portugal Perpetuates the Myth of the ‘Good Colonizer’ and Trivializes Slavery, Says Researcher (BBC)

Luis Barrucho – BBC Brasil, London

31 July 2017

An 1826 painting by the French artist Jean-Baptiste Debret depicting enslaved people in Brazil.

“Likewise, as a result of the discoveries, peoples moved to other continents (above all Europeans and African slaves).”

This is how colonialism is still taught in Portugal: “as if black people had chosen to emigrate rather than having been taken by force.”

The criticism comes from Marta Araújo, a Portuguese senior researcher at the Centre for Social Studies (CES) of the University of Coimbra.

From September 2008 to February 2012, she coordinated a detailed research project which concluded that the country’s textbooks “hide the racism in Portuguese colonialism and naturalize slavery.”

Moreover, according to Araújo, “the romantic view persists to this day that we fulfilled a civilizing mission, in other words, that we were good colonizers, more benevolent than other European peoples.”

“Slavery takes up no more than two or three pages in these books and is treated vaguely and superficially. They also spread distorted ideas. For example, when they discuss the consequences of slavery, the only country given any prominence is Brazil, and even then only to talk about miscegenation,” she explains.

“Behind this lies the aim of highlighting the supposed multiracial character of our largest colony, which would thus be an example of the success of miscegenation policies. In practice, however, we know that it did not happen the way it is presented,” she objects.

Araújo says “nothing has changed” since 2012 and argues that this lack of understanding of the subject does real harm.

“This narrative produces a series of consequences, from poorer data collection on ethnic-racial discrimination to the very refusal to admit that we have a problem with racism,” she says.

According to Araújo, Portuguese textbooks continue to promote a “romantic” view of Portuguese colonialism. (Image: Jean-Baptiste Debret)

‘Passive victims?’

To carry out the study, Araújo had the help of other researchers. The main focus was an analysis of the country’s five best-selling history textbooks for pupils in the so-called 3rd Cycle of Basic Education (ages 12 to 14), which spans the 7th to 9th years.

The team also examined public policies, interviewed historians and educators, observed classes and ran workshops with students.

In one of those workshops, the researchers witnessed a scene that caught their attention, Araújo recalls.

On that occasion, the students were surprised to learn that the enslaved populations themselves had revolted, and to discover the true significance of the quilombos, the destinations of runaway slaves: usually hidden, fortified settlements deep in the forests.

“In other countries there is much more openness to discussing how these populations fought against oppression. But in the Portuguese case the students could not even imagine that slaves freed themselves; they went on believing that all of them were passive victims of their situation. It is a very resigned notion,” she says.

Araújo notes that in the textbooks analyzed “there is no allusion to the Haitian Revolution (the bloody conflict that culminated in the abolition of slavery and the independence of the country, which became the first republic governed by people of African descent).”

The quilombos, the researcher adds, are depicted as “places where black people danced on a feast day.”

“As a result, these versions end up becoming consensus and never raise the controversies we need in order to problematize the teaching of the history of Africa.”

‘A romantic view’

Araújo says that, unlike those of other countries, Portuguese textbooks continue to promote a “romantic” view of Portuguese colonialism.

“The narrative endures that ours was a friendly colonialism, which produced multicultural and multiracial societies, with Brazil as the example,” she says.

Ironically, however, the other colonial powers of that era are not portrayed in the same way, she observes.

“When the discovery of the Americas comes up, the Spanish are described as extremely violent, always in contrast with the supposed benevolence of Portuguese colonialism. The French, British and Belgian empires, meanwhile, are branded racist,” she notes.

“On the other hand, the racial question is never raised in relation to Portuguese colonialism. There is a growing depoliticization. Dutch textbooks, for example, attribute slavery to the Portuguese,” she adds.

According to her, this idea of the “benevolence of the Portuguese colonizer” found an echo in Lusotropicalism, the thesis developed by the Brazilian social scientist Gilberto Freyre about Portugal’s relationship with the tropics.

Broadly speaking, Freyre argued that the Portuguese capacity for relating to the tropics, driven not by political or economic interest but by a supposedly innate empathy, stemmed from the Portuguese people’s own hybrid ethnic origin, their bicontinental history and their long contact with Moors and Jews on the Iberian Peninsula.

Although it was rejected by Getúlio Vargas’s Estado Novo (1930-1945) because of the importance it attached to miscegenation and the interpenetration of cultures, Lusotropicalism gained force as a piece of propaganda during the dictatorship of Portugal’s António de Oliveira Salazar (1932-1968). A simplified, nationalist version of the thesis ended up guiding the regime’s foreign policy.

“The fact is that the racial question has never been debated in Portugal,” Araújo stresses.

A Portuguese textbook says that African slaves “moved to other continents.” (Image: Marta Araújo)

‘No response’

The researcher says she sent the results of the study to the Portuguese Ministry of Education but never received a reply.

“Our sense is that those in charge believe everything is fine as it is, and that palliative measures, such as seasonal cultural festivals, can substitute for a serious examination of such an important subject,” she charges.

In that vein, Araújo praises the Brazilian initiative of 2003 that made the teaching of Afro-Brazilian and indigenous history and culture compulsory in all schools, public and private, from primary school through secondary education.

“We need to fight racism, but that will not be possible unless we change the way we teach our history,” she concludes.

Contacted by BBC Brasil, the Portuguese Ministry of Education had not responded by the time this report was published.

The Fight Over the 1619 Project Is Not About the Facts (The Atlantic)

theatlantic.com

Adam Serwer, Dec. 23, 2019


A dispute between a small group of scholars and the authors of The New York Times Magazine’s issue on slavery represents a fundamental disagreement over the trajectory of American society.

An engraving of a slave auction in Charleston, South Carolina
Bettmann / Getty

This article was updated at 7:35 p.m. ET on December 23, 2019

When The New York Times Magazine published its 1619 Project in August, people lined up on the street in New York City to get copies. Since then, the project—a historical analysis of how slavery shaped American political, social, and economic institutions—has spawned a podcast, a high-school curriculum, and an upcoming book. For Nikole Hannah-Jones, the reporter who conceived of the project, the response has been deeply gratifying.

“They had not seen this type of demand for a print product of The New York Times, they said, since 2008, when people wanted copies of Obama’s historic presidency edition,” Hannah-Jones told me. “I know when I talk to people, they have said that they feel like they are understanding the architecture of their country in a way that they had not.”

U.S. history is often taught and popularly understood through the eyes of its great men, who are seen as either heroic or tragic figures in a global struggle for human freedom. The 1619 Project, named for the date of the first arrival of Africans on American soil, sought to place “the consequences of slavery and the contributions of black Americans at the very center of our national narrative.” Viewed from the perspective of those historically denied the rights enumerated in America’s founding documents, the story of the country’s great men necessarily looks very different.

The reaction to the project was not universally enthusiastic. Several weeks ago, the Princeton historian Sean Wilentz, who had criticized the 1619 Project’s “cynicism” in a lecture in November, began quietly circulating a letter objecting to the project, and some of Hannah-Jones’s work in particular. The letter acquired four signatories—James McPherson, Gordon Wood, Victoria Bynum, and James Oakes, all leading scholars in their field. They sent their letter to three top Times editors and the publisher, A. G. Sulzberger, on December 4. A version of that letter was published on Friday, along with a detailed rebuttal from Jake Silverstein, the editor of the Times Magazine.

The letter sent to the Times says, “We applaud all efforts to address the foundational centrality of slavery and racism to our history,” but then veers into harsh criticism of the 1619 Project. The letter refers to “matters of verifiable fact” that “cannot be described as interpretation or ‘framing’” and says the project reflected “a displacement of historical understanding by ideology.” Wilentz and his fellow signatories didn’t just dispute the Times Magazine’s interpretation of past events, but demanded corrections.

In the age of social-media invective, a strongly worded letter might not seem particularly significant. But given the stature of the historians involved, the letter is a serious challenge to the credibility of the 1619 Project, which has drawn its share not just of admirers but also critics.

Nevertheless, some historians who declined to sign the letter wondered whether the letter was intended less to resolve factual disputes than to discredit laymen who had challenged an interpretation of American national identity that is cherished by liberals and conservatives alike.

“I think had any of the scholars who signed the letter contacted me or contacted the Times with concerns [before sending the letter], we would’ve taken those concerns very seriously,” Hannah-Jones said. “And instead there was kind of a campaign to kind of get people to sign on to a letter that was attempting really to discredit the entire project without having had a conversation.”

Underlying each of the disagreements in the letter is not just a matter of historical fact but a conflict about whether Americans, from the Founders to the present day, are committed to the ideals they claim to revere. And while some of the critiques can be answered with historical fact, others are questions of interpretation grounded in perspective and experience.

In fact, the harshness of the Wilentz letter may obscure the extent to which its authors and the creators of the 1619 Project share a broad historical vision. Both sides agree, as many of the project’s right-wing critics do not, that slavery’s legacy still shapes American life—an argument that is less radical than it may appear at first glance. If you think anti-black racism still shapes American society, then you are in agreement with the thrust of the 1619 Project, though not necessarily with all of its individual arguments.

The clash between the Times authors and their historian critics represents a fundamental disagreement over the trajectory of American society. Was America founded as a slavocracy, and are current racial inequities the natural outgrowth of that? Or was America conceived in liberty, a nation haltingly redeeming itself through its founding principles? These are not simple questions to answer, because the nation’s pro-slavery and anti-slavery tendencies are so closely intertwined.

The letter is rooted in a vision of American history as a slow, uncertain march toward a more perfect union. The 1619 Project, and Hannah-Jones’s introductory essay in particular, offer a darker vision of the nation, in which Americans have made less progress than they think, and in which black people continue to struggle indefinitely for rights they may never fully realize. Inherent in that vision is a kind of pessimism, not about black struggle but about the sincerity and viability of white anti-racism. It is a harsh verdict, and one of the reasons the 1619 Project has provoked pointed criticism alongside praise.

Americans need to believe that, as Martin Luther King Jr. said, the arc of history bends toward justice. And they are rarely kind to those who question whether it does.

Most Americans still learn very little about the lives of the enslaved, or how the struggle over slavery shaped a young nation. Last year, the Southern Poverty Law Center found that few American high-school students know that slavery was the cause of the Civil War, that the Constitution protected slavery without explicitly mentioning it, or that ending slavery required a constitutional amendment.

“The biggest obstacle to teaching slavery effectively in America is the deep, abiding American need to conceive of and understand our history as ‘progress,’ as the story of a people and a nation that always sought the improvement of mankind, the advancement of liberty and justice, the broadening of pursuits of happiness for all,” the Yale historian David Blight wrote in the introduction to the report. “While there are many real threads to this story—about immigration, about our creeds and ideologies, and about race and emancipation and civil rights, there is also the broad, untidy underside.”

In conjunction with the Pulitzer Center, the Times has produced educational materials based on the 1619 Project for students—one of the reasons Wilentz told me he and his colleagues wrote the letter. But the materials are intended to enhance traditional curricula, not replace them. “I think that there is a misunderstanding that this curriculum is meant to replace all of U.S. history,” Silverstein told me. “It’s being used as supplementary material for teaching American history.” Given the state of American education on slavery, some kind of adjustment is sorely needed.

Published 400 years after the first Africans were brought to Virginia, the project asked readers to consider “what it would mean to regard 1619 as our nation’s birth year.” The special issue of the Times Magazine included essays from the Princeton historian Kevin Kruse, who argued that sprawl in Atlanta is a consequence of segregation and white flight; the Times columnist Jamelle Bouie, who posited that American countermajoritarianism was shaped by pro-slavery politicians seeking to preserve the peculiar institution; and the journalist Linda Villarosa, who traced racist stereotypes about higher pain tolerance in black people from the 18th century to the present day. The articles that drew the most attention and criticism, though, were Hannah-Jones’s introductory essay chronicling black Americans’ struggle to “make democracy real” and the sociologist Matthew Desmond’s essay linking the crueler aspects of American capitalism to the labor practices that arose under slavery.

The letter’s signatories recognize the problem the Times aimed to remedy, Wilentz told me. “Each of us, all of us, think that the idea of the 1619 Project is fantastic. I mean, it’s just urgently needed. The idea of bringing to light not only scholarship but all sorts of things that have to do with the centrality of slavery and of racism to American history is a wonderful idea,” he said. In a subsequent interview, he said, “Far from an attempt to discredit the 1619 Project, our letter is intended to help it.”

The letter disputes a passage in Hannah-Jones’s introductory essay, which lauds the contributions of black people to making America a full democracy and says that “one of the primary reasons the colonists decided to declare their independence from Britain was because they wanted to protect the institution of slavery” as abolitionist sentiment began rising in Britain.

This argument is explosive. From abolition to the civil-rights movement, activists have reached back to the rhetoric and documents of the founding era to present their claims to equal citizenship as consonant with the American tradition. The Wilentz letter contends that the 1619 Project’s argument concedes too much to slavery’s defenders, likening it to South Carolina Senator John C. Calhoun’s assertion that “there is not a word of truth” in the Declaration of Independence’s famous phrase that “all men are created equal.” Where Wilentz and his colleagues see the rising anti-slavery movement in the colonies and its influence on the Revolution as a radical break from millennia in which human slavery was accepted around the world, Hannah-Jones’s essay outlines how the ideology of white supremacy that sustained slavery still endures today.

“To teach children that the American Revolution was fought in part to secure slavery would be giving a fundamental misunderstanding not only of what the American Revolution was all about but what America stood for and has stood for since the Founding,” Wilentz told me. Anti-slavery ideology was a “very new thing in the world in the 18th century,” he said, and “there was more anti-slavery activity in the colonies than in Britain.”

Hannah-Jones hasn’t budged from her conviction that slavery helped fuel the Revolution. “I do still back up that claim,” she told me last week—before Silverstein’s rebuttal was published—although she says she phrased it too strongly in her essay, in a way that might mislead readers into thinking that support for slavery was universal. “I think someone reading that would assume that this was the case: all 13 colonies and most people involved. And I accept that criticism, for sure.” She said that as the 1619 Project is expanded into a history curriculum and published in book form, the text will be changed to make sure claims are properly contextualized.

On this question, the critics of the 1619 Project are on firm ground. Although some southern slave owners likely were fighting the British to preserve slavery, as Silverstein writes in his rebuttal, the Revolution was kindled in New England, where prewar anti-slavery sentiment was strongest. Early patriots like James Otis, John Adams, and Thomas Paine were opposed to slavery, and the Revolution helped fuel abolitionism in the North.

Historians who are in neither Wilentz’s camp nor the 1619 Project’s say both have a point. “I do not agree that the American Revolution was just a slaveholders’ rebellion,” Manisha Sinha, a history professor at the University of Connecticut and the author of The Slave’s Cause: A History of Abolition, told me.* “But also understand that the original Constitution did give some ironclad protections to slavery without mentioning it.”

The most radical thread in the 1619 Project is not its contention that slavery’s legacy continues to shape American institutions; it’s the authors’ pessimism that a majority of white people will abandon racism and work with black Americans toward a more perfect union. Every essay tracing racial injustice from slavery to the present day speaks to the endurance of racial caste. And it is this profound pessimism about white America that many of the 1619 Project’s critics find most galling.

Newt Gingrich called the 1619 Project a “lie,” arguing that “there were several hundred thousand white Americans who died in the Civil War in order to free the slaves.” In City Journal, the historian Allen Guelzo dismissed the Times Magazine project as a “conspiracy theory” developed from the “chair of ultimate cultural privilege in America, because in no human society has an enslaved people suddenly found itself vaulted into positions of such privilege, and with the consent—even the approbation—of those who were once the enslavers.” The conservative pundit Erick Erickson went so far as to accuse the Times of adopting “the Neo-Confederate world view” that the “South actually won the Civil War by weaving itself into the fabric of post war society so it can then discredit the entire American enterprise.” Erickson’s bizarre sleight of hand turns the 1619 Project’s criticism of ongoing racial injustice into a brief for white supremacy.

The project’s pessimism has drawn criticism from the left as well as the right. Hannah-Jones’s contention that “anti-black racism runs in the very DNA of this country” drew a rebuke from James Oakes, one of the Wilentz letter’s signatories. In an interview with the World Socialist Web Site, Oakes said, “The function of those tropes is to deny change over time … The worst thing about it is that it leads to political paralysis. It’s always been here. There’s nothing we can do to get out of it. If it’s the DNA, there’s nothing you can do. What do you do? Alter your DNA?”

These are objections not to misstatements of historical fact, but to the argument that anti-black racism is a more intractable problem than most Americans are willing to admit. A major theme of the 1619 Project is that the progress that has been made has been fragile and reversible—and has been achieved in spite of the nation’s true founding principles, which are not the lofty ideals few Americans genuinely believe in. Chances are, what you think of the 1619 Project depends on whether you believe someone might reasonably come to such a despairing conclusion—whether you agree with it or not.

Wilentz reached out to a larger group of historians, but ultimately sent a letter signed by five historians who had publicly criticized the 1619 Project in interviews with the World Socialist Web Site. He told me that the idea of trying to rally a larger group was “misconceived,” citing the holiday season and the end of the semester, among other factors. (A different letter written by Wilentz, calling for the impeachment of President Donald Trump, quickly amassed hundreds of signatures last week.) The refusal of other historians to sign on, despite their misgivings about some claims made by the 1619 Project, speaks to a divide over whether the letter was focused on correcting specific factual inaccuracies or aimed at discrediting the project more broadly.

Sinha saw an early version of the letter that was circulated among a larger group of historians. But, despite her disagreement with some of the assertions in the 1619 Project, she said she wouldn’t have signed it if she had been asked to. “There are legitimate critiques that one can engage in discussion with, but for them to just kind of dismiss the entire project in that manner, I thought, was really unwise,” she said. “It was a worthy thing to actually shine a light on a subject that the average person on the street doesn’t know much about.”

Although the letter writers deny that their objections are merely matters of “interpretation or ‘framing,’” the question of whether black Americans have fought their freedom struggles “largely alone,” as Hannah-Jones put it in her essay, is subject to vigorous debate. Viewed through the lens of major historical events—from anti-slavery Quakers organizing boycotts of goods produced through slave labor, to abolitionists springing fugitive slaves from prison, to union workers massing at the March on Washington—the struggle for black freedom has been an interracial struggle. Frederick Douglass had William Lloyd Garrison; W. E. B. Du Bois had Moorfield Storey; Martin Luther King Jr. had Stanley Levison.

“The fight for black freedom is a universal fight; it’s a fight for everyone. In the end, it affected the fight for women’s rights—everything. That’s the glory of it,” Wilentz told me. “To minimize that in any way is, I think, bad for understanding the radical tradition in America.”

But looking back to the long stretches of night before the light of dawn broke—the centuries of slavery and the century of Jim Crow that followed—“largely alone” seems more than defensible. Douglass had Garrison, but the onetime Maryland slave had to go north to find him. The millions who continued to labor in bondage until 1865 struggled, survived, and resisted far from the welcoming arms of northern abolitionists.

“I think one would be very hard-pressed to look at the factual record from 1619 to the present of the black freedom movement and come away with any conclusion other than that most of the time, black people did not have a lot of allies in that movement,” Hannah-Jones told me. “It is not saying that black people only fought alone. It is saying that most of the time we did.”

Nell Irvin Painter, a professor emeritus of history at Princeton who was asked to sign the letter, had objected to the 1619 Project’s portrayal of the arrival of African laborers in 1619 as slaves. The 1619 Project was not history “as I would write it,” Painter told me. But she still declined to sign the Wilentz letter.

“I felt that if I signed on to that, I would be signing on to the white guy’s attack of something that has given a lot of black journalists and writers a chance to speak up in a really big way. So I support the 1619 Project as kind of a cultural event,” Painter said. “For Sean and his colleagues, true history is how they would write it. And I feel like he was asking me to choose sides, and my side is 1619’s side, not his side, in a world in which there are only those two sides.”

This was a recurrent theme among historians I spoke with who had seen the letter but declined to sign it. While they may have agreed with some of the factual objections in the letter or had other reservations of their own, several told me they thought the letter was an unnecessary escalation.

“The tone to me rather suggested a deep-seated concern about the project. And by that I mean the version of history the project offered. The deep-seated concern is that placing the enslavement of black people and white supremacy at the forefront of a project somehow diminishes American history,” Thavolia Glymph, a history professor at Duke who was asked to sign the letter, told me. “Maybe some of their factual criticisms are correct. But they’ve set a tone that makes it hard to deal with that.”

“I don’t think they think they’re trying to discredit the project,” Painter said. “They think they’re trying to fix the project, the way that only they know how.”

Historical interpretations are often contested, and those debates often reflect the perspective of the participants. To this day, the pro-Confederate “Lost Cause” interpretation of history shapes the mistaken perception that slavery was not the catalyst for the Civil War. For decades, a group of white historians known as the Dunning School, after the Columbia University historian William Archibald Dunning, portrayed Reconstruction as a tragic period of, in his words, the “scandalous misrule of the carpet-baggers and negroes,” brought on by the misguided enfranchisement of black men. As the historian Eric Foner has written, the Dunning School and its interpretation of Reconstruction helped provide moral and historical cover for the Jim Crow system.

In Black Reconstruction in America, W. E. B. Du Bois challenged the consensus of “white historians” who “ascribed the faults and failures of Reconstruction to Negro ignorance and corruption,” and offered what is now considered a more reliable account of the era as an imperfect but noble effort to build a multiracial democracy in the South.

To Wilentz, the failures of earlier scholarship don’t illustrate the danger of a monochromatic group of historians writing about the American past, but rather the risk that ideologues can hijack the narrative. “[It was] when the southern racists took over the historical profession that things changed, and W. E. B. Du Bois fought a very, very courageous fight against all of that,” Wilentz told me. The Dunning School, he said, was “not a white point of view; it’s a southern, racist point of view.”

In the letter, Wilentz portrays the authors of the 1619 Project as ideologues as well. He implies—apparently based on a combative but ambiguous exchange between Hannah-Jones and the writer Wesley Yang on Twitter—that she had discounted objections raised by “white historians” since publication.

Hannah-Jones told me she was misinterpreted. “I rely heavily on the scholarship of historians no matter what race, and I would never discount the work of any historian because that person is white or any other race,” she told me. “I did respond to someone who was saying white scholars were afraid, and I think my point was that history is not objective. And that people who write history are not simply objective arbiters of facts, and that white scholars are no more objective than any other scholars, and that they can object to the framing and we can object to their framing as well.”

When I asked Wilentz about Hannah-Jones’s clarification, he was dismissive. “Fact and objectivity are the foundation of both honest journalism and honest history. And so to dismiss it, to say, ‘No, I’m not really talking about whites’—well, she did, and then she takes it back in those tweets and then says it’s about the inability of anybody to write objective history. That’s objectionable too,” Wilentz told me.

Both Du Bois and the Dunning School saw themselves as having reached the truth by objective means. But as a target of the Dunning School’s ideology, Du Bois understood the motives and blind spots of Dunning School scholars far better than they themselves did.  

“We shall never have a science of history until we have in our colleges men who regard the truth as more important than the defense of the white race,” Du Bois wrote, “and who will not deliberately encourage students to gather thesis material in order to support a prejudice or buttress a lie.”

The problem, as Du Bois argued, is that much of American history has been written by scholars offering ideological claims in place of rigorous historical analysis. But which claims are ideological, and which ones are objective, is not always easy to discern.


*An earlier version of this article contained an incorrect title for historian Manisha Sinha’s book.

Adam Serwer is a staff writer at The Atlantic, where he covers politics.

We Respond to the Historians Who Critiqued The 1619 Project (New York Times)

nytimes.com

Dec. 20, 2019


Letter to the Editor

Five historians wrote to us with their reservations. Our editor in chief replies.

Published Dec. 20, 2019; updated Jan. 4, 2020

The letter below was published in the Dec. 29 issue of The New York Times Magazine.

RE: The 1619 Project

We write as historians to express our strong reservations about important aspects of The 1619 Project. The project is intended to offer a new version of American history in which slavery and white supremacy become the dominant organizing themes. The Times has announced ambitious plans to make the project available to schools in the form of curriculums and related instructional material.

We applaud all efforts to address the enduring centrality of slavery and racism to our history. Some of us have devoted our entire professional lives to those efforts, and all of us have worked hard to advance them. Raising profound, unsettling questions about slavery and the nation’s past and present, as The 1619 Project does, is a praiseworthy and urgent public service. Nevertheless, we are dismayed at some of the factual errors in the project and the closed process behind it.

These errors, which concern major events, cannot be described as interpretation or “framing.” They are matters of verifiable fact, which are the foundation of both honest scholarship and honest journalism. They suggest a displacement of historical understanding by ideology. Dismissal of objections on racial grounds — that they are the objections of only “white historians” — has affirmed that displacement.

On the American Revolution, pivotal to any account of our history, the project asserts that the founders declared the colonies’ independence of Britain “in order to ensure slavery would continue.” This is not true. If supportable, the allegation would be astounding — yet every statement offered by the project to validate it is false. Some of the other material in the project is distorted, including the claim that “for the most part,” black Americans have fought their freedom struggles “alone.”

Still other material is misleading. The project criticizes Abraham Lincoln’s views on racial equality but ignores his conviction that the Declaration of Independence proclaimed universal equality, for blacks as well as whites, a view he upheld repeatedly against powerful white supremacists who opposed him. The project also ignores Lincoln’s agreement with Frederick Douglass that the Constitution was, in Douglass’s words, “a GLORIOUS LIBERTY DOCUMENT.” Instead, the project asserts that the United States was founded on racial slavery, an argument rejected by a majority of abolitionists and proclaimed by champions of slavery like John C. Calhoun.

The 1619 Project has not been presented as the views of individual writers — views that in some cases, as on the supposed direct connections between slavery and modern corporate practices, have so far failed to establish any empirical veracity or reliability and have been seriously challenged by other historians. Instead, the project is offered as an authoritative account that bears the imprimatur and credibility of The New York Times. Those connected with the project have assured the public that its materials were shaped by a panel of historians and have been scrupulously fact-checked. Yet the process remains opaque. The names of only some of the historians involved have been released, and the extent of their involvement as “consultants” and fact checkers remains vague. The selective transparency deepens our concern.

We ask that The Times, according to its own high standards of accuracy and truth, issue prominent corrections of all the errors and distortions presented in The 1619 Project. We also ask for the removal of these mistakes from any materials destined for use in schools, as well as in all further publications, including books bearing the name of The New York Times. We ask finally that The Times reveal fully the process through which the historical materials were and continue to be assembled, checked and authenticated.

Sincerely,

Victoria Bynum, distinguished emerita professor of history, Texas State University;
James M. McPherson, George Henry Davis 1886 emeritus professor of American history, Princeton University;
James Oakes, distinguished professor, the Graduate Center, the City University of New York;
Sean Wilentz, George Henry Davis 1886 professor of American history, Princeton University;
Gordon S. Wood, Alva O. Wade University emeritus professor and emeritus professor of history, Brown University.


Editor’s response:

Since The 1619 Project was published in August, we have received a great deal of feedback from readers, many of them educators, academics and historians. A majority have reacted positively to the project, but there have also been criticisms. Some I would describe as constructive, noting episodes we might have overlooked; others have treated the work more harshly. We are happy to accept all of this input, as it helps us continue to think deeply about the subject of slavery and its legacy.

The letter from Professors Bynum, McPherson, Oakes, Wilentz and Wood differs from the previous critiques we have received in that it contains the first major request for correction. We are familiar with the objections of the letter writers, as four of them have been interviewed in recent months by the World Socialist Web Site. We’re glad for a chance to respond directly to some of their objections.

Though we respect the work of the signatories, appreciate that they are motivated by scholarly concern and applaud the efforts they have made in their own writings to illuminate the nation’s past, we disagree with their claim that our project contains significant factual errors and is driven by ideology rather than historical understanding. While we welcome criticism, we don’t believe that the request for corrections to The 1619 Project is warranted.

The project was intended to address the marginalization of African-American history in the telling of our national story and examine the legacy of slavery in contemporary American life. We are not ourselves historians, it is true. We are journalists, trained to look at current events and situations and ask the question: Why is this the way it is? In the case of the persistent racism and inequality that plague this country, the answer to that question led us inexorably into the past — and not just for this project. The project’s creator, Nikole Hannah-Jones, a staff writer at the magazine, has consistently used history to inform her journalism, primarily in her work on educational segregation (work for which she has been recognized with numerous honors, including a MacArthur Fellowship).

Though we may not be historians, we take seriously the responsibility of accurately presenting history to readers of The New York Times. The letter writers express concern about a “closed process” and an opaque “panel of historians,” so I’d like to make clear the steps we took. We did not assemble a formal panel for this project. Instead, during the early stages of development, we consulted with numerous scholars of African-American history and related fields, in a group meeting at The Times as well as in a series of individual conversations. (Five of those who initially consulted with us — Mehrsa Baradaran of the University of California, Irvine; Matthew Desmond and Kevin M. Kruse, both of Princeton University; and Tiya Miles and Khalil G. Muhammad, both of Harvard University — went on to publish articles in the issue.) After those consultations, writers conducted their own research, reading widely, examining primary documents and artifacts and interviewing historians. Finally, during the fact-checking process, our researchers carefully reviewed all the articles in the issue with subject-area experts. This is no different from what we do on any article.

As the five letter writers well know, there are often debates, even among subject-area experts, about how to see the past. Historical understanding is not fixed; it is constantly being adjusted by new scholarship and new voices. Within the world of academic history, differing views exist, if not over what precisely happened, then about why it happened, who made it happen, how to interpret the motivations of historical actors and what it all means.

The passages cited in the letter, regarding the causes of the American Revolution and the attitudes toward black equality of Abraham Lincoln, are good examples of this. Both are found in the lead essay by Hannah-Jones. We can hardly claim to have studied the Revolutionary period as long as some of the signatories, nor do we presume to tell them anything they don’t already know, but I think it would be useful for readers to hear why we believe that Hannah-Jones’s claim that “one of the primary reasons the colonists decided to declare their independence from Britain was because they wanted to protect the institution of slavery” is grounded in the historical record.

The work of various historians, among them David Waldstreicher and Alfred W. and Ruth G. Blumrosen, supports the contention that uneasiness among slaveholders in the colonies about growing antislavery sentiment in Britain and increasing imperial regulation helped motivate the Revolution. One main episode that these and other historians refer to is the landmark 1772 decision of the British high court in Somerset v. Stewart. The case concerned a British customs agent named Charles Stewart who bought an enslaved man named Somerset and took him to England, where he briefly escaped. Stewart captured Somerset and planned to sell him and ship him to Jamaica, only for the chief justice, Lord Mansfield, to declare this unlawful, because chattel slavery was not supported by English common law.

It is true, as Professor Wilentz has noted elsewhere, that the Somerset decision did not legally threaten slavery in the colonies, but the ruling caused a sensation nonetheless. Numerous colonial newspapers covered it and warned of the tyranny it represented. Multiple historians have pointed out that in part because of the Somerset case, slavery joined other issues in helping to gradually drive apart the patriots and their colonial governments. The British often tried to undermine the patriots by mocking their hypocrisy in fighting for liberty while keeping Africans in bondage, and colonial officials repeatedly encouraged enslaved people to seek freedom by fleeing to British lines. For their part, large numbers of the enslaved came to see the struggle as one between freedom and continued subjugation. As Waldstreicher writes, “The black-British alliance decisively pushed planters in these [Southern] states toward independence.”

The culmination of this was the Dunmore Proclamation, issued in late 1775 by the colonial governor of Virginia, which offered freedom to any enslaved person who fled his plantation and joined the British Army. A member of South Carolina’s delegation to the Continental Congress wrote that this act did more to sever the ties between Britain and its colonies “than any other expedient which could possibly have been thought of.” The historian Jill Lepore writes in her recent book, “These Truths: A History of the United States,” “Not the taxes and the tea, not the shots at Lexington and Concord, not the siege of Boston; rather, it was this act, Dunmore’s offer of freedom to slaves, that tipped the scales in favor of American independence.” And yet how many contemporary Americans have ever even heard of it? Enslaved people at the time certainly knew about it. During the Revolution, thousands sought freedom by taking refuge with British forces.

As for the question of Lincoln’s attitudes on black equality, the letter writers imply that Hannah-Jones was unfairly harsh toward our 16th president. Admittedly, in an essay that covered several centuries and ranged from the personal to the historical, she did not set out to explore in full his continually shifting ideas about abolition and the rights of black Americans. But she provides an important historical lesson by simply reminding the public, which tends to view Lincoln as a saint, that for much of his career, he believed that a necessary prerequisite for freedom would be a plan to encourage the four million formerly enslaved people to leave the country. To be sure, at the end of his life, Lincoln’s racial outlook had evolved considerably in the direction of real equality. Yet the story of abolition becomes more complicated, and more instructive, when readers understand that even the Great Emancipator was ambivalent about full black citizenship.

The letter writers also protest that Hannah-Jones, and the project’s authors more broadly, ignore Lincoln’s admiration, which he shared with Frederick Douglass, for the commitment to liberty espoused in the Constitution. This seems to me a more general point of dispute. The writers believe that the Revolution and the Constitution provided the framework for the eventual abolition of slavery and for the equality of black Americans, and that our project insufficiently credits both the founders and 19th-century Republican leaders like Lincoln, Thaddeus Stevens, Charles Sumner and others for their contributions toward achieving these goals.

It may be true that under a less egalitarian system of government, slavery would have continued for longer, but the United States was still one of the last nations in the Americas to abolish the institution — only Cuba and Brazil did so after us. And while our democratic system has certainly led to many progressive advances for the rights of minority groups over the past two centuries, these advances, as Hannah-Jones argues in her essay, have almost always come as a result of political and social struggles in which African-Americans have generally taken the lead, not as a working-out of the immanent logic of the Constitution.

And yet for all that, it is difficult to argue that equality has ever been truly achieved for black Americans — not in 1776, not in 1865, not in 1964, not in 2008 and not today. The very premise of The 1619 Project, in fact, is that many of the inequalities that continue to afflict the nation are a direct result of the unhealed wound created by 250 years of slavery and an additional century of second-class citizenship and white-supremacist terrorism inflicted on black people (together, those two periods account for 88 percent of our history since 1619). These inequalities were the starting point of our project — the facts that, to take just a few examples, black men are nearly six times as likely to wind up in prison as white men, or that black women are three times as likely to die in childbirth as white women, or that the median family wealth for white people is $171,000, compared with just $17,600 for black people. The rampant discrimination that black people continue to face across nearly every aspect of American life suggests that neither the framework of the Constitution nor the strenuous efforts of political leaders in the past and the present, both white and black, has yet been able to achieve the democratic ideals of the founding for all Americans.

This is an important discussion to have, and we are eager to see it continue. To that end, we are planning to host public conversations next year among academics with differing perspectives on American history. Good-faith critiques of our project only help us refine and improve it — an important goal for us now that we are in the process of expanding it into a book. For example, we have heard from several scholars who profess to admire the project a great deal but wish it had included some mention of African slavery in Spanish Florida during the century before 1619. Though we stand by the logic of marking the beginning of American slavery with the year it was introduced in the English colonies, this feedback has helped us think about the importance of considering the prehistory of the period our project addresses.

Valuable critiques may come from many sources. The letter misperceives our attitudes when it charges that we dismiss objections on racial grounds. This appears to be a reference not to anything published in The 1619 Project itself, but rather to a November Twitter post from Hannah-Jones in which she questioned whether “white historians” have always produced objective accounts of American history. As is so often the case on Twitter, context is important. In this instance, Hannah-Jones was responding to a post, since deleted, from another user claiming that many “white historians” objected to the project but were hesitant to speak up. In her reply, she was trying to make the point that for the most part, the history of this country has been told by white historians (some of whom, as in the case of the Dunning School, which grossly miseducated Americans about the history of Reconstruction for much of the 20th century, produced accounts that were deeply flawed), and that to truly understand the fullness and complexity of our nation’s story, we need a greater variety of voices doing the telling.

That, above all, is what we hoped our project would do: expand the reader’s sense of the American past. (This is how some educators are using it to supplement their teaching of United States history.) That is what the letter writers have done, in different ways, over the course of their distinguished careers and in their many books. Though we may disagree on some important matters, we are grateful for their input and their interest in discussing these fundamental questions about the country’s history.

Sincerely,
Jake Silverstein
Editor in chief


The 1619 Project was launched in August 2019, on the 400th anniversary of the arrival of the first enslaved Africans in the English colonies that would become the United States. It consisted of two components: a special issue of the magazine, containing 10 essays exploring the links between contemporary American life and the legacy of slavery, as well as a series of original poetry and fiction about key moments in the last 400 years; and a special broadsheet section, produced in collaboration with the Smithsonian’s National Museum of African American History and Culture. This work was converted into supplementary educational materials in partnership with the Pulitzer Center. The materials are available free on the Pulitzer Center’s website, pulitzercenter.org.

A new DNA study offers insight into the horrific story of the trans-Atlantic slave trade (CNN)

By Harmeet Kaur, CNN

Updated 1438 GMT (2238 HKT) July 26, 2020 – original article

This drawing of the Liverpool slave ship Brooks was commissioned by abolitionists to depict the inhumanity of the slave trade by showing how Africans were crammed below decks.


(CNN) Much of what we know about the horrors of slavery in the Americas comes from historical records. But new research shows that evidence of the slave trade’s atrocities can also be found in the DNA of African Americans.

A study conducted by the consumer genetics company 23andMe, published Thursday in the American Journal of Human Genetics, offers some new insight into the consequences of the trans-Atlantic slave trade, from the scale at which enslaved Black women were raped by their White masters to the less-documented slave trade that occurred within the Americas.

It’s one of the largest studies of its kind, thanks in part to the massive database of 23andMe customers from which researchers were able to recruit consenting participants.

The authors compiled genetic data from more than 50,000 people from the Americas, Western Europe and Atlantic Africa, and compared it against the historical records of where enslaved people were taken from and where they were enslaved. Together, the data and records tell a story about the complicated roots of the African diaspora in the Americas.
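The basic logic of that comparison can be sketched in a few lines of code. This is a toy illustration only, with invented numbers; it is not the study’s actual method, dataset, or regional estimates:

    # Toy sketch: compare documented embarkation shares with observed ancestry.
    # All figures below are invented for illustration; the real study works
    # from 50,000+ genomes and detailed historical shipping records.

    expected = {  # hypothetical share of embarkations per region (records)
        "Nigerian": 0.15,
        "Senegambian": 0.20,
        "Coastal West African": 0.40,
        "Congolese": 0.25,
    }

    observed = {  # hypothetical average ancestry fraction in participants' DNA
        "Nigerian": 0.30,
        "Senegambian": 0.08,
        "Coastal West African": 0.38,
        "Congolese": 0.24,
    }

    for region, exp in expected.items():
        ratio = observed[region] / exp
        label = "overrepresented" if ratio > 1 else "underrepresented"
        print(f"{region}: observed/expected = {ratio:.2f} ({label})")

A ratio well above 1 is the kind of signal the authors read as evidence of trade routes not captured in the embarkation records, while a ratio well below 1 points to grimmer explanations, as the findings below describe.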

For the most part, the DNA was consistent with what the documents show. But, the study authors said, there were some notable differences.

Here’s some of what they found, and what it reveals about the history of slavery.

It shows the legacy of rape against enslaved women

The enslaved workers who were taken from Africa and brought to the Americas were disproportionately male. Yet, genetic data shows that enslaved women contributed to gene pools at a higher rate.

In the US and parts of the Caribbean colonized by the British, African women contributed to the gene pool about 1.5 to 2 times more than African men. In Latin America, that rate was even higher. Enslaved women contributed to the gene pool in Central America, the Latin Caribbean and parts of South America about 13 to 17 times more.

To the extent that people of African descent in the Americas had European ancestry, they were more likely to have White fathers in their lineage than White mothers in all regions except the Latin Caribbean and Central America.

What that suggests: The biases in the gene pool toward enslaved African women and European men signal generations of rape and sexual exploitation of enslaved women at the hands of White owners, authors Steven Micheletti and Joanna Mountain wrote in an email to CNN.

That enslaved Black women were often raped by their masters “is not a surprise” to any Black person living in the US, says Ravi Perry, a political science professor at Howard University. Numerous historical accounts confirm this reality, as the study’s authors note.

But the regional differences between the US and Latin America are what’s striking.

The US and other former British colonies generally forced enslaved people to have children in order to maintain workforces — which could explain why the children of an enslaved woman were more likely to have an enslaved father. Segregation in the US could also be a factor, the authors theorized.

By contrast, the researchers point to the presence of racial whitening policies in several Latin American countries, which brought in European immigrants with the aim of diluting the African race. Such policies, as well as higher mortality rates of enslaved men, could explain the disproportionate contributions to the gene pool by enslaved women, the authors wrote.

It sheds light on the intra-American slave trade

Far more people in the US and Latin America have Nigerian ancestry than expected, given what historical records show about the enslaved people that embarked from ports along present-day Nigeria into the Americas, according to the study.

What that suggests: This is most likely a reflection of the intercolonial slave trade that occurred largely from the British Caribbean to other parts of the Americas between 1619 and 1807, Micheletti and Mountain wrote.

Once enslaved Africans arrived in the Americas, many were put on new ships and transported to other regions.

“Documented intra-American voyages indicate that the vast majority of enslaved people were transported from the British Caribbean to other parts of the Americas, presumably to maintain the slave economy as transatlantic slave trading was increasingly prohibited,” the authors wrote in the study.

When enslaved people from Nigeria who came into the British Caribbean were traded into other areas, their ancestry spread to regions that didn’t directly trade with that part of Africa.

It shows the dire conditions enslaved people faced

Conversely, ancestry from the region of Senegal and the Gambia is underrepresented given the proportion of enslaved people who embarked from there, Micheletti and Mountain said.

The reasons for that are grim.

What that suggests: One possible explanation the authors gave for the low prevalence of Senegambian ancestry is that over time, more and more children from the region were forced onto ships to make the journey to the Americas.

The unsanitary conditions in the holds of the ships led to malnourishment and illness, the authors wrote, meaning that fewer of them survived.

Another possibility is the dangerous conditions that enslaved people from the region faced once they arrived. A significant proportion of Senegambians were taken to rice plantations in the US, which were often rampant with malaria, Micheletti and Mountain said.

The study has limitations

The 23andMe study is significant in how it juxtaposes genetic data with historical records, as well as in the size of its dataset, experts who weren’t involved in the study told CNN.

“I’m not aware of anyone that has done such a comprehensive job of putting these things together, by a long shot,” said Simon Gravel, a human genetics professor at McGill University. “It’s really big progress.”

Still, he said, the research has its limitations.

In order to conduct their analysis, the scientists had to make “a lot of simplifications,” Gravel said. The researchers broke down African ancestry into four corresponding regions on the continent’s Atlantic Coast: Nigerian, Senegambian, Coastal West African and Congolese.

“That doesn’t tell you the whole story,” Gravel added, though he said more data is needed in the broader field of genomics for the researchers to drill down deeper.

Jada Benn Torres, a genetic anthropologist at Vanderbilt University, also said she would have liked to see a higher proportion of people from Africa included in the study. Out of the more than 50,000 participants, about 2,000 were from Africa.

“From the perspective of human evolutionary genetics, Africa is the most genetically diverse continent,” she wrote in an email to CNN. “In order to adequately capture existing variation, the sample sizes must be large.”

But both Gravel and Benn Torres called the study an exciting start that offers more information about the descendants of enslaved Africans.

And that, the researchers said, was what they set out to do.

“We hope this paper helps people in the Americas of African descent further understand where their ancestors came from and what they overcame,” Micheletti wrote.

“… To me, this is the point, to make a personal connection with the millions of people whose ancestors were forced from Africa into the Americas and to not forget what their ancestors had to endure.”

Students produce the biographical dictionary Excluídos da História (Agência Brasil)

Brazil’s national history olympiad was created in 2009 at Unicamp

Published 15/08/2020 – 18:49, by Akemi Nitahara – Reporter, Agência Brasil – Rio de Janeiro

From the cacique Tibiriçá, born before 1500 and baptized by the Jesuits as Martim Afonso de Sousa, who played an important role in the founding of the city of São Paulo, to Jackson Viana de Paula dos Santos, a young writer born in Rio Branco (AC) in 2000, founder of the Academia Juvenil de Letras and representative of the northern region at the Brazil Conference, at Harvard.

These are the two ends of a timeline that seeks to tell the story of important Brazilian figures who are missing from the official books: 2,251 entries in all, now published as the biographical dictionary Excluídos da História (Excluded from History).

The work was done by the 6,753 students who took part in the fifth phase of last year’s Olimpíada Nacional em História do Brasil (ONHB), between June 3 and 8, 2019, divided into teams of three participants each.

The olympiad was created in 2009 by the Universidade Estadual de Campinas (Unicamp) and currently brings together more than 70,000 middle and high school students in a marathon of knowledge-seeking in Brazilian history. The competition has five online phases, each lasting a week, plus a final exam for the members of the highest-scoring teams to decide the medalists.

It began with samba

The Olimpíada Nacional em História do Brasil (ONHB) is a project begun in 2009 within the Museu Exploratório de Ciências of the Universidade Estadual de Campinas (Unicamp), and it continues to be developed by faculty and graduate students.

The biographical dictionary Excluídos da História was produced by the students who took part in the fifth phase of the Olimpíada Nacional em História do Brasil (ONHB), an initiative created in 2009 by Unicamp. Divulgação/Unicamp/All rights reserved

The olympiad’s coordinator, Cristina Meneguello, explains that the story of the dictionary began with the samba-enredo of Estação Primeira de Mangueira, champion school of last year’s Rio carnival, which brought to the Sapucaí the theme História para Ninar Gente Grande (“History to Lull Grown-Ups to Sleep”).

The verses opened the way for the “heroes of the sheds” with “verses the book erased,” to tell “the history that history does not tell” and to show “a country that is not in the portrait” and the “other side of the same place.” The verses won popular favor even before the official parade, played by street blocos and samba circles across the city.

According to Cristina, discussion of those excluded from history was intense among historians after last year’s carnival, and the theme ran through the entire competition, which began on May 6.

“Right in the first phase of the exam we asked a question using Mangueira’s own samba-enredo. We use a variety of documents: song lyrics, advertisements, more classical historical documents, images and so on. We had already decided that this would be the theme of their task for the fifth phase, and we kept posing questions so they would come to understand the theme from the first phase on,” she recalls.

According to the professor, there was originally no intention of publishing the material produced by the students. But given the richness and diversity of the research submitted, the coordinators decided to share the material with teachers, students and anyone interested, making the content available online.

“We already knew it would turn out to be a very good assignment, because the knowledge they produce out of school is always very surprising. But there were several factors. The first was that the work done by the participants really was very good. Then the template that was created, with those four pages as if from a textbook, turned out to be a very good design and won the silver medal at the Brasil Design Award last year, as educational system design.”

Unknown figures

Students were free to choose their figure, under the criteria that the person be important to the history of Brazil and not remembered in textbooks. Cristina says the result surprised the organizers, with entries on people of local and regional importance, many of them still alive, showing that the participants understood that history is continuously built by diverse actors, including those not singled out by historians.

“It exceeded our expectations. We observed that these unknown figures are Black figures, women important to the history of Brazil, Black women, local leaders. Many wrote entries on people who are still alive. They are Indigenous leaders, people persecuted under the military dictatorship, teachers who were censored under the military dictatorship. In these entries we have everything from figures of colonial Brazil to people who are alive today.”

Some figures were chosen by more than one group, so there are repeated entries in the dictionary, though they take different approaches to the same person.

The group of student Juliana Kreitlon Pereira was one of two that wrote about Mercedes Baptista, the first Black ballerina of the Theatro Municipal do Rio de Janeiro.

The figure was suggested by Juliana, who was in her final year at the Escola Estadual de Dança Maria Olenewa and had learned Mercedes Baptista’s story from the dance history teacher Paulo Melgaço, weeks before the olympiad challenge.

“Mercedes always made a point of bringing Brazilian dance to the stage. That was one of the things that most caught my attention. She worked with Katherine Dunham, a movement researcher and choreographer from the United States. Mercedes saw how much we needed that kind of study in Brazil too. She drew on various cultural movements, things that were already happening in Brazil but had no spotlight. And she always wanted to bring attention to that.”

Mercedes died in 2014; a statue of her was unveiled in 2016 in Largo da Prainha, on the Pequena África circuit in Rio de Janeiro’s port district.

Juliana says she is very happy with the dictionary’s online publication. “I didn’t know it would be published. We worked so hard; I read her whole book, not least because it was very interesting. I thought, well, nothing will come of this. When it was published I was very happy, because more people could get to know this ballerina.”

The team of student Lucas do Herval Costa Teles de Menezes, for its part, decided to write about a figure who represented Rio de Janeiro and was present in everyday life without people noticing, someone who had not been completely erased from history. Their choice has a municipal holiday in his honor in Niterói and gives his name to the station where the ferries from Rio de Janeiro arrive and to the square in front of it, where his statue stands: the Temiminó Indigenous leader Araribóia.

“I found the dynamic this figure had with foreign peoples interesting, in this case the Portuguese and the French. Usually, when we learn about the relationship between Indigenous peoples and the invading European peoples, we don’t think much about identifying those Indigenous peoples; we never learn the individual story of an Indigenous figure. I thought he had a very interesting individual story: he was a leadership figure, deeply involved in more than one political narrative of that era, and that caught my attention.”

Lucas’s group was the only one to remember Araribóia, known as the founder of Niterói and a key figure in the dispute between the Portuguese and the French that ended in the latter’s expulsion.

The olympiad

Registration for the 12th edition of the Olimpíada Nacional em História do Brasil is open until September 7. Teams of three students from the 8th and 9th years of middle school and any year of high school, guided by a teacher, from public and private schools, may register.

Unlike most science olympiads, the ONHB encourages the pursuit of knowledge in history rather than assessing, through a test, what the student already knows.

“Taking part in an olympiad is a learning system. It is very demanding, and it does not want to gauge whether students already know: it gives them time to study, they ask their teacher, they ask one another. There may be a question about something a student has never heard of, never saw at school. But next to it there is a text; he reads it, informs himself, searches the internet and comes back to answer. In that process he has learned history. I am not very interested in whether he already knew, but in whether he learned at that moment; that is our pedagogical goal,” says Cristina Meneguello.

The first edition of the ONHB, in 2009, had 15,000 participants. Last year the number reached 73,000. Because of the Covid-19 pandemic, this year’s competition will be held online, without the in-person final exam that is normally administered at Unicamp.

The phases consist of multiple-choice questions and a task that is graded by other teams. Four hundred finalist teams will be selected, twice the usual number, and 20 gold, 30 silver and 40 bronze medals will be awarded and sent to the schools.

Listen on Radioagência Nacional

Editing: Lílian Beraldo

This Year Will End Eventually. Document It While You Can (New York Times)

nytimes.com

Lesley M. M. Blume

Museums are working overtime to collect artifacts and ephemera from the pandemic and the racial justice movement — and they need your help.

A journal submitted to the Autry Museum by Tanya Gibb, who came down with Covid-19 symptoms on March 5. The donor thought the canceled plans were also representative of the pandemic.
Credit…The Autry Museum of the American West

July 14, 2020, 5:00 a.m. ET

A few weeks ago, a nerdy joke went viral on Twitter: Future historians will be asked which quarter of 2020 they specialize in.

As museum curators and archivists stare down one of the most daunting challenges of their careers — telling the story of the pandemic, followed by severe economic collapse and a nationwide social justice movement — they are imploring individuals across the country to preserve personal materials for posterity, and for possible inclusion in museum archives. It’s an all-hands-on-deck effort, they say.

“Our cultural seismology is being revealed,” said Anthea M. Hartig, the director of the Smithsonian’s National Museum of American History. Of these three earth-shaking events, she said, “The confluence is unlike most anything we’ve seen.”

Museums, she said, are grappling “with the need to comprehend multiple pandemics at once.”

Last August, Dr. Erik Blutinger joined the staff of Mt. Sinai Queens as an emergency medicine physician. He knew that his first year after residency would be intense, but nothing could have prepared him for the trial-by-fire that was Covid-19.

Aware that he was at the epicenter not only of a global pandemic, but of history, Dr. Blutinger, 34, began to take iPhone videos of the scenes in his hospital, which was one of New York City’s hardest hit during the early days of the crisis.

“Everyone is Covid positive in these hallways,” he told the camera in one April 9 recording, which has since been posted on the Mount Sinai YouTube channel, showing the emergency room hallways filled with hissing oxygen tanks, and the surge tents set up outside the building. “All you hear is oxygen. I’m seeing young patients, old patients, people of all age ranges, who are just incredibly sick.”

He estimated that he has recorded over 50 video diaries in total.

In Louisville, Ky., during the protests and unrest that followed the killings of George Floyd and Breonna Taylor, a local filmmaker named Milas Norris rushed to the streets to shoot footage using a Sony camera and a drone.

“It was pretty chaotic,” said Mr. Norris, 24, describing police in riot gear, explosions, and gas and pepper bullets. He said that at first he didn’t know what he would do with the footage; he has since edited and posted some of it on his Instagram and Facebook accounts. “I just knew that I had to document and see what exactly was happening on the front lines.”

NPR producer Nina Gregory collects "personal ambi," or ambient noise from her home in Hollywood, Calif. "It's another form of diary," she said.
Credit…Kemper Bates

About 2,000 miles west, in Los Angeles, NPR producer Nina Gregory, 45, had set up recording equipment on the front patio of her Hollywood home. In March and April, she recorded the absence of city noise. “The sound of birds was so loud it was pinging red on my levels,” she said.

Soon the sounds of nature were replaced by the sounds of helicopters from the Los Angeles Police Department hovering overhead, and the sounds of protesters and police convoys moving through her neighborhood. She recorded all this for her personal records.

“It’s another form of diary,” she said.

Museums have indicated that these kinds of private recordings have critical value as public historical materials. All of us, curators say, are field collectors now.

In the spirit of preservation, Ms. Hartig from the National Museum of American History — along with museum collectors across the country — has begun avid campaigns to “collect the moment.”

“I do think it’s a national reckoning project,” she said. There are “a multitude of ways in which we need to document and understand — and make history a service. This is one of our highest callings.”

Some museums have assembled rapid response field collecting teams to identify and secure storytelling objects and materials. Perhaps the most widely publicized task force, assembled by three Smithsonian museums working in a coalition, dispatched curators to Lafayette Square in Washington, D.C., to identify protest signs for eventual possible collection.

A demonstrator who was photographed by Jason Spear of the National Museum of African American History and Culture in Lafayette Square in June. Mr. Spear is part of the rapid response team working to identify protest signs for possible future collection.
Credit…Jason Spear/NMAAHC Public Affairs Specialist

The collecting task force went into action after June 1, when President Trump ordered Lafayette Square cleared of protesters so he could pose for photos in front of St. John’s Episcopal Church, clutching a bible. Shield-bearing officers and mounted police assailed peaceful protesters there with smoke canisters, pepper bullets, flash grenades and chemical spray. The White House subsequently ordered the construction of an 8-foot-high chain link fence around the perimeter, which protesters covered in art and artifacts.

Taking immediate steps to preserve these materials — many of which were made of paper and vulnerable to the elements — amounted to a curatorial emergency for the Smithsonian’s archivists.

Yet with many museums still closed, or in the earliest stages of reopening, curatorial teams largely cannot yet bring most objects into their facilities. It is falling to individuals to become their own interim museums and archives.

While some curators are loath to suggest a laundry list of items that we should be saving — they say that they don’t want to manipulate the documentation of history, but take their cues from the communities they document — many are imploring us to see historical value in the everyday objects of right now.

“Whatever we’re taking to be ordinary within this abnormal moment can, in fact, serve as an extraordinary artifact to our children’s children,” said Tyree Boyd-Pates, an associate curator at the Autry Museum of the American West, which is asking the public to consider submitting materials such as journal entries, selfies and even sign-of-the-times social media posts (say, a tweet about someone’s quest for toilet paper — screengrab those, he said).

Credit…Lisa Herndon/The Schomburg Center for Research in Black Culture

To this end, curators said, don’t be so quick to edit and delete your cellphone photos right now. “Snapshots are valuable,” said Kevin Young, the director of New York City’s Schomburg Center for Research in Black Culture. “We might look back at one and say, ‘This picture tells more than we thought at the time.’”

At the National Civil Rights Museum in Memphis, the curatorial team will be evaluating and collecting protest materials such as placards, photos, videos and personalized masks — and the personal stories behind them.

“One activist found a tear-gas canister, and he gave it to us,” said Noelle Trent, a director at the museum. “We’re going to have to figure out how to collect items from the opposing side: We have to have the racist posters, the ‘Make America Great’ stuff. We’re going to need that at some point. The danger is that if we don’t have somebody preserving it, they will say this situation was not as bad.”

And there is perhaps no article more representative of this year than the mask, which has “become a really powerful visual symbol,” said Margaret K. Hofer, the vice president and museum director of the New-York Historical Society, which has identified around 25 masks that the museum will collect, including an N95 mask worn by a nurse in the Samaritan’s Purse emergency field hospital set up in New York’s Central Park in the spring. (The museum also collected a set of field hospital scrubs, and a cowbell that the medical team rang whenever they discharged a patient.)

A cowbell that was rung at the Samaritan’s Purse field hospital in Central Park each time a Covid patient was discharged is now in the archives of the New-York Historical Society.
Credit…New-York Historical Society

“The meaning of masks has shifted over the course of these past several months,” Ms. Hofer said. “Early on, the ones we were collecting were being sewn by people who were trying to aid medical workers, when there were all those fears about shortage of P.P.E. — last-resort masks. And they’ve more recently become a political statement.”

Curators say that recording the personal stories behind photos, videos and objects is just as crucial as the objects themselves — and the more personal, the better. Museums rely on objects to elicit an emotional reaction from visitors, and that sort of personal connection requires knowing the object’s back story.

“For us, really the artifact is just a metaphor, and behind that artifact are these voices, and this humanity,” said Aaron Bryant, who curates photography and visual culture at the Smithsonian’s National Museum of African American History and Culture, and who is leading the Smithsonian’s ongoing collection response in Lafayette Square.

Curatorial teams from many museums are offering to interview donors about their materials and experiences, and encourage donors to include detailed descriptions and back stories when submitting objects and records for consideration. Many are also collecting oral histories of the moment.

Many museums have put out calls for submissions on social media and are directing would-be donors to submission forms on their websites. The National Museum of African American History and Culture site has a thorough form that covers items’ significance, dimensions, condition and materials. The Civil Rights Museum is looking for “archival materials, books, photographs, clothing/textiles, audio visual materials, fine art and historic objects” that share civil rights history. The New-York Historical Society is seeking Black Lives Matter protest materials.

“We review material, we talk about it, and we respond to everyone,” said William S. Pretzer, a senior curator of history at the National Museum of African American History and Culture. “We can’t collect everything, but we’re not limiting ourselves to anything.”

Gathering materials from some communities is proving challenging, and curators are strategizing collection from individuals who may be unlikely to offer materials to historical institutions.

An anti-racism poster by 14-year-old Kyra Yip. It will be on display at New York’s Museum of Chinese in America when they reopen.
Credit…Kyra Yip

“A lot of our critical collecting and gathering of diverse stories we’ve been able to do because of directed outreach,” said Ms. Hofer of the New-York Historical Society. “We’re trying to capture the experience of all aspects of all populations in the city, including people experiencing homelessness and the incarcerated.”

“We want to make the barrier to entry on this very low,” said Nancy Yao Maasbach, the president of New York’s Museum of Chinese in America, which began collecting materials relating to pandemic-related racist attacks on Asians and Asian-Americans in late winter, and personal testimonies about experiences during the pandemic and protests. Because museums may not necessarily be obvious repositories for many immigrant communities, Ms. Maasbach said, the museum is making translators available to those who want to tell their stories.

“We’re trying to make sure we’re being accessible in creating this record,” Ms. Maasbach said.

Curators recognize that their story-of-2020 collecting will continue for years; we are in the midst of ongoing events. They are asking us to continue to document the subsequent chapters — and to be as posterity-minded as one can be when it comes to ephemera.

“We don’t know what the puzzle looks like yet,” said Ms. Hartig of the National Museum of American History. “Yet we know that each of these pieces might be an important one.”

Some museums are exhibiting submitted and accepted items right away on websites or on social media; others are planning virtual and physical exhibits for as early as this autumn. The Eiteljorg Museum of American Indians and Western Art, for example, is collecting masks and oral history testimonies from Native American communities and is considering the creation of a “rapid response gallery,” said the museum’s vice president and chief curator Elisa G. Phelps.

“If art is being sparked by something very timely, we want to have a place where we can showcase works and photos,” she said, adding that this process differed from “the elaborate, formal exhibit development process.”

Some donors, however, may not be among those to view their materials once they become part of institutionalized history — at least not right away. Even though Dr. Blutinger said that he sees the historical value of his emergency room video diaries, he has yet to revisit the peak-crisis videos himself.

“I’m almost scared to look back at them,” he said. “I’m worried that they’ll reignite a set of emotions that I’ve managed to tuck away. I’m sure one day I’ll look back and perhaps open up one or two clips, but I have never watched any of them all the way through.”

Lesley M.M. Blume is a journalist, historian and the author of “Fallout: The Hiroshima Cover-Up and the Reporter Who Revealed It to the World,” which will be published on August 4.

How Epidemics End (Boston Review)

bostonreview.net

Jeremy A. Greene, Dora Vargha

Jun 30, 2020

Death Table from Tuberculosis in the United States, prepared for the International Congress on Tuberculosis, September 21 to October 12, 1908. Image: U.S. National Library of Medicine

Contrary to hopes for a tidy conclusion to the COVID-19 pandemic, history shows that outbreaks of infectious disease often have much murkier outcomes—including simply being forgotten about, or dismissed as someone else’s problem.

Recent history tells us a lot about how epidemics unfold, how outbreaks spread, and how they are controlled. We also know a good deal about beginnings—those first cases of pneumonia in Guangdong marking the SARS outbreak of 2002–3, the earliest instances of influenza in Veracruz leading to the H1N1 influenza pandemic of 2009–10, the outbreak of hemorrhagic fever in Guinea sparking the Ebola pandemic of 2014–16. But these stories of rising action and a dramatic denouement only get us so far in coming to terms with the global crisis of COVID-19. The coronavirus pandemic has blown past many efforts at containment, snapped the reins of case detection and surveillance across the world, and saturated all inhabited continents. To understand possible endings for this epidemic, we must look elsewhere than the neat pattern of beginning and end—and reconsider what we mean by the talk of “ending” epidemics to begin with.

The social lives of epidemics show them to be not just natural phenomena but also narrative ones: deeply shaped by the stories we tell about their beginnings, their middles, their ends.

Historians have long been fascinated by epidemics in part because, even where they differ in details, they exhibit a typical pattern of social choreography recognizable across vast reaches of time and space. Even though the biological agents of the sixth-century Plague of Justinian, the fourteenth-century Black Death, and the early twentieth-century Manchurian Plague were almost certainly not identical, the epidemics themselves share common features that link historical actors to present experience. “As a social phenomenon,” the historian Charles Rosenberg has argued, “an epidemic has a dramaturgic form. Epidemics start at a moment in time, proceed on a stage limited in space and duration, following a plot line of increasing and revelatory tension, move to a crisis of individual and collective character, then drift towards closure.” And yet not all diseases fit so neatly into this typological structure. Rosenberg wrote these words in 1992, nearly a decade into the North American HIV/AIDS epidemic. His words rang true about the origins of that disease—thanks in part to the relentless, overzealous pursuit of its “Patient Zero”—but not so much about its end, which was, as for COVID-19, nowhere in sight.

In the case of the new coronavirus, we have now seen an initial fixation on origins give way to the question of endings. In March The Atlantic offered four possible “timelines for life returning to normal,” all of which depended on a sufficient proportion of the population (perhaps 60 to 80 percent) developing immunity to curb further spread. This confident assertion derived from models of infectious outbreaks formalized by epidemiologists such as W. H. Frost a century earlier. If the world can be divided into those susceptible (S), infected (I) and resistant (R) to a disease, and a pathogen has a reproductive number R0 (pronounced R-naught) describing how many susceptible people can be infected by a single infected person, the end of the epidemic begins when the proportion of susceptible people drops below the reciprocal, 1/R0. When that happens, one person would infect, on average, less than one other person with the disease.
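Written out, the threshold arithmetic looks like this (a minimal restatement of the SIR reasoning just described; the symbols s for the susceptible fraction and R_e for the effective reproductive number are our shorthand, not the article’s):

    \[
      R_e = R_0 \, s, \qquad R_e < 1 \iff s < \frac{1}{R_0},
    \]
    \[
      \text{immune fraction needed} = 1 - \frac{1}{R_0}, \quad
      \text{which for } R_0 \in [2.5,\ 5] \text{ gives } 0.6 \text{ to } 0.8.
    \]

A basic reproductive number somewhere between 2.5 and 5 is thus one way to arrive at the 60 to 80 percent figure cited above.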

These formulas reassure us, perhaps deceptively. They conjure up a set of natural laws that give order to the cadence of calamities. The curves produced by models, which in better times belonged to the arcana of epidemiologists, are now common figures in the lives of billions of people learning to live with contractions of civil society promoted in the name of “bending,” “flattening,” or “squashing” them. At the same time, as David Jones and Stefan Helmreich recently wrote in these pages, the smooth lines of these curves are far removed from jagged realities of the day-to-day experience of an epidemic—including the sharp spikes in those “reopening” states where modelers had predicted continued decline.  

In other words, epidemics are not merely biological phenomena. They are inevitably framed and shaped by our social responses to them, from beginning to end (whatever that may mean in any particular case). The question now being asked of scientists, clinicians, mayors, governors, prime ministers, and presidents around the world is not merely “When will the biological phenomenon of this epidemic resolve?” but rather “When, if ever, will the disruption to our social life caused in the name of coronavirus come to an end?” As peak incidence nears, and in many places appears to have passed, elected officials and think tanks from opposite ends of the political spectrum provide “roadmaps” and “frameworks” for how an epidemic that has shut down economic, civic, and social life in a manner not seen globally in at least a century might eventually recede and allow resumption of a “new normal.”


These two faces of an epidemic, the biological and the social, are closely intertwined, but they are not the same. The biological epidemic can shut down daily life by sickening and killing people, but the social epidemic also shuts down daily life by overturning basic premises of sociality, economics, governance, discourse, interaction—and killing people in the process as well. There is a risk, as we know from both the Spanish influenza of 1918–19 and the more recent swine flu of 2009–10, of relaxing social responses before the biological threat has passed. But there is also a risk in misjudging a biological threat based on faulty models or bad data and in disrupting social life in such a way that the restrictions can never properly be taken back. We have seen in the case of coronavirus the two faces of the epidemic escalating on local, national, and global levels in tandem, but the biological epidemic and the social epidemic don’t necessarily recede on the same timeline.

For these sorts of reasons we must step back and reflect in detail on what we mean by ending in the first place. The history of epidemic endings has taken many forms, and only a handful of them have resulted in the elimination of a disease.

* * *

History reminds us that the interconnections between the timing of the biological and social epidemics are far from obvious. In some cases, like the yellow fever epidemics of the eighteenth century and the cholera epidemics of the nineteenth century, the dramatic symptomatology of the disease itself can make its timing easy to track. Like a bag of popcorn popping in the microwave, the tempo of visible case-events begins slowly, escalates to a frenetic peak, and then recedes, leaving a diminishing frequency of new cases that eventually are spaced far enough apart to be contained and then eliminated. In other examples, however, like the polio epidemics of the twentieth century, the disease process itself is hidden, often mild in presentation, threatens to come back, and ends not on a single day but over different timescales and in different ways for different people.

Campaigns against infectious diseases are often discussed in military terms, and one result of that metaphor is to suggest that epidemics too must have a singular endpoint. We approach the infection peak as if it were a decisive battle like Waterloo, or a diplomatic arrangement like the Armistice at Compiègne in November 1918. Yet the chronology of a single, decisive ending is not always true even for military history, of course. Just as the clear ending of a military war does not necessarily bring a close to the experience of war in everyday life, so too the resolution of the biological epidemic does not immediately undo the effects of the social epidemic. The social and economic effects of the 1918–19 pandemic, for example, were felt long after the end of the third and putatively final wave of the virus. While the immediate economic effect on many local businesses caused by shutdowns appears to have resolved in a matter of months, the broader economic effects of the epidemic on labor-wage relations were still visible in economic surveys in 1920, again in 1921, and in several areas as late as 1930.


And yet, like World War One with which its history was so closely intertwined, the influenza pandemic of 1918–19 appeared at first to have a singular ending. In individual cities the epidemic often produced dramatic spikes and falls in equally rapid tempo. In Philadelphia, as John Barry notes in The Great Influenza (2004), after an explosive and deadly rise in October 1918 that peaked at 4,597 deaths in a single week, cases suddenly dropped so precipitously that the public gathering ban could be lifted before the month was over, with almost no new cases in following weeks. A phenomenon whose destructive potential was limited by material laws, “the virus burned through available fuel, then it quickly faded away.”

As Barry reminds us, however, scholars have since learned to differentiate at least three different sequences of epidemics within the broader pandemic. The first wave blazed through military installations in the spring of 1918, the second wave caused the devastating mortality spikes in the summer and fall of 1918, and the third wave began in December 1918 and lingered long through the summer of 1919. Some cities, like San Francisco, passed through the first and second waves relatively unscathed only to be devastated by the third wave. Nor was it clear to those still alive in 1919 that the pandemic was over after the third wave receded. Even as late as 1922, a bad flu season in Washington State merited a response from public health officials to enforce absolute quarantine as they had during 1918–19. It is difficult, looking back, to say exactly when this prototypical pandemic of the twentieth century was really over.

* * *

Who can tell when a pandemic has ended? Today, strictly speaking, only the World Health Organization (WHO). The Emergency Committee of the WHO is responsible for the global governance of health and international coordination of epidemic response. After the SARS coronavirus pandemic of 2002–3, this body was granted sole power to declare the beginnings and endings of Public Health Emergencies of International Concern (PHEIC). While SARS morbidity and mortality—roughly 8,000 cases and 800 deaths in 26 countries—have been dwarfed by the sheer scale of COVID-19, the pandemic’s effect on national and global economies prompted revisions to the International Health Regulations in 2005, a body of international law that had remained unchanged since 1969. This revision broadened the scope of coordinated global response from a handful of diseases to any public health event that the WHO deemed to be of international concern and shifted from a reactive framework to a proactive one based on real-time surveillance, detection, and containment at the source rather than merely action at international borders.

This social infrastructure has important consequences, not all of them necessarily positive. Any time the WHO declares a public health event of international concern—and frequently when it chooses not to declare one—the event becomes a matter of front-page news. Since the 2005 revision, the group has been criticized both for declaring a PHEIC too hastily (as in the case of H1N1) and for declaring one too late (as in the case of Ebola). The WHO’s decision to declare the end of a PHEIC, by contrast, is rarely subject to the same public scrutiny. When an outbreak is no longer classified as an “extraordinary event” and is no longer seen to pose a risk of international spread, the PHEIC is deemed no longer justified, leading to a withdrawal of international coordination. Once countries can grapple with the disease within their own borders, under their own national frameworks, the PHEIC is quietly de-escalated.

At their worst, epidemic endings are a form of collective amnesia, transmuting the disease that remains into merely someone else’s problem.

As the response to the 2014–16 Ebola outbreak in West Africa demonstrates, however, the act of declaring the end of a pandemic can be just as powerful as the act of declaring its beginning—in part because emergency situations can continue even after a return to “normal” has been declared. When WHO Director General Margaret Chan announced in March 2016 that the Ebola outbreak was no longer a public health event of international concern, international donors withdrew funds and care to the West African countries devastated by the outbreak, even as these struggling health systems continued to be stretched beyond their means by the needs of Ebola survivors. NGOs and virologists expressed concern that efforts to fund Ebola vaccine development would likewise fade without a sense of global urgency pushing research forward.

Part of the reason that the role of the WHO in proclaiming and terminating the state of pandemic is subject to so much scrutiny is that it can be. The WHO is the only global health body that is accountable to all governments of the world; its parliamentary World Health Assembly contains health ministers from every nation. Its authority rests not so much on its battered budget as on its access to epidemic intelligence and its pool of select individuals, technical experts with vast experience in epidemic response. But even though internationally sourced scientific and public health authority is key to its role in pandemic crises, WHO guidance is ultimately carried out in very different ways and on very different time scales in different countries, provinces, states, counties, and cities. One state might begin to ease up restrictions to movement and industry just as another implements more and more stringent measures. If each country’s experience of “lockdown” has already been heterogeneous, the reconnection between them after the PHEIC is ended will likely show even more variance.

* * *

So many of our hopes for the termination of the present PHEIC now lie in the promise of a COVID-19 vaccine. Yet a closer look at one of the central vaccine success stories of the twentieth century shows that technological solutions rarely offer resolution to pandemics on their own. Contrary to our expectations, vaccines are not universal technologies. They are always deployed locally, with variable resources and commitments to scientific expertise. International variations in research, development, and dissemination of effective vaccines are especially relevant in the global fight against epidemic polio.

The development of the polio vaccine is relatively well known, usually told as a story of American tragedy and triumph. Yet while polio epidemics that swept the globe in the postwar decades did not respect national borders or the Iron Curtain, the Cold War provided context for both collaboration and antagonism. Only a few years after the licensing of Jonas Salk’s inactivated vaccine in the United States, his technique became widely used across the world, although its efficacy outside of the United States was questioned. The second, live oral vaccine developed by Albert Sabin, however, involved extensive collaboration with Eastern European and Soviet colleagues. As the success of the Soviet polio vaccine trials marked a rare landmark of Cold War cooperation, Basil O’Connor, president of the March of Dimes movement, speaking at the Fifth International Poliomyelitis Conference in 1960, proclaimed that “in search for the truth that frees man from disease, there is no cold war.”


Yet the differential uptake of this vaccine retraced the divisions of Cold War geography. The Soviet Union, Hungary, and Czechoslovakia were the first countries in the world to begin nationwide immunization with the Sabin vaccine, soon followed by Cuba, the first country in the Western Hemisphere to eliminate the disease. By the time the Sabin vaccine was licensed in the United States in 1963, much of Eastern Europe had done away with epidemics and was largely polio-free. The successful ending of this epidemic within the communist world was immediately held up as proof of the superiority of their political system.

Western experts who trusted the Soviet vaccine trials, including the Yale virologist and WHO envoy Dorothy Horstmann, nonetheless emphasized that their results were possible because of the military-like organization of the Soviet health care system. Yet these enduring concerns that authoritarianism itself was the key tool for ending epidemics—a concern reflected in current debates over China’s heavy-handed interventions in Wuhan this year—can also be overstated. The Cold War East was united not only by authoritarianism and heavy hierarchies in state organization and society, but also by a powerful shared belief in the integration of paternal state, biomedical research, and socialized medicine. Epidemic management in these countries combined an emphasis on prevention, easily mobilized health workers, top-down organization of vaccinations, and a rhetoric of solidarity, all resting on a health care system that aimed at access to all citizens.

Still, authoritarianism as a catalyst for controlling epidemics can be singled out and pursued with long-lasting consequences. Epidemics can be harbingers of significant political changes that go well beyond their ending, significantly reshaping a new “normal” after the threat passes. Many Hungarians, for example, have watched with alarm the complete sidelining of parliament and the introduction of government by decree at the end of March this year. The end of any epidemic crisis, and thus the end of the need for the significantly increased power of Viktor Orbán, would be determined by Orbán himself. Likewise, many other states, urging the mobilization of new technologies as a solution to end epidemics, are opening the door to heightened state surveillance of their citizens. The apps and trackers now being designed to follow the movement and exposure of people in order to enable the end of epidemic lockdowns can collect data and establish mechanisms that reach well beyond the original intent. The digital afterlives of these practices raise new and unprecedented questions about when and how epidemics end.

Like infectious agents on an agar plate, epidemics colonize our social lives and force us to learn to live with them, in some way or another, for the foreseeable future.

Although we want to believe that a single technological breakthrough will end the present crisis, the application of any global health technology is always locally determined. After its dramatic successes in managing polio epidemics in the late 1950s and early 1960s, the oral poliovirus vaccine became the tool of choice for the Global Polio Eradication Initiative in the late 1980s, as it promised an end to “summer fears” globally. But since vaccines are in part technologies of trust, ending polio outbreaks depends on maintaining confidence in national and international structures through which vaccines are delivered. Wherever that often fragile trust is fractured or undermined, vaccination rates can drop to a critical level, giving way to vaccine-derived polio, which thrives in partially vaccinated populations.

In Kano, Nigeria, for example, a ban on polio vaccination between 2000 and 2004 resulted in a new national polio epidemic that soon spread to neighboring countries. As late as December 2019 polio outbreaks were still reported in fifteen African countries, including Angola and the Democratic Republic of the Congo. Nor is it clear that polio can fully be regarded as an epidemic at this point: while polio epidemics are now a thing of the past for Hungary—and the rest of Europe, the Americas, Australia, and East Asia as well—the disease is still endemic to parts of Africa and South Asia. A disease once universally epidemic is now locally endemic: this, too, is another way that epidemics end.

section separator

Indeed, many epidemics have only “ended” through widespread acceptance of a newly endemic state. Consider the global threat of HIV/AIDS. From a strictly biological perspective, the AIDS epidemic has never ended; the virus continues to spread devastation through the world, infecting 1.7 million people and claiming an estimated 770,000 lives in the year 2018 alone. But HIV is not generally described these days with the same urgency and fear that accompanied the newly defined AIDS epidemic in the early 1980s. Like coronavirus today, AIDS at that time was a rapidly spreading and unknown emerging threat, splayed across newspaper headlines and magazine covers, claiming the lives of celebrities and ordinary citizens alike. Nearly forty years later it has largely become a chronic, endemic disease, at least in the Global North. Like diabetes, which claimed an estimated 4.9 million lives in 2019, HIV/AIDS became a manageable condition—if one had access to the right medications.

Those who are no longer directly threatened by the impact of the disease have a hard time continuing to attend to the urgency of an epidemic that has been rolling on for nearly four decades. Even in the first decade of the AIDS epidemic, activists in the United States fought tooth and nail to make their suffering visible in the face of both the Reagan administration’s dogged refusal to talk publicly about the AIDS crisis and the indifference of the press after the initial sensation of the newly discovered virus had become common knowledge. In this respect, the social epidemic does not necessarily end when biological transmission has ended, or even peaked, but rather when, in the attention of the general public and in the judgment of certain media and political elites who shape that attention, the disease ceases to be newsworthy.

Though we like to think of science as universal and objective, crossing borders and transcending differences, it is in fact deeply contingent upon local practices.

Polio, for its part, has not been newsworthy for a while, even as thousands around the world still live with polio with ever-decreasing access to care and support. Soon after the immediate threat of outbreaks passed, so did support for those whose lives were still bound up with the disease. For others, it became simply a background fact of life—something that happens elsewhere. The polio problem was “solved,” specialized hospitals were closed, fundraising organizations found new causes, and poster children found themselves in an increasingly challenging world. Few medical professionals are trained today in the treatment of the disease. As intimate knowledge of polio and its treatment withered away with time, people living with polio became embodied repositories of lost knowledge.

History tells us that public attention is much more easily drawn to new diseases as they emerge than sustained over the long haul. Well before AIDS shocked the world into recognizing the devastating potential of novel epidemic diseases, a series of earlier outbreaks had already signaled the presence of emerging infectious agents. When hundreds of members of the American Legion fell ill after their annual meeting in Philadelphia in 1976, the efforts of epidemiologists from the Centers for Disease Control to explain the spread of this mysterious disease and its newly discovered bacterial agent, Legionella, occupied front-page headlines. In the years since, however, as the 1976 incident faded from memory, Legionella infections have become everyday objects of medical care, even though incidence in the U.S. has grown ninefold since 2000, tracing a line of exponential growth that looks a lot like COVID-19’s on a longer time scale. Yet few among us pause in our daily lives to consider whether we are living through the slowly ascending limb of a Legionella epidemic.
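
Taken at face value, that ninefold figure implies a steady growth rate low enough to escape year-to-year notice. The following minimal sketch is my own arithmetic, not the essay’s; the 19-year window (2000 to 2019) is an assumption about where “since 2000” ends:

```python
# Back-of-the-envelope check, not from the essay: a ninefold rise in
# reported Legionella incidence over an assumed 19-year window implies
# a constant annual growth rate r satisfying (1 + r)**19 == 9.
growth_factor = 9.0   # "grown ninefold since 2000"
years = 19            # assumed window: 2000 -> 2019

annual_rate = growth_factor ** (1 / years) - 1
print(f"Implied annual growth rate: {annual_rate:.1%}")  # ~12.3%
```

An increase of roughly twelve percent a year is invisible in any single season, which is exactly how an epidemic’s slowly ascending limb can hide in the noise of the everyday.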

Nor do most people living in the United States stop to consider the ravages of tuberculosis as a pandemic, even though an estimated 10 million new cases of tuberculosis were reported around the globe in 2018, and an estimated 1.5 million people died from the disease. The disease seems to receive attention only in relation to newer scourges: in the late twentieth century TB coinfection became a leading cause of death in the emerging HIV/AIDS pandemic, while in the past few months TB coinfection has been invoked as a rising cause of mortality in the COVID-19 pandemic. Amidst these stories it is easy to miss that on its own, tuberculosis has been and continues to be the leading cause of death worldwide from a single infectious agent. And even though tuberculosis is not an active concern of middle-class Americans, it is still not a thing of the past even in this country. More than 9,000 cases of tuberculosis were reported in the United States in 2018—overwhelmingly affecting racial and ethnic minority populations—but they rarely made the news.

There will be no simple return to the way things were: whatever normal we build will be a new one—whether many of us realize it or not.

While tuberculosis is the target of concerted international disease control efforts, and occasionally eradication efforts, the time course of this affliction has been spread out so long—and so clearly demarcated in space as a problem of “other places”—that it is no longer part of the epidemic imagination of the Global North. And yet history tells a very different story. DNA lineage studies of tuberculosis now show that the spread of tuberculosis in sub-Saharan Africa and Latin America was initiated by European contact and conquest from the fifteenth century through the nineteenth. In the early decades of the twentieth century, tuberculosis epidemics accelerated throughout sub-Saharan Africa, South Asia, and Southeast Asia due to the rapid urbanization and industrialization of European colonies. Although the wave of decolonizations that swept these regions between the 1940s and the 1980s established autonomy and sovereignty for newly post-colonial nations, this movement did not send tuberculosis back to Europe.

These features of the social lives of epidemics—how they live on even when they seem, to some, to have disappeared—show them to be not just natural phenomena but also narrative ones: deeply shaped by the stories we tell about their beginnings, their middles, their ends. At their best, epidemic endings are a form of relief for the mainstream “we” that can pick up the pieces and reconstitute a normal life. At their worst, epidemic endings are a form of collective amnesia, transmuting the disease that remains into merely someone else’s problem.

section separator

What are we to conclude from these complex interactions between the social and the biological faces of epidemics, past and present? Like infectious agents on an agar plate, epidemics colonize our social lives and force us to learn to live with them, in some way or another, for the foreseeable future. Just as the postcolonial period continued to be shaped by structures established under colonial rule, so too are our post-pandemic futures indelibly shaped by what we do now. There will be no simple return to the way things were: whatever normal we build will be a new one—whether many of us realize it or not. Like the world of scientific facts after the end of a critical experiment, the world we find after the end of an epidemic crisis—whatever we take that to be—looks in many ways like the world that came before, but with new social truths established. How exactly these norms come into being depends a great deal on particular circumstances: current interactions among people, the instruments of social policy as well as medical and public health intervention with which we apply our efforts, and the underlying response of the material against which we apply that apparatus (in this case, the coronavirus strain SARS-CoV-2). While we cannot know now how the present epidemic will end, we can be confident that in its wake it will leave different conceptions of normal in realms biological and social, national and international, economic and political.

Though we like to think of science as universal and objective, crossing borders and transcending differences, it is in fact deeply contingent upon local practices—including norms that are easily thrown over in an emergency, and established conventions that do not always hold up in situations of urgency. Today we see civic leaders jumping the gun, speaking of access to treatments, antibody screens, and vaccines well in advance of any scientific evidence, while relatively straightforward attempts to estimate the true number of people affected by the disease spark firestorms over the credibility of medical knowledge. Arduous work is often required to achieve scientific consensus, and when the stakes are high—especially when huge numbers of lives are at risk—heterogeneous data give way to highly variable interpretations. As data move too quickly in some domains and too slowly in others, and as sped-up time pressures are placed on all investigations, the projected curve of the epidemic is transformed into an elaborate guessing game, in which different states rely on different kinds of scientific claims to sketch out wildly different timetables for ending social restrictions.

The falling action of an epidemic is perhaps best thought of as asymptotic: never disappearing, but rather fading to the point where signal is lost in the noise of the new normal—and even allowed to be forgotten.

These varied endings of the epidemic across local and national settings will only be valid insofar as they are acknowledged as such by others—especially if any reopening of trade and travel is to be achieved. In this sense, the process of establishing a new normal in global commerce will continue to be bound up in practices of international consensus. What the new normal in global health governance will look like, however, is more uncertain than ever. Long accustomed to the role of international scapegoat, the WHO Secretariat seems doomed to be accused either of working beyond its mandate or of not acting fast enough, and it makes an easy target for secessionist posturing, as Donald Trump has demonstrated. Yet the U.S. president’s recent withdrawal from this international body is neither unprecedented nor insurmountable. Although Trump’s voting base might not wish to be grouped together with the only other global power to secede from the WHO, the Soviet Union’s 1949 departure ended with its return in 1956, bringing the entire Eastern Bloc back to the task of international health leadership. Much as the return of the Soviets to the WHO resulted in the global eradication of smallpox—the only human disease so far to have been intentionally eradicated—it is possible that some future return of the United States to the project of global health governance might also result in a more hopeful post-pandemic future.

As the historians at the University of Oslo have recently noted, in epidemic periods “the present moves faster, the past seems further removed, and the future seems completely unpredictable.” How, then, are we to know when epidemics end? How does the act of looking back aid us in determining a way forward? Historians make poor futurologists, but we spend a lot of time thinking about time. And epidemics produce their own kinds of time, in both biological and social domains, disrupting our individual senses of passing days as well as conventions for collective behavior. They carry within them their own tempos and rhythms: the slow initial growth, the explosive upward limb of the outbreak, the slowing of transmission that marks the peak, plateau, and the downward limb. This falling action is perhaps best thought of as asymptotic: rarely disappearing, but rather fading to the point where signal is lost in the noise of the new normal—and even allowed to be forgotten.

London’s statues from ‘bygone’ imperial past to be reviewed, mayor says (Reuters)

uk.reuters.com

Reuters Editorial – June 9, 2020 / 5:16 AM

A police officer stands next to the statue of Winston Churchill at Parliament Square which was damaged by protesters with graffiti, in the aftermath of protests against the death of George Floyd who died in police custody in Minneapolis, London, Britain, June 8, 2020. REUTERS/John Sibley

LONDON (Reuters) – London mayor Sadiq Khan has ordered a review of the capital’s statues and street names after the toppling of the statue of an English slave trader by anti-racism protesters triggered a debate about the demons of Britain’s imperial past.

A statue of Edward Colston, who made a fortune in the 17th century from trading West African slaves, was torn down and thrown into Bristol harbour on Sunday by a group of demonstrators taking part in a wave of protests following the death of George Floyd in the United States.

Khan said a commission would review statues, plaques and street names which largely reflect the rapid expansion of London’s wealth and power at the height of Britain’s empire in the reign of Queen Victoria.

“Our capital’s diversity is our greatest strength, yet our statues, road names and public spaces reflect a bygone era,” Khan said. He said some statues would be removed.

“It is an uncomfortable truth that our nation and city owes a large part of its wealth to its role in the slave trade and while this is reflected in our public realm, the contribution of many of our communities to life in our capital has been wilfully ignored.”

In the biggest deportation in known history, weapons and gunpowder from Europe were swapped for millions of African slaves who were then shipped across the Atlantic to the Americas. Ships returned to Europe with sugar, cotton and tobacco.

As many as 17 million African men, women and children were torn from their homes and shackled into one of the world’s most brutal globalized trades between the 15th and 19th centuries. Many died in merciless conditions.

Those who survived endured a life of subjugation on sugar, tobacco and cotton plantations. Britain abolished the trans-Atlantic slave trade in 1807 although the full abolition of slavery did not follow for another generation.

Reporting by Guy Faulconbridge; Editing by Nick Macfie

Angélica Kolody Mammana: Those who do not turn to the history books to read history are doomed to repeat it

Angélica Kolody Mammana – Facebook, May 20, 2020

I am going to tell a long story.
Bear with me and read to the end. Trust me.
Once upon a time there was a disease.
It appeared in a country far, far away.
Suddenly, it began to spread like a spark over gunpowder.
People began to die, in enormous numbers, in droves.
The newspapers began reporting on the disease before it reached our country. They informed the population, but people did not believe them.
They said it was something distant, that it was just a common flu, that it was all a great exaggeration.

Some people arriving home from trips to Europe fell ill. Some died. But they were old. They had preexisting conditions. There was no reason for panic.

People read the newspapers and grew indignant at the press’s exaggeration.
They said it was a political ploy to bring down the government, to spread communism across the world.

In an attempt to contain the disease, which by then had already spread through several nations, countries began recommending the use of masks and advising people to keep apart, in quarantine, in cities all over the world.

– Quarantine? What do you mean? What will become of our economy?? – indignant people shouted.

They picketed, held demonstrations, carried signs saying they refused to wear masks. And, when they were forced to, they wore placards announcing that they did not agree with the requirement.

Schools were closed, businesses shuttered their doors. Only pharmacies and grocers were allowed to remain open to supply the population.

Theaters and cinemas were sealed shut.
All soccer championships and other sporting competitions were canceled.

Rio de Janeiro became a scene of tragedy. Hospitals overflowing, with no way out; people dying at home. Everywhere, a shortage of coffins, and people having to be buried in mass graves. On a single day, more than 1,000 deaths were recorded.

In Congress, it was proposed that students’ graduation be brought forward, so that they could enter the workforce sooner.

Scientists searched frantically for a cure or a treatment for the disease, until some newspaper announced that an incredible medication, until then used for malaria, seemed to be effective.

People were in an uproar. Everyone wanted the medication.
Some doctors began to proclaim the miracle of this substance in the media, and people crowded at the doors of pharmacies and doctors’ offices to receive it.

There was no scientific basis for recommending the drug, but people did not care. They were desperate; anything would do.

Thousands of the sick were medicated, but the disease did not seem to improve with the drug.

The media then reached a conclusion that seemed obvious: the drug was not working because it was being administered too late.

The ideal, then, would be to prescribe it as early as possible, even preventively, as a guarantee, to head off infection before it happened.

Some other doctors tried to warn the population about the drug’s risks, but in vain.

These doctors were branded conspiracy theorists, assaulted, cursed at, taken for communists, accused of working against the interests of the population.

People began to self-administer the malaria medication; after all, how could they be expected to wait with their arms crossed?

That is where the story got complicated.
There were people who could not take the drug, because they had clinical conditions that contraindicated its use.
Some fainted in the street. Urban legends circulated about people who were taken for dead and buried alive as a result of cardiac arrests and arrhythmias caused by the drug, whose dosage was broadcast without any criteria by the press itself.

Over time, seeing that the medication did not produce the promised effect, people turned to folk and home remedies spread by rumor.

Sugarcane spirit, combined with lime and honey, was said to be a possible treatment. Bars had lines of people seeking a dose. Alcoholism soared. The price of the fruit reached values never seen before, and it vanished from the shelves.

A rumor spread that hospitals were administering poisoned teas at midnight to terminal patients in order to free up beds.

For almost two years, the government failed to put an effective Ministry of Health in place. Opinions were divided; people debated the impact of isolation on commerce.

Just as a famous writer would later describe it:
“Each doctor had a different attempted explanation; we did not know what or whom to believe. We waited for an explanation that no one could give, just as to this day we wait to learn what that infernal Sassanid was.”

Meanwhile, the disease advanced. Amid empty promises, it advanced and advanced.
The only thing that proved effective in containing it was high adherence, in certain regions, to social isolation and the use of masks.

No, this is not about the coronavirus, nor about chloroquine.

It is about the Spanish flu and quinine salts, the medication used at the time for malaria.

The indiscriminate use of quinine salts was promoted by the press of the day, from 1918 onward, and it, too, led countless people to their deaths. The press took to prescribing quinine salts wholesale, first as a treatment and later as prevention against the Spanish flu.

It never worked.

The Spanish flu ended up killing 30 million people, and to this day, 102 years later, no cure has been found.

At the time, many people believed it was a lie, an exaggeration, and a conspiracy to spread the communist revolution of 1917 across the world.

The only measure that, in retrospect, reasonably contained the disease in some regions was social isolation.

The economy survived.

Those who do not turn to the history books to read history are doomed to repeat it.

Notes:
1. The Spanish flu killed the newly reelected president of the Brazilian Republic, Conselheiro Rodrigues Alves, shortly before his inauguration, in January 1919.

2. The “home remedy” invented to treat the Spanish flu, based on sugarcane spirit, honey, and lime, entered Brazilian culture and today goes by the name of “caipirinha.”

3. The “midnight tea” was a rumor that defamed the Santa Casa hospital of Rio de Janeiro in 1918, which was nicknamed the “House of the Devil” at the time. After the epidemic ended, the Midnight Tea became the theme of Rio’s first carnival bloco, in 1919.

The pandemic strikes in the most important year in the history of humanity. Will the next zoonoses be gestated in Brazil? (UNICAMP)

May 5, 2020 | 14:02 Science, health and society: Covid-19

Luiz Marques

Image editing: Renan Garcia

The year 2020 will be remembered as the year in which the pandemic caused by the SARS-CoV-2 virus precipitated a major rupture in the functioning of contemporary societies. It will probably also be remembered as the moment of a rupture from which our societies never again fully recovered. That is because the current pandemic intervenes at a moment when three structural crises in the relationship between contemporary hegemonic societies and the Earth system are reinforcing one another, converging toward a global economic regression, even if punctuated by occasional short-lived spurts of recovery. These three crises are, as science has reiterated, the climate emergency, the ongoing annihilation of biodiversity, and the collective sickening of organisms poisoned by the chemical industry.i The ever more devastating impacts arising from the synergy among these three systemic crises will henceforth leave societies, even the richest ones, still more unequal and more vulnerable, and therefore less able to recover their previous performance. It is precisely these partial and increasingly frequent losses of functionality in societies’ relationship with the environment that essentially characterize the process of socio-environmental collapse now under way (Homer-Dixon et al. 2015; Steffen et al. 2018; Marques 2015/2018 and 2020).

1. The year of the pandemic is the most crucial inflection point in human history

By its global reach and by the trail of deaths left in its wake, more than 250,000 (officially reported) victims in little over four months, the current pandemic is a fact whose gravity would be hard to exaggerate, all the more so because new outbreaks may still occur over the next two years, according to a report by the Center for Infectious Disease Research and Policy (CIDRAP) at the University of Minnesota (Moore, Lipsitch, Barry & Osterholm 2020).

But even graver than the immense toll of deaths is the moment at which the pandemic falls in human history. Other pandemics, some far more lethal, occurred in the twentieth century without deeply affecting societies’ capacity to recover. What makes the current pandemic singular is that it compounds several systemic crises threatening humanity, and does so at precisely the moment when it is no longer possible to postpone decisions that will crucially affect, and very soon, the habitability of the planet. Science makes the possibility of stabilizing average global warming within, or not far beyond, the limits sought by the Paris Agreement conditional on one inescapable fact: CO2 emissions must peak in 2020 and begin to decline sharply thereafter. The IPCC has traced 196 scenarios through which we could limit average global warming to about 0.5°C above the current average warming relative to the pre-industrial period (1.2°C in 2019). None of them, Tom Rivett-Carnac and Christiana Figueres remind us, allows the peak of greenhouse gas (GHG) emissions to be deferred beyond 2020 (Hooper 2020). No one expresses the meaning of this deadline more peremptorily than Thomas Stocker, co-chair of the IPCC’s Working Group I between 2008 and 2015:ii

“Delayed or insufficient mitigation makes it permanently impossible to limit global warming. The year 2020 is crucial for the definition of global ambitions on emissions reduction. If CO2 emissions continue to rise beyond that date, the most ambitious mitigation targets will become unattainable.”

As early as 2017, Jean Jouzel, former IPCC vice-chair, warned that “to keep any chance of remaining below 2°C, the peak of emissions must be reached by 2020 at the latest” (Le Hir 2017). In October of the following year, commenting on the release of the IPCC special report Global Warming of 1.5°C, Debra Roberts, co-chair of the report’s Working Group 2, reinforced this perception: “The next few years will probably be the most important in our history.” And Amjad Abdulla, representative of the Small Island Developing States (SIDS) in the climate negotiations, added: “I have no doubt that historians will look back at these findings [of the 2018 IPCC special report] as one of the defining moments in the course of human history” (Mathiesen & Sauer 2018). In The Second Warning: A Documentary Film (2018), publicizing the manifesto World Scientists’ Warning to Humanity: A Second Notice, issued by William Ripple and colleagues in 2017 and endorsed by some 20,000 scientists, the philosopher Kathleen Dean Moore makes those statements her own: “We are living through an inflection point. The next few years will be the most important in the history of humanity.”

In April 2017, a group of scientists coordinated by Stefan Rahmstorf launched The Climate Turning Point, whose preface reaffirms the Paris Agreement’s most ambitious goal (“to keep the increase in the global average temperature well below 2°C relative to the pre-industrial period”), clarifying that “this goal is considered necessary to avoid incalculable risks to humanity, and it is feasible – but, realistically, only if global emissions peak by the year 2020 at the latest.” That document then guided the creation, by a number of scientific and diplomatic leaders, of Mission 2020 (https://mission2020.global/). It set basic targets in energy, transport, land use, industry, infrastructure, and finance, so as to turn the curve of greenhouse gas emissions downward from 2020 onward and place the planet on a trajectory consistent with the Paris Agreement. “With radical collaboration and stubborn optimism,” write Christiana Figueres and her Mission 2020 colleagues, “we will bend the curve of greenhouse gas emissions by 2020, enabling humanity to flourish.” For his part, António Guterres, fulfilling his mission of encouraging and coordinating efforts at global governance, warned in September 2018: “If we do not change course by 2020, we risk missing the point where we can avoid runaway climate change, with disastrous consequences for humanity and the natural systems that sustain us.”

Well, 2020 has now arrived. Taking stock in 2019 of progress toward the Mission 2020 targets, the World Resources Institute (Ge et al. 2019) writes that “in most cases action is insufficient or progress is off track.” None of the targets, in short, was met, and last December COP25 in Madrid swept away for good, in large part through the fault of the governments of the United States, Japan, Australia, and Brazil (Irfan 2019), the last hopes of an imminent decline in global GHG emissions.

2. The pandemic enters the scene

But then Covid-19 bursts onto the stage, displacing, paralyzing, and postponing everything, including COP26. In little more than three months it settled, through chaos and suffering, what more than three decades of facts, science, campaigns, and diplomatic efforts to reduce GHG emissions had proved unable to achieve (the Toronto Conference of 1988 was already recommending “specific actions” to that end). Instead of a rational, gradual, and democratically planned economic degrowth, the abrupt economic contraction imposed by the pandemic already looks, according to Kenneth S. Rogoff, like “the deepest dive of the global economy in 100 years” (Goodman 2020). On April 15, Carbon Brief estimated that the economic crisis should cause a reduction of about 5.5% in global CO2 emissions in 2020. On April 30, the Global Energy Review 2020 – The impacts of the Covid-19 crisis on global energy demand and CO2 emissions, from the International Energy Agency (IEA), went further, estimating that “global CO2 emissions are expected to decline even more rapidly across the remaining nine months of the year, reaching 30.6 Gt [billion tonnes] in 2020, almost 8% lower than in 2019. This would be the lowest level since 2010. Such a reduction would be the largest ever, six times larger than the previous record reduction of 0.4 Gt in 2009, due to the financial crisis, and twice as large as all previous reductions since the end of the Second World War combined.” (https://www.iea.org/reports/global-energy-review-2020/global-energy-and-co2-emissions-in-2020). Figure 1 shows how this reduction in global CO2 emissions reflects the fall in global primary energy demand, compared with earlier falls.
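
A quick consistency check of the IEA figures quoted above takes only a few lines. The sketch below is my own arithmetic, not the article’s or the IEA’s; it simply recovers the 2019 baseline implied by “30.6 Gt, almost 8% lower than in 2019” and compares the drop with the 0.4 Gt decline of 2009:

```python
# Back-of-the-envelope check (not from the article) of the IEA projection.
emissions_2020_gt = 30.6   # projected 2020 emissions, billions of tonnes
drop_fraction = 0.08       # "almost 8% lower than in 2019"
drop_2009_gt = 0.4         # previous record annual reduction, in 2009

emissions_2019_gt = emissions_2020_gt / (1 - drop_fraction)
drop_gt = emissions_2019_gt - emissions_2020_gt

print(f"Implied 2019 emissions: {emissions_2019_gt:.1f} Gt")     # ~33.3 Gt
print(f"Implied 2020 reduction: {drop_gt:.1f} Gt")               # ~2.7 Gt
print(f"Ratio to the 2009 drop: {drop_gt / drop_2009_gt:.1f}x")  # ~6.7x
```

The recovered reduction of about 2.7 Gt is indeed roughly six times the 2009 decline, consistent with the report’s claim.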

[Figure 1 – Rates of change (%) in global primary energy demand, 1900–2020. Source: IEA, Global Energy Review 2020: The impacts of the Covid-19 crisis on global energy demand and CO2 emissions, April 2020, p. 11]

The reduction in global CO2 emissions projected by the IEA for 2020 equals, or even slightly exceeds, the 7.6% annual reduction through 2030 that the IPCC considers indispensable to keep warming below catastrophic levels (Evans 2020). The IEA report hastens to warn, however, that “as after previous crises, (…) the rebound in emissions may be larger than the decline, unless the wave of investment to restart the economy is dedicated to a cleaner and more resilient energy infrastructure.” With rare exceptions, the facts so far do not warrant expecting a break with the previous energy and socioeconomic paradigms. Despite the collapse of oil prices, or precisely because of it, the oil companies are moving with dizzying speed to take advantage of the moment, obtaining, for example, USD 1.1 billion in investment to finance the completion of the notorious Keystone XL pipeline, which will link Canadian oil to the Gulf of Mexico (McKibben 2020). Examples of this kind of opportunism abound, including in Brazil, where the ruralist caucus is exploiting the situation to push through Provisional Measure 910, which amnesties land grabbing and further heightens the threats to Indigenous peoples. As Laurent Joffrin put it in his Lettre politique of April 30 for the newspaper Libération (Le monde d’avant, en pire?), the post-pandemic world “risks resembling furiously, at least in the short term, the world of before, but in a worsened version.” And Joffrin adds: “the ‘world after’ will not change by itself. As with the ‘world before,’ its future will depend on a political fight, patient and arduous.” Political and arduous, no doubt, but there is definitely no more time for patience.

In any case, a reduction of almost 8% in global CO2 emissions in a single year has not made even a dent in the cumulative curve of atmospheric CO2 concentrations measured at Mauna Loa (Hawaii). They set yet another record in April 2020, reaching 416.76 parts per million (ppm), 3.13 ppm above April 2019, one of the largest jumps since measurements began in 1958. This is not just one more number in the jungle of converging climate indicators. It is the decisive number. As Petteri Taalas, Secretary-General of the World Meteorological Organization, reminds us: “The last time the Earth experienced comparable atmospheric concentrations of CO2 was 3 to 5 million years ago. Back then, the temperature was 2°C to 3°C [above the pre-industrial period] and sea level was 10 to 20 meters higher than today” (McGrath 2019). Fewer than 35 ppm now remain before we reach 450 ppm, a level of atmospheric CO2 concentration widely associated with average global warming of 2°C above the pre-industrial period, a level that may be reached, on the current trajectory, in little more than 10 years. What awaits us around 2030, if the machinery of the globalized capitalist economic system, existentially dependent on its own expanded reproduction, stays in motion, is nothing less than a disaster for humanity as a whole, as well as for countless other species. The word disaster is not hyperbole. The already mentioned 2018 IPCC report (Global Warming of 1.5°C) projects that a world at 2°C on average above the pre-industrial period will have almost 6 billion people exposed to extreme heat waves and more than 3.5 billion people subject to water scarcity, among many other adversities. Disaster is the word that best defines the world toward which we are heading over the next 10 years (or 20, it matters little), and it is exactly the word used by Sir Brian Hoskins, director of the Grantham Institute for Climate Change at Imperial College London: “We have no evidence that 1.9°C is something we can easily cope with, and 2.1°C is a disaster” (Simms 2017).
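
The “little more than 10 years” estimate follows from a simple division; here is a minimal sketch of that arithmetic (mine, not the article’s), assuming the most recent annual increment of 3.13 ppm simply persists:

```python
# Back-of-the-envelope projection (not from the article) of the time left
# before atmospheric CO2 reaches 450 ppm at the current rate of increase.
current_ppm = 416.76    # Mauna Loa record, April 2020
target_ppm = 450.0      # level widely associated with ~2 C of average warming
annual_rise_ppm = 3.13  # increment from April 2019 to April 2020

years_left = (target_ppm - current_ppm) / annual_rise_ppm
print(f"Years to 450 ppm at the current rate: {years_left:.1f}")  # ~10.6
```

On that linear extrapolation the threshold arrives around 2030, which is precisely the horizon the article goes on to discuss.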

As a consequence of these very high atmospheric CO2 concentrations, last year was already the hottest on record in Europe (1.2°C above the 1981–2010 period!), and even without an El Niño there is now, according to NOAA, a 74.67% chance that 2020 will turn out to be the hottest year, in the global average, in a century and a half of record-keeping,iii beating the previous record of 2016 (1.24°C above the pre-industrial period, according to NASA). There is no room in this article to list the many indications that 2020 will be the first or second (after 2016) hottest of the seven hottest years (2014–2020) in the history of human civilization since the last deglaciation, about 11,700 years before the present. Suffice it to bear in mind that, if March 2020 proves representative of the year, we have already missed the Paris Agreement’s most ambitious target, since the global average temperature for that month registered 1.51°C above the 1880–1920 period, as Figure 2 shows.

[Figure 2 – Temperature anomalies in March 2020 (1.51°C in the global average) relative to the 1880–1920 period. Source: GISS Surface Temperature Analysis (v4), NASA. <https://data.giss.nasa.gov/gistemp/maps/index_v4.html>]

Global warming is a weapon aimed at global health. As Sara Goudarzi (2020) shows, higher temperatures favor the adaptation of microorganisms to a warmer world, weakening two basic defenses of mammals against pathogens: (1) many microorganisms still cannot survive temperatures above 37°C, but may come to adapt to them rapidly; and (2) the mammalian immune system, which loses efficiency at higher temperatures. In addition, global warming widens the range of epidemic vectors, such as those of dengue, zika, and chikungunya, and alters the geographic distribution of plants and animals, driving terrestrial animal species toward higher latitudes at an average rate of 17 km per decade (Pecl et al. 2017). Aaron Bernstein, director of Harvard University’s Center for Climate, Health, and the Global Environment, aptly sums up the interaction between global warming and deforestation in its multiple relations to new epidemic outbreaks:iv

“As the planet warms (…) animals move toward the poles, fleeing the heat. Animals are coming into contact with animals they normally would not interact with, and that creates an opportunity for pathogens to find new hosts. Many of the primary causes of climate change also increase the risk of pandemics. Deforestation, generally driven by agriculture and ranching, is the largest cause of habitat loss worldwide. And that loss forces animals to migrate and potentially to come into contact with other animals or with people and share their germs. Large livestock farms also serve as a source for the passage of infections from animals to people.”

Without losing sight of the relations between the climate emergency and these new health threats, let us focus on two well-circumscribed questions directly linked to the current pandemic.

3. The pandemic was predicted and will, from now on, be more frequent

The first question concerns the, so to speak, anthropogenic character of the pandemic. Far from being adventitious, it is a repeatedly predicted consequence of an increasingly dysfunctional and destructive socioeconomic system. Josef Settele, Sandra Díaz, Eduardo Brondizio, and Peter Daszak wrote an article, at the invitation of IPBES, that is required reading and that I allow myself to quote at length:

“There is a single species responsible for the Covid-19 pandemic: us. As with the climate and biodiversity crises, recent pandemics are a direct consequence of human activity – particularly our global financial and economic system, based on a limited paradigm that prizes economic growth at any cost. (…) Rampant deforestation, uncontrolled expansion of agriculture and ranching, intensive farming and husbandry, mining and growing infrastructure, as well as the exploitation of wild species, have created a ‘perfect storm’ for the spillover of diseases from wildlife to people. (…) And yet this may be only the beginning. Although diseases transmitted from other animals to humans are already estimated to cause 700,000 deaths a year, the potential for future pandemics is vast. An estimated 1.7 million unidentified viruses of the type known to infect people are believed to still exist in mammals and water birds. Any one of them could be the ‘Disease X’ – potentially even more disruptive and lethal than Covid-19. Future pandemics are likely to happen more frequently, spread more rapidly, have greater economic impact, and kill more people, if we are not extremely careful about the impacts of the choices we make today” (https://ipbes.net/covid19stimulus).

Every sentence of this quotation carries a lesson in science and in political lucidity. The recent, greater frequency of epidemics and pandemics has deforestation and livestock farming among its central causes, something also well established by Christian Drosten, current coordinator of the fight against Covid-19 in Germany, director of the Institute of Virology at Berlin’s Charité hospital, and one of the scientists who identified the SARS coronavirus in 2003 (Spinney 2020).

“Given the opportunity, the coronavirus is ready to change hosts, and we created that opportunity through our unnatural use of animals – livestock. Livestock are exposed to wildlife; they are kept in large groups that can amplify the virus; and humans have intense contact with them – for example, through the consumption of meat – so they certainly represent a possible emergence trajectory for the coronavirus. Camels are livestock in the Middle East, and they are the hosts of the MERS virus as well as of the coronavirus 229E – one cause of the common cold in humans – while cattle were the original host of the coronavirus OC43, another cause of colds.”

None of this is news to science. We know that most emerging pandemics are zoonoses, that is, infectious diseases caused by bacteria, viruses, parasites, or prions that have jumped from non-human hosts, usually vertebrates, to humans. As Ana Lúcia Tourinho, a researcher at the Federal University of Mato Grosso (UFMT), puts it, deforestation is a central cause and a ticking time bomb where zoonoses are concerned: “when a virus that was not part of our evolutionary history leaves its natural host and enters our body, it is chaos” (Pontes 2020). This risk, it bears repeating, is growing. Suffice it to bear in mind that “domesticated mammals host 50% of zoonotic viruses, but represent only 12 species” (Johnson et al. 2020). That group includes pigs, cattle, and sheep. In short, global warming, deforestation, the destruction of wild habitats, domestication, and the raising of birds and mammals on an industrial scale destroy the evolutionary equilibrium among species, facilitating the conditions for these viruses to jump from one species to another, including our own.

4. Will the next zoonoses be gestated in Brazil?

The second point, with which I conclude this article, concerns the specifically health-related consequences of the ongoing destruction of the Amazon and the Cerrado. Among the most dire of these is the growing probability that the country will become the focus of the next zoonotic pandemics. Over the past decade, the megacities of East Asia, above all in China, have been the main hotspot of zoonotic infections (Zhang et al. 2019). Not by chance. These countries are among those that have lost the most forest cover in the world to the carnivorous, globalized food system. The case of China is exemplary. From 2001 to 2018 the country lost 94,200 km2 of tree cover, equivalent to a 5.8% reduction in its tree cover over the period. “Logging and farming consume up to 5,000 km2 of virgin forest every year. In northern and central China forest cover has been reduced by half in the last two decades.”v In parallel with the destruction of wild habitats, Chinese economic growth unleashed a demand for animal protein, including protein from exotic animals (Cheng et al. 2007). Between 1980 and 2015, meat consumption in China grew sevenfold overall and 4.7-fold per capita (from 15 kg to 70 kg per person per year over the period). With about 18% of the world’s population, China accounted in 2018 for 28% of the planet’s meat consumption (Rossi 2018). According to a 2017 Rabobank report, China’s Animal Protein Outlook to 2020: Growth in Demand, Supply and Trade, additional demand for meat in China will run to about one million tonnes a year. “Local beef production cannot keep pace with the growth in demand. In fact, China has a structural shortage of beef supply, which needs to be met by growing imports.”

The vegetation cover of the tropics has been destroyed to sustain this increasingly carnivorous diet, not only in China but in several countries of the world, and particularly here. In Brazil, the removal of more than 1.8 million km2 of vegetation cover from the Amazon and the Cerrado over the past fifty years, converting their magnificent natural landscapes into supply zones for meat and animal feed on a national and global scale, represents the most devastating ecocide ever perpetrated by the human species. Never, in fact, at any latitude or at any moment in human history, has so much animal and plant life been destroyed in so little time, to the degradation of so many and for the economic benefit of so few. And never, even for the very few who grew rich on the devastation, will that enrichment have been so ephemeral, for the destruction of the vegetation cover is already beginning to generate soil erosion and recurrent droughts, undermining the foundations of any agriculture in the region (indeed, in Brazil as a whole).

As a result of this war of extermination against nature, unleashed by the insanity of the military dictators and continued by civilians, the Brazilian cattle herd now numbers approximately 215 million head, 80% of whose output is absorbed by the domestic market, which has grown 14% over the past ten years (Macedo 2019). Brazil has also become the leader in world exports of beef (20% of those exports) and of soybeans (56%), the latter destined essentially for animal feed. Most of the Brazilian cattle herd is now concentrated in the North and Center-West regions, with a growing share in the Amazon. In 2010, 14% of the Brazilian herd was already in the country’s North region. By 2016 that share had jumped to 22%. Together, the North and Center-West regions hold 56% of the Brazilian cattle herd (Zaia 2018). In 2017, only 19.8% of the Cerrado’s remaining vegetation cover was still untouched. If the devastation continues, ranching and soybean farming will soon drive nearly 500 endemic plant species to extinction – three times more than all documented extinctions since 1500 (Strassburg et al. 2017). The Amazon, which has lost some 800,000 km2 of forest cover in 50 years and will lose many tens of thousands more under Bolsonaro’s ecocidal fury, has become, in its southern and eastern portions, a desolate landscape of degrading pasture. The ecological chaos produced by the clear-cutting of about 20% of the forest’s original area, by the degradation of the forest fabric across at least another 20%, and by the heavy concentration of cattle in the region creates the conditions for Brazil to become a hotspot of the next zoonoses. First, because bats are a great reservoir of viruses, and among Brazilian bats, whose habitat is above all the forests (or what remains of them), at least 3,204 types of coronavirus circulate (Maxman 2017). Second, because, as Nardus Mollentze and Daniel Streicker (2020) have shown, the taxonomic order Artiodactyla (the cloven-hoofed mammals), to which cattle belong, hosts, together with the primates, more potentially zoonotic viruses than would be expected among groups of mammals, bats included. In fact, the Amazon is already a hotspot of non-viral epidemics such as leishmaniasis and malaria, neglected tropical diseases with high lethality. As the WHO states, “leishmaniasis is associated with environmental changes such as deforestation, the damming of rivers, irrigation schemes, and urbanization,”vi all of them factors that contribute to the destruction of the Amazon and to the increased risk of pandemics. The relation between Amazonian deforestation and malaria was well established in 2015 by a team at IPEA: for every 1% of forest cleared per year, malaria cases rise by 23% (Pontes 2020).

The renewed upward curve, since 2013, of the destruction of the Amazon and the Cerrado resulted from Dilma Rousseff’s execrable alliance with the most retrograde forces in the Brazilian economy. For Bolsonaro’s necropolitics, in turn, the destruction of life, of what remains of Brazil’s natural heritage, has become a government program and a veritable obsession. Bolsonaro is driving the country toward a leap of no return into ecological chaos, whence the urgent need to neutralize him through impeachment or any other constitutional mechanism. There is no more time to lose. Between August 2018 and July 2019, Amazonian deforestation reached 9,762 km2, almost 30% above the preceding 12 months and the worst result of the past ten years, according to INPE. In the first quarter of 2020, which typically shows each year’s lowest levels of deforestation, INPE’s Deter system detected an increase of 51% over the same period of 2019, the highest level for the period since the series began in 2016. According to Tasso Azevedo, general coordinator of the Annual Mapping Project of Land Cover and Land Use in Brazil (MapBiomas), “what is most worrying is that, in the period from August 2019 through March 2020, the level of deforestation more than doubled” (Menegassi 2020). By monopolizing all attention, the pandemic offers Bolsonaro an unexpected opportunity to accelerate his work of destroying the forest and its peoples (Barifouse 2020).

Let us recapitulate. What matters here, above all, is to understand that the pandemic intervenes at the moment when global warming and all the other processes of environmental degradation are accelerating. The pandemic may accelerate them further, absent a vigorous political reaction from society. It adds, in any case, one more dimension to the converging bundle of socio-environmental crises that confronts humanity with a radically new situation. That novelty can be formulated as follows: it is no longer plausible to expect, once the pandemic has passed, a new cycle of global, much less national, economic growth. If some growth does return, it will be conjunctural and soon cut short by climatic, ecological, and health chaos. The coming decade will unfold under the sign of socioeconomic regression, for even granting that the globalized economy has brought social benefits, they were meager and have long since been outweighed by its harms. The pandemic is only one of those harms, and certainly not the worst. The various developmentalist agendas, typical of the ideological battles of the twentieth century, are therefore no longer current in 2020. Of course, the demand for social justice, the historic banner of the left, remains more timely than ever. Besides being a perennial and inalienable value, the fight to reduce social inequality means, first of all, taking decision-making power over strategic investments (energy, food, mobility, etc.) away from the corporations, assuming democratic and sustainable control of those investments, and thereby softening the impacts of the socio-environmental collapse under way. It is on the deepening of democracy that the survival of any organized society now crucially depends, in a world that is becoming ever hotter, more biologically impoverished, more polluted, and, for all these reasons, sicker. Surviving, in the context of a process of socio-environmental collapse, is not a minimal program. Surviving today requires fighting for something far more ambitious than the social-democratic or revolutionary programs of the twentieth century. It means redefining the very meaning and purpose of economic activity, that is, ultimately, redefining our position as a society and as a species within the biosphere.

References

BARIFOUSE, Rafael, “Pandemia vai permitir aceleração do desmatamento na Amazônia, prevê consultoria”. BBC Brasil, 26/IV/2020.

CHENG, Vincent C. C. et al., “Severe Acute Respiratory Syndrome Coronavirus as an Agent of Emerging and Reemerging Infection”. Clinical Microbiology Reviews, October, 2007, pp. 660-694.

EVANS, Simon, “Analysis: Coronavirus set to cause largest ever annual fall in CO2 emissions”. Carbon Brief, 9/IV/2020, updated April 15, 2020.

GE, Mengpin et al., “Tracking Progress of the 2020 Climate Turning Point”. World Resources Institute, Washington, D.C., 2019.

GOODMAN, Peter, “Why the Global Recession Could Last a Long Time”. The New York Times, 1/IV/2020.

GOUDARZI, Sara, “How a Warming Climate Could Affect the Spread of Diseases Similar to COVID-19”. Scientific American, 29/IV/2020.

HOMER-DIXON, Thomas et al., “Synchronous failure: the emerging causal architecture of global crisis”. Ecology and Society, 20, 3, 2015.

HOOPER, Rowan, “Ten years to save the world”. New Scientist, 14/III/2020, pp. 45-47.

IRFAN, Umair, “The US, Japan, and Australia let the whole world down at the UN climate talks”. Vox, 18/XII/2019.

JOHNSON, Christine K. et al., “Global shifts in mammalian population trends reveal key predictors of virus spillover risk”. Proceedings of the Royal Society B, 8/IV/2020.

LE HIR, Pierre, “Réchauffement climatique: la bataille des 2C est presque perdue”. Le Monde, 31/XII/2017.

MACEDO, Flávia, “Consumo de carne bovina no Brasil cresceu 14% em 10 anos, diz Cepea”. Canal Rural, 9/XII/2019.

MARQUES, Luiz, Capitalismo e Colapso Ambiental (2015). Campinas, Editora da Unicamp, 3rd ed., 2018.

MARQUES, Luiz, “O colapso socioambiental não é um evento, é o processo em curso”. Revista Rosa, 1, Março, 2020 <http://revistarosa.com/1/o-colapso-socioambiental-nao-e-um-evento>

MATHIESEN, Karl & SAUER, Natalie, “‘Most important years in history’: major UN report sounds last-minute climate alarm”. Climate Home News, 8/X/2018.

MAXMAN, Amy, “Bats Are Global Reservoir for Deadly Coronaviruses”. Scientific American, 12/VI/2017.

McGRATH, Matt, “Climate Change. Greenhouse gas concentrations again break records”. BBC, 25/XI/2019.

McKIBBEN, Bill, “Big Oil is using the coronavirus pandemic to push through the Keystone XL pipeline”. The Guardian, 5/IV/2020.

MENEGASSI, Duda, “Desmatamento na Amazônia atinge nível recorde no primeiro trimestre de 2020”. ((O)) eco, 13/IV/2020.

MOLLENTZE, Nardus & STREICKER, Daniel G., “Viral zoonotic risk is homogenous among taxonomic orders of mammalian and avian reservoir hosts”. PNAS, 13/IV/2020.

MOORE, Kristine A., LIPSITCH, Marc, BARRY, John & OSTERHOLM, Michael, COVID-19: The CIDRAP Viewpoint. University of Minnesota, 20/IV/2020.

MORIYAMA, Miyu & ICHINOHE, Takeshi, “High Ambient Temperature Dampens Adaptive Immune Responses to Influenza A Virus Infection”. PNAS, 116, 8, 19/II/2019, pp. 3118-3125.

PECL, Gretta et al., “Biodiversity redistribution under climate change: Impacts on ecosystems and human well-being”. Science, 355, 6332, 31/III/2017.

PONTES, Nádia, “O elo entre desmatamento e epidemias investigado pela ciência”. DW, 15/IV/2020.

ROSSI, Marcello, “The Chinese Are Eating More Meat Than Ever Before and the Planet Can’t Keep Up”. Mother Jones, 21/VII/2018.

SETTELE, J., DIAZ, S., BRONDIZIO, E. & DASZAK, Peter, “COVID-19 Stimulus Measures Must Save Lives, Protect Livelihoods, and Safeguard Nature to Reduce the Risk of Future Pandemics”. IPBES Expert Guest Article, 27/IV/2020.

SIMMS, Andrew, “A cat in hell’s chance – why we’re losing the battle to keep global warming below 2C”, The Guardian, 19/I/2017.

SPINNEY, Laura, “Germany’s Covid-19 expert: ‘For many, I’m the evil guy crippling the economy”. The Guardian, 26/IV/2020.

STEFFEN, Will et al., “Trajectories of the Earth System in the Anthropocene”. PNAS, 9/VIII/2018.

STRASSBURG, Bernardo B.N. et al., “Moment of truth for the Cerrado hotspot”. Nature Ecology & Evolution, 2017.

ZAIA, Marina, “Rebanho Brasileiro por Região”. Scot Consultoria, 16/IV/2018.

ZHANG, Juanjuan et al., “Patterns of human social contact and contact with animals in Shanghai, China”. Scientific Reports, 9, 2019.

i According to the EPA’s Chemical Data Reporting (CDR), there were 8,707 widely commercialized chemical substances or compounds in the United States in 2016, to which we are exposed daily, in most cases in ignorance of their effects, and of the effects of their interactions, on human health and on other species. <https://www.chemicalsafetyfacts.org/chemistry-context/debunking-myth-chemicals-testing-safety/>.

ii <https://mission2020.global/testimonial/stocker/>.

iii Cf. NOAA, Global Annual Temperature Rankings Outlook, March 2020 <https://www.ncdc.noaa.gov/sotc/global/202003/supplemental/page-2>.

iv Cf. “Coronavirus, climate change, and the environment”. Environmental Health News, 20/III/2020. <https://www.ehn.org/coronavirus-environment-2645553060.html>.

v Cf. “Deforestation and Desertification in China”. <http://factsanddetails.com/china/cat10/sub66/item389.html>.

vi Leishmaniasis, WHO, 2/III/2020 <https://www.who.int/en/news-room/fact-sheets/detail/leishmaniasis>.

*** Luiz Marques is a professor (livre-docente) in the Department of History at IFCH/Unicamp. With Editora da Unicamp he published Giorgio Vasari, Vida de Michelangelo (1568), 2011, and Capitalismo e Colapso Ambiental, 2015, 3rd edition, 2018. He coordinates the Palavra da Arte collection, devoted to the sources of art historiography, and takes part with other colleagues in the Crisálida collective, Crises SocioAmbientais Labor Interdisciplinar Debate & Atualização (crisalida.eco.br).

The Coronavirus Is Rewriting Our Imaginations (New Yorker)

What felt impossible has become thinkable. The spring of 2020 is suggestive of how much, and how quickly, we can change as a civilization.

By Kim Stanley Robinson May 1, 2020

Possibly, in a few months, we’ll return to some version of the old normal. But this spring won’t be forgotten. Photograph by Antoine d’Agata / Magnum

The critic Raymond Williams once wrote that every historical period has its own “structure of feeling.” How everything seemed in the nineteen-sixties, the way the Victorians understood one another, the chivalry of the Middle Ages, the world view of Tang-dynasty China: each period, Williams thought, had a distinct way of organizing basic human emotions into an overarching cultural system. Each had its own way of experiencing being alive.

In mid-March, in a prior age, I spent a week rafting down the Grand Canyon. When I left for the trip, the United States was still beginning to grapple with the reality of the coronavirus pandemic. Italy was suffering; the N.B.A. had just suspended its season; Tom Hanks had been reported ill. When I hiked back up, on March 19th, it was into a different world. I’ve spent my life writing science-fiction novels that try to convey some of the strangeness of the future. But I was still shocked by how much had changed, and how quickly.

Schools and borders had closed; the governor of California, like governors elsewhere, had asked residents to begin staying at home. But the change that struck me seemed more abstract and internal. It was a change in the way we were looking at things, and it is still ongoing. The virus is rewriting our imaginations. What felt impossible has become thinkable. We’re getting a different sense of our place in history. We know we’re entering a new world, a new era. We seem to be learning our way into a new structure of feeling.

In many ways, we’ve been overdue for such a shift. In our feelings, we’ve been lagging behind the times in which we live. The Anthropocene, the Great Acceleration, the age of climate change—whatever you want to call it, we’ve been out of synch with the biosphere, wasting our children’s hopes for a normal life, burning our ecological capital as if it were disposable income, wrecking our one and only home in ways that soon will be beyond our descendants’ ability to repair. And yet we’ve been acting as though it were 2000, or 1990—as though the neoliberal arrangements built back then still made sense. We’ve been paralyzed, living in the world without feeling it.

Now, all of a sudden, we’re acting fast as a civilization. We’re trying, despite many obstacles, to flatten the curve—to avoid mass death. Doing this, we know that we’re living in a moment of historic importance. We realize that what we do now, well or badly, will be remembered later on. This sense of enacting history matters. For some of us, it partly compensates for the disruption of our lives.

Actually, we’ve already been living in a historic moment. For the past few decades, we’ve been called upon to act, and have been acting in a way that will be scrutinized by our descendants. Now we feel it. The shift has to do with the concentration and intensity of what’s happening. September 11th was a single day, and everyone felt the shock of it, but our daily habits didn’t shift, except at airports; the President even urged us to keep shopping. This crisis is different. It’s a biological threat, and it’s global. Everyone has to change together to deal with it. That’s really history.

It seems as though science has been mobilized to a dramatic new degree, but that impression is just another way in which we’re lagging behind. There are 7.8 billion people alive on this planet—a stupendous social and technological achievement that’s unnatural and unstable. It’s made possible by science, which has already been saving us. Now, though, when disaster strikes, we grasp the complexity of our civilization—we feel the reality, which is that the whole system is a technical improvisation that science keeps from crashing down.

On a personal level, most of us have accepted that we live in a scientific age. If you feel sick, you go to a doctor, who is really a scientist; that scientist tests you, then sometimes tells you to take a poison so that you can heal—and you take the poison. It’s on a societal level that we’ve been lagging. Today, in theory, everyone knows everything. We know that our accidental alteration of the atmosphere is leading us into a mass-extinction event, and that we need to move fast to dodge it. But we don’t act on what we know. We don’t want to change our habits. This knowing-but-not-acting is part of the old structure of feeling.

Now comes this disease that can kill anyone on the planet. It’s invisible; it spreads because of the way we move and congregate. Instantly, we’ve changed. As a society, we’re watching the statistics, following the recommendations, listening to the scientists. Do we believe in science? Go outside and you’ll see the proof that we do everywhere you look. We’re learning to trust our science as a society. That’s another part of the new structure of feeling.

Possibly, in a few months, we’ll return to some version of the old normal. But this spring won’t be forgotten. When later shocks strike global civilization, we’ll remember how we behaved this time, and how it worked. It’s not that the coronavirus is a dress rehearsal—it’s too deadly for that. But it is the first of many calamities that will likely unfold throughout this century. Now, when they come, we’ll be familiar with how they feel.

What shocks might be coming? Everyone knows everything. Remember when Cape Town almost ran out of water? It’s very likely that there will be more water shortages. And food shortages, electricity outages, devastating storms, droughts, floods. These are easy calls. They’re baked into the situation we’ve already created, in part by ignoring warnings that scientists have been issuing since the nineteen-sixties. Some shocks will be local, others regional, but many will be global, because, as this crisis shows, we are interconnected as a biosphere and a civilization.

Imagine what a food scare would do. Imagine a heat wave hot enough to kill anyone not in an air-conditioned space, then imagine power failures happening during such a heat wave. (The novel I’ve just finished begins with this scenario, so it scares me most of all.) Imagine pandemics deadlier than the coronavirus. These events, and others like them, are easier to imagine now than they were back in January, when they were the stuff of dystopian science fiction. But science fiction is the realism of our time. The sense that we are all now stuck in a science-fiction novel that we’re writing together—that’s another sign of the emerging structure of feeling.

Science-fiction writers don’t know anything more about the future than anyone else. Human history is too unpredictable; from this moment, we could descend into a mass-extinction event or rise into an age of general prosperity. Still, if you read science fiction, you may be a little less surprised by whatever does happen. Often, science fiction traces the ramifications of a single postulated change; readers co-create, judging the writers’ plausibility and ingenuity, interrogating their theories of history. Doing this repeatedly is a kind of training. It can help you feel more oriented in the history we’re making now. This radical spread of possibilities, good to bad, which creates such a profound disorientation; this tentative awareness of the emerging next stage—these are also new feelings in our time.

Memento mori: remember that you must die. Older people are sometimes better at keeping this in mind than younger people. Still, we’re all prone to forgetting death. It never seems quite real until the end, and even then it’s hard to believe. The reality of death is another thing we know about but don’t feel.

So this epidemic brings with it a sense of panic: we’re all going to die, yes, always true, but now perhaps this month! That’s different. Sometimes, when hiking in the Sierra, my friends and I get caught in a lightning storm, and, completely exposed to it, we hurry over the rocky highlands, watching lightning bolts crack out of nowhere and connect nearby, thunder exploding less than a second later. That gets your attention: death, all too possible! But to have that feeling in your ordinary, daily life, at home, stretched out over weeks—that’s too strange to hold on to. You partly get used to it, but not entirely. This mixture of dread and apprehension and normality is the sensation of plague on the loose. It could be part of our new structure of feeling, too.

Just as there are charismatic megafauna, there are charismatic mega-ideas. “Flatten the curve” could be one of them. Immediately, we get it. There’s an infectious, deadly plague that spreads easily, and, although we can’t avoid it entirely, we can try to avoid a big spike in infections, so that hospitals won’t be overwhelmed and fewer people will die. It makes sense, and it’s something all of us can help to do. When we do it—if we do it—it will be a civilizational achievement: a new thing that our scientific, educated, high-tech species is capable of doing. Knowing that we can act in concert when necessary is another thing that will change us.

People who study climate change talk about “the tragedy of the horizon.” The tragedy is that we don’t care enough about those future people, our descendants, who will have to fix, or just survive on, the planet we’re now wrecking. We like to think that they’ll be richer and smarter than we are and so able to handle their own problems in their own time. But we’re creating problems that they’ll be unable to solve. You can’t fix extinctions, or ocean acidification, or melted permafrost, no matter how rich or smart you are. The fact that these problems will occur in the future lets us take a magical view of them. We go on exacerbating them, thinking—not that we think this, but the notion seems to underlie our thinking—that we will be dead before it gets too serious. The tragedy of the horizon is often something we encounter, without knowing it, when we buy and sell. The market is wrong; the prices are too low. Our way of life has environmental costs that aren’t included in what we pay, and those costs will be borne by our descendants. We are operating a multigenerational Ponzi scheme.

And yet: “Flatten the curve.” We’re now confronting a miniature version of the tragedy of the time horizon. We’ve decided to sacrifice over these months so that, in the future, people won’t suffer as much as they would otherwise. In this case, the time horizon is so short that we are the future people. It’s harder to come to grips with the fact that we’re living in a long-term crisis that will not end in our lifetimes. But it’s meaningful to notice that, all together, we are capable of learning to extend our care further along the time horizon. Amid the tragedy and death, this is one source of pleasure. Even though our economic system ignores reality, we can act when we have to. At the very least, we are all freaking out together. To my mind, this new sense of solidarity is one of the few reassuring things to have happened in this century. If we can find it in this crisis, to save ourselves, then maybe we can find it in the big crisis, to save our children and theirs.

Margaret Thatcher said that “there is no such thing as society,” and Ronald Reagan said that “government is not the solution to our problem; government is the problem.” These stupid slogans marked the turn away from the postwar period of reconstruction and underpin much of the bullshit of the past forty years.

We are individuals first, yes, just as bees are, but we exist in a larger social body. Society is not only real; it’s fundamental. We can’t live without it. And now we’re beginning to understand that this “we” includes many other creatures and societies in our biosphere and even in ourselves. Even as an individual, you are a biome, an ecosystem, much like a forest or a swamp or a coral reef. Your skin holds inside it all kinds of unlikely coöperations, and to survive you depend on any number of interspecies operations going on within you all at once. We are societies made of societies; there are nothing but societies. This is shocking news—it demands a whole new world view. And now, when those of us who are sheltering in place venture out and see everyone in masks, sharing looks with strangers is a different thing. It’s eye to eye, this knowledge that, although we are practicing social distancing as we need to, we want to be social—we not only want to be social, we’ve got to be social, if we are to survive. It’s a new feeling, this alienation and solidarity at once. It’s the reality of the social; it’s seeing the tangible existence of a society of strangers, all of whom depend on one another to survive. It’s as if the reality of citizenship has smacked us in the face.

As for government: it’s government that listens to science and responds by taking action to save us. Stop to ponder what is now obstructing the performance of that government. Who opposes it? Right now we’re hearing two statements being made. One, from the President and his circle: we have to save money even if it costs lives. The other, from the Centers for Disease Control and similar organizations: we have to save lives even if it costs money. Which is more important, money or lives? Money, of course! says capital and its spokespersons. Really? people reply, uncertainly. Seems like that’s maybe going too far? Even if it’s the common wisdom? Or was.

Some people can’t stay isolated and still do their jobs. If their jobs are important enough, they have to expose themselves to the disease. My younger son works in a grocery store and is now one of the front-line workers who keep civilization running.

My son is now my hero: this is a good feeling. I think the same of all the people still working now for the sake of the rest of us. If we all keep thinking this way, the new structure of feeling will be better than the one that’s dominated for the past forty years.

The neoliberal structure of feeling totters. What might a post-capitalist response to this crisis include? Maybe rent and debt relief; unemployment aid for all those laid off; government hiring for contact tracing and the manufacture of necessary health equipment; the world’s militaries used to support health care; the rapid construction of hospitals.

What about afterward, when this crisis recedes and the larger crisis looms? If the project of civilization—including science, economics, politics, and all the rest of it—were to bring all eight billion of us into a long-term balance with Earth’s biosphere, we could do it. By contrast, when the project of civilization is to create profit—which, by definition, goes to only a few—much of what we do is actively harmful to the long-term prospects of our species. Everyone knows everything. Right now pursuing profit as the ultimate goal of all our activities will lead to a mass-extinction event. Humanity might survive, but traumatized, interrupted, angry, ashamed, sad. A science-fiction story too painful to write, too obvious. It would be better to adapt to reality.

Economics is a system for optimizing resources, and, if it were trying to calculate ways to optimize a sustainable civilization in balance with the biosphere, it could be a helpful tool. When it’s used to optimize profit, however, it encourages us to live within a system of destructive falsehoods. We need a new political economy by which to make our calculations. Now, acutely, we feel that need.

It could happen, but it might not. There will be enormous pressure to forget this spring and go back to the old ways of experiencing life. And yet forgetting something this big never works. We’ll remember this even if we pretend not to. History is happening now, and it will have happened. So what will we do with that?

A structure of feeling is not a free-floating thing. It’s tightly coupled with its corresponding political economy. How we feel is shaped by what we value, and vice versa. Food, water, shelter, clothing, education, health care: maybe now we value these things more, along with the people whose work creates them. To survive the next century, we need to start valuing the planet more, too, since it’s our only home.

It will be hard to make these values durable. Valuing the right things and wanting to keep on valuing them—maybe that’s also part of our new structure of feeling. As is knowing how much work there is to be done. But the spring of 2020 is suggestive of how much, and how quickly, we can change. It’s like a bell ringing to start a race. Off we go—into a new time.



Kim Stanley Robinson is a science-fiction writer who lives in Davis, California. His next novel, “The Ministry for the Future,” will be published in October.

The Inopportune Indian: A Contribution on Indigenous Trajectories in Uruguay (Hemisferio Izquierdo)

Original article

May 21, 2019

Francesca Repetto

Image: El País

This past April 11 marked 188 years since the Salsipuedes Massacre. A pivotal event in our national history, it sealed the longed-for “end” of the Charrúa indigenous presence in 1831, when the Charrúa were ambushed by Fructuoso Rivera’s army on the banks of the Salsipuedes creek. About 50 men were killed there, and more than 200 people, mainly women, children, and the elderly, were taken to Montevideo and distributed among white families’ households to be “domesticated” and “integrated” into national life: a euphemism for the enslavement of the captive women and children.

Nationalist schoolbooks and histories, as well as contemporary academics, argue that the few survivors of the massacre assimilated into Montevidean society, losing every distinctive cultural trait. In the 1980s, however, in step with a trend across Latin America, numerous collectives began to re-emerge in Uruguay that identify themselves as descendants of the Charrúa, or as Charrúa outright. Such is the case of CONACHA, ADENCH, UMPCHA, CHONIK, and AQUECHA Pirí, Atala, among others, spread the length and breadth of the country. Moreover, in the last national census, in 2011, 2.3% of the total population declared indigenous ancestry as their principal ethnic descent, which comes to more than 76,000 people[i]. Hardly a minor figure.

Despite these figures, questions and challenges still echo about who these people are and what legitimacy they could claim if, after all, the “problem” was “solved” 188 years ago. “Crazy,” “deluded,” “chancers,” “fakes”: these are some of the labels that part of society reaches for when referring to them[ii]. Perhaps we need to return to the root of the matter to understand how the phenomenon of indigenous re-emergence is possible in Uruguay, as it has been documented in every country of our continent. Indigenous re-emergence, or “reappearance,” leads us to ask about the movement that preceded it, that is, how indigenous “disappearance” took shape and settled so firmly into our national imaginary.

A one-off measure, or a logic of governance?

The years following the declaration of independence in 1825 were marked by mobilization among ranchers and politicians seeking stability for the incipient livestock industry. If the ranchers, on the one hand, demanded security in the countryside against cattle rustling and the invasion of their properties, the first constitutional government, on the other, set out to protect those properties and thereby secure public revenue. Between 1828 and 1830, ranchers’ complaints about the theft of their cattle were abundant. While at first they admitted they did not know exactly who was responsible, by 1830 they were firmly maintaining that it was the Charrúa. With the same conviction, they went on to demand drastic measures from Rivera’s government, which carried them out in 1831[iii].

Though presented at the time as an isolated, one-off measure, the massacre belongs to a mode of managing alterity, of managing the native populations, that went beyond the concrete event and beyond the Charrúa themselves. The ranchers’ complaints and the government’s and army’s every step were systematically published in the most important newspaper of the period, El Universal. For several years, countless issues made some reference to the subject. Around 1831, practically every issue covered the persecution of the indigenous people. By 1832, there were no further publications on the matter. The problem really did seem to have been settled once and for all.

Three years earlier, however, in 1828, more than 8,000 mission Indians[iv], the vast majority of them Guaraní, had been settled at strategic border points, such as present-day Bella Unión and, some time later, on the banks of the Yaguarón River, in San Servando[v]. Before that, the Guaraní had already taken part in the construction of Colonia de Sacramento, Montevideo, and Minas, and swelled the ranks of the army’s soldiers. Although the Guaraní were initially portrayed as patriot Indians, since they had “followed” Rivera, an uprising in 1832 would change the way they were treated. Hunger and the wretched conditions to which they had been abandoned led the colony of Bella Unión to revolt in January of that year. As a result, its leaders were hanged in public, others received “more than 300 blows of the stick”[vi], also in public, and, that same year, the colony was dismantled and relocated to the banks of the Yí River, in the present-day department of Florida, under the name of San Borja[vii]. Although the Guaraní presence extended widely in time and space, it disappears from the national records around 1860, when this last indigenous colony, San Borja, was finally evicted[viii].

The silence that reigned until the late 1980s about the Guaraní presence, and the exaltation of the Salsipuedes Massacre as an irremediable final event, draw attention to the place indigenous peoples have occupied throughout our history. Our historiography makes no mention of the massacres that followed Salsipuedes, nor of the policies of assimilation, or, in the terms of the period, of “domestication of the savages,” of enslavement, imprisonment, and deportation that characterized the state’s management of the captives[ix]. Nor is there any reference to the gender dimension of those actions. Instead, the Charrúa “ethnocide” is assessed on a masculine scale of war and death, in which the killing of the men has been taken for the death of an entire social group. In this way, the trajectories of the women and children who entered Montevidean households as slaves, far from having “disappeared,” were incorporated as specters into nationalist historiographical production.

The years following Uruguay’s independence were singular for this kind of action. The first constitution, approved in 1830, for example, prohibited the slave trade, which meant that the supply of Africans as slave labor gradually dwindled over the following years. Only in 1842, in the midst of the Guerra Grande, was slavery finally abolished, by Law No. 242. It is therefore not far-fetched to think that one of the aims of the persecution of the Charrúa was precisely the appropriation of labor for enslavement in domestic work.

Most of the survivors of Salsipuedes were women and children of both sexes. The young men and the leaders were imprisoned and later offered as service crew to foreign-flagged ships, with the strict proviso that the Indians could only be put ashore once in foreign territory. The children’s original names, meanwhile, were changed and, separated from their mothers, many were baptized in the Iglesia Matriz. Some of the figures who give their names to the streets of Montevideo were people who took “little Indians” from the distribution: José Brito del Pino, Luis Lamas, Joaquín Campana, and Rufino Bauzá, to mention a few[x].

An Indian always out of his time

Ironically or not, in the years of re-emergence of collectives identifying as Charrúa, contemporary academics increasingly came to champion the figure of the Guaraní, or of the “mission Indians.” Some maintain that if any authentic Indian exists today, he would by no means be Charrúa, but rather Guaraní. “Who are they to come and tell us who we are!?”, an irritated Charrúa activist protested to me in 2014, referring to the well-known statements of Daniel Vidart[xi].

Since the 1980s there have been enormous academic efforts to “recover” the cultural heritage of the Guaraní and to vindicate the significant demographic contribution they made to the foundations of our society, against the silence that settled over them for two centuries[xii]. Those efforts are what brought to light the entry of more than 8,000 Guaraní in 1828, and the existence of several indigenous colonies, such as Villa Soriano, San Borja, and Bella Unión. Along these lines, some academics hold that the idea of the Charrúa as the national and predominant Indian of our history is mistaken, resting on an overvaluation of their presence to the detriment of the overwhelming numerical majority of the Guaraní. The irony is that the efforts to shift the image of the national Indian toward the Guaraní come precisely at the moment of Charrúa re-emergence.

These sectors attack the descendants of the Charrúa, claiming that they promote a logic by which anyone could become indigenous merely by wishing it, that they retain no native cultural features, such as use of the Charrúa language, and that they do not live in territorially demarcated spaces either. For them, today’s Charrúa do not fit the characteristics that, they claim, would define someone as indigenous. The central problem here is what the anthropologists take as their benchmarks of comparison. Moreover, what kind of academic authority do they hold to impose old theories and definitions on the re-emergence of an ethnic group and on the resignification of an identity, a social phenomenon now more than 30 years old? In the first place, they demand of the descendants an authenticity of customs in a context in which the nation-state forced them to scatter and hide. That is, they demand cultural traits that are incompatible with the campaigns of extermination to which these people were subjected. In the second place, they take as their classificatory criteria the descriptions made by European travelers during the colonial period, that is, from more than 300 years ago.

Some anthropologists, for their part, have relied on arguments of a biological stamp to argue against the Charrúa presence. They point out that, even if Charrúa survived the massacres, inheriting a few genes would not be sufficient proof of being indigenous. Indeed, it is not sufficient proof. But not because of an “insufficiency” of genetic markers; rather, the argument itself is wrong. In social anthropology, phenomena such as ethnic identities are not measurable in genetic percentages, in biologically determined racial types, or in any kind of determinant other than phenomena arising in the social sphere from which they emanate and are recreated.

While the “excess” of indigenous markers attributed to the Charrúa of the 1800s, always presented as negative characteristics, decreed their ethnocide, today a recovering identity, a shared past, and shared memories are not catalogued as sufficient markers to recognize them as indigenous, which shows that the Charrúa is always thought of as an inconvenient and untimely Indian. The suppression of the Guaraní from our history and the violence exercised against the mission colonies, like the campaigns of massacre and forced assimilation of the Charrúa, point to a double movement of a single state logic that drove both groups to an enforced “disappearance.” This “disappearance,” whether understood as the fruit of the campaigns of extermination or as the suppression of those presences from historiographical and contemporary production, must be examined in the light of the Charrúa re-emergence.

The Charrúa re-emergence

The identity claimed by today’s descendants and Charrúa is not an imitation of the one the Europeans recorded in the eighteenth century. As I noted at the outset, the first Charrúa collectives emerged in the late 1980s, after the return of democracy. Coming to identify as a descendant or as a Charrúa usually takes time and, above all, arduous research into the family’s branches. Often the “discovery” of belonging to an indigenous family comes about by accident, through earlier suspicions, or through stories the elders decide to put on the table. In such cases, descendants have told me that, after discovering their family’s indigenous origin, they came to recognize in their own everyday practices, such as forms of healing, myths, and even phenotypic traits, practices known to belong to Charrúa culture. But beyond traditional practices, what the descendants share is being bearers of memories of pain. What stands out, and what unifies this population, is the memory of trauma, shame, and fear that they share or that their forebears passed down to them. They carry their ancestors’ memories and keep alive the weight of the persecutions, of the imperative to keep silent, of the use of names of European origin. The fear of declaring oneself indigenous in public, the sharing of personal histories only behind closed doors, and the shame of recognizing oneself as indigenous in a country that prides itself on its European origins are some of the traits they hold in common.

Why these underground memories, to echo the words of Pollak (2006), burst into the public arena after the dictatorship can be attributed to many causes. What should interest us, however (and social anthropologists above all), is what forms indigenous re-emergence is taking in Uruguay, what the mechanisms and strategies of this irruption of memory are, what stories these people have to tell, what grammars and memories they share. How long will we go on failing to take seriously the trajectories of more than 76,000 members of our society and more than 30 years of struggle for the vindication of Charrúa identity?

* Francesca Repetto holds a master’s degree in Social Anthropology from the Graduate Program in Social Anthropology of the Museu Nacional, Federal University of Rio de Janeiro, where she is now a doctoral candidate.

References

[i] For more detail, see the thematic report “El perfil demográfico y socioeconómico de la población uruguaya según su ascendencia racial,” by Marisa Bucheli and Wanda Cabella. INE, n.d.

[ii] See, for example, “Entrevista Pi Hugarte y los charrúas.” Montevideo Portal. http://www.montevideo.com.uy/auc.aspx?104044. Accessed: 20/12/2014. Or also: Vidart, Daniel. “No hay indios en el Uruguay contemporáneo.” Anuario de Antropología Social y Cultural en Uruguay, Vol. 10, 2012, pp. 251-257.

[iii] Acosta y Lara, 2006, vol. II, p. 85.

[iv] Some authors have counted as many as 20,000 Guaraní in various documentary collections in the country; however, the archival documentation consulted in the Archivo General de la Nación speaks of 8,000 mission Indians. For more detail, see Gonzáles y Rissotto and Susana Rodríguez Varese, “Contribuciones al estudio de la influencia guaraní en la formación de la sociedad uruguaya.” Revista Histórica, 1982, pp. 199-316.

[v] The colony of San Servando, in the department of Cerro Largo, was founded in 1833. For more detail on the situation of the mission colonies in the country, see the “Informe Uruguay,” prepared for the Comisión del Patrimonio Cultural de la Nación. PROPIM, 2012.

[vi] Archivo General de la Nación: Ministerio de Guerra y Marina, Caja 1199, Foja 60, 23.01.1832.

[vii] Archivo General de la Nación: Ministerio de Guerra y Marina, Caja 1209, Foja 1, 27.12.1832.

[viii] Archivo General de la Nación: Ministerio de Gobierno y Relaciones Exteriores, Caja 214, Expediente 48, 01.01.1860.

[ix] Archivo General de la Nación: Ministerio de Guerra y Marina, Caja 1190, Foja 7.

[x] Archivo General de la Nación: Ministerio de Gobierno, Caja 1187, Foja 25.

[xi] VIDART, Daniel. “No hay indios charrúas en el Uruguay contemporáneo.” N.d. Link: http://www.bitacora.com.uy/auc.aspx?3988,7. Accessed 01/05/2019.

[xii] For more information, see the works of Isabel Barreto, Diego Bracco, and Carmen Curbelo.

Bibliography

POLLAK, Michael. Memoria, olvido, silencio. La producción social de identidades frente a situaciones límite. La Plata, Argentina: Ed. Al Margen, 2006.

REPETTO, A. Francesca. Arqueología do apagamento. Narrativas de desaparecimento charrúa no Uruguai desde 1830. Master’s thesis, Programa de Pós-graduação em Antropologia Social, Museu Nacional/UFRJ. Rio de Janeiro, 2017.

VIDART, Daniel. No hay indios charrúas en el Uruguay contemporáneo. N.d. Link: http://www.bitacora.com.uy/auc.aspx?3988,7. Accessed 01/05/2019.

Lilia Schwarcz: The pandemic marks the end of the 20th century and points to the limits of technology (UOL Universa)

Camila Brandalise and Andressa Rovani, April 9, 2020

One and a half million people infected around the world, a third of them in the last week. Eighty-seven thousand dead, at a disconcerting speed. Travel at a standstill. Millions of people forced to refit their routines to the confines of their homes. One hundred days ago, the world stopped.

On December 31, 2019, a communiqué from the Chinese government alerted the World Health Organization to cases of a pneumonia “of unknown origin” recorded in the south of the country. Still unnamed, the novel coronavirus would go on to reach 180 countries or territories. “It is incredible to reflect on how radically the world has changed in such a short period of time,” says WHO director-general Tedros Ghebreyesus.

For one of Brazil’s leading historians, teachers in the future will need to devote a few classes to explaining what we are living through today, a moment that, for her, can be compared to the New York stock market crash of 1929. “The crash also seemed unimaginable,” says Lilia Schwarcz, a professor at the University of São Paulo and at Princeton, in the U.S. “The class will be called: The Day the Earth Stood Still.”

Schwarcz further suggests that the crisis caused by the spread of covid-19 marks the end of the 20th century, a period defined by technology. “We had great technological development, but now the pandemic is showing its limits,” she says.

Below are excerpts from the interview, in which the historian compares the coronavirus to the Spanish flu of 1918, says that denialism about disease has always existed, and argues that great health crises have produced national heroes, such as Oswaldo Cruz and Carlos Chagas, and reinforced faith in science.

It has now been 100 days since the first case of coronavirus, in China, was reported to the World Health Organization. Can we say these 100 days have changed the world?

It is astonishing that such a tiny, minuscule, invisible thing has the capacity to paralyze the planet. It is an astonishing experience to watch. I was teaching at Princeton [university in the U.S.], and it was striking to see the institutions close one after another. It is something we knew only from the past, or from dystopias; it was more of a fantasy.

One never leaves a state of anomaly the same way one entered it. Crises of this kind close and open doors. We are deprived of our routine, unable to see the people we love and sorely miss, unable to keep our commitments.

But it also opens doors: we are reflecting a little on whether this accelerated routine is really necessary, whether all those plane trips are necessary, whether everyone needs to leave home and come back at the same time. Whether we could be more flexible, less congested, with less pollution.

So perhaps it opens [an opportunity] to reflect on certain values, such as solidarity. Anyone who claims to know what will happen is mistaken; humanity is very stubborn. But I think we are living through a very singular situation, of another temporality, in a different time. That may break down some barriers: we are living in a country of deep denialism. In Brazil we live a paradoxical situation: the president denies the pandemic.

But is the world, at this moment, already a different one?

At this very moment, as we speak, the world has changed. We who were so unerring with our draconian schedules: suddenly someone invites me to an event in September and I say, “Look, I don’t know whether I’ll be able to go, whether I can confirm.” This humanization of our schedules, of our times, I think has already changed, yes.

Staying at home means reinventing your routine, discovering yourself a stranger [to the new routine]. I know myself as a person who wakes up in the morning, goes running, goes to work, then to the next thing, and comes home exhausted. Now it is me having to invent myself in a different temporality, one that looks like vacation but isn’t. It is an inner movement of rediscovery.

I insist that not everyone goes through this. [The French philosopher] Montaigne said: “Humanity is various.” Not everyone is going through this in the same way; it depends on race and class, there are differences, it varies a great deal.

And with regard to the social roles of men and women?

We women already have a knowledge distinct from men’s when it comes to the notion of care, in the home. I think the change will be greater for men, who are not used to the daily life of the house, to cooking and tidying. This idea of care has been an eminently female function.

And I am very interested to see how men will deal with this idea of staying at home and having to provide care as well. It is a very singular experience we are living.

Some argue that the 20th century lacked a “milestone” to mark its end, and that the first decades of the 21st century were still dealing with the inheritance of the previous one. Do you agree? Can this pandemic serve as that dividing line?

Yes. [The British historian Eric] Hobsbawm said the long 19th century only ended after the First World War [1914-1918]. We use the time marker: the century turned, everything changed. But it doesn’t work that way; it is human experience that constructs time. He was right: the long 19th century ended with the First World War, with deaths, with the experience of mourning, but also with what it revealed about our destructive capacity.

I think this pandemic of ours marks the end of the 20th century, which was the century of technology. We had great technological development, but now the pandemic is showing its limits.

It shows that technology cannot contain a pandemic like this one, nor keep your routine going in a situation like this. The great word of the late 19th century was progress. Euclides da Cunha said: “We are condemned to progress.” It seemed almost natural; it culminated in that society which liked to call itself civilization.

What did the First World War show? That [the world] was not as civilized as people imagined. People fought one another face to face. That exposed, at that moment, the limits of the notion of civilization and evolution, which was perhaps the great myth of the late 19th and early 20th century. And we are moving limits again. We invested so much in technology, but not in the health and prevention systems that could contain this great invisible enemy.

You have pointed out that the Spanish flu killed far more people than the two World Wars combined and that, much as in Brazil today, there was a great deal of denialism and slowness in decision-making. Did we not learn that lesson? Why is it so hard not to repeat the mistakes?

Disease, whatever it may be, produces a feeling of fear and insecurity. Faced with this kind of health crisis, our first reaction is to say: “No, not here, it won’t get in here.” Before it becomes a pandemic, the deaths are distant; this “not here” discourse is very clear, it is natural, with all the scare quotes one may add, because the state we want is health. But we are also a society that forgets its own body; the body is for putting on clothes, combing one’s hair, as if it did not exist.

Acknowledgment takes time; denialism has always existed. At the beginning of the century, in 1903, life expectancy was 33 years. Brazil was called one great hospital and had every kind of disease: leprosy, syphilis, tuberculosis, bubonic plague, yellow fever. When [President] Rodrigues Alves took office and appointed a public-health physician to fight yellow fever, bubonic plague, and smallpox, they began by killing rats and mosquitoes and then moved on to vaccinating against smallpox.

But at the time the population did not understand, was not informed, and reacted. The same president who appointed Oswaldo Cruz was the one in power during the Spanish flu. Oswaldo Cruz had already died, so he appointed Cruz’s heir, Carlos Chagas. [With the Spanish flu] the Brazilian authorities already knew what was happening, and even so they took no action. The flu came aboard ships that docked in Brazil and then exploded. But the attitude was always the same: “Not here, this is a country with a hot climate, not a country of old people.”

How can anyone speak of lower risk in Brazil because the population is younger, when the country is far more unequal than the European nations already suffering? Denialism creates scapegoats; it is a recurring pattern.

But why don’t we learn from the mistakes of the past?

Because denialism denies history too. It says: “In 1918 we didn’t have the conditions we have now, we didn’t have the technology.” So history itself can be used in a denialist way, denying the past and claiming that what happened back then will not happen now.

When we speak of war, what happens? Why does every country have an army and keep a reserve? Because, on the hypothesis of a war, we must have an army; there is a whole reserve population in case war comes.

If the Brazilian state took the war metaphor seriously, what should already have been done? A structure to fight health wars. And this is not only Brazil: states do not do it; there is no system for preventing pandemics.

A disease only exists when people agree that it exists; the population has to be taught. Without that leadership, people do not construct the disease and go on denying it.

The reactions to the Spanish flu were very similar to today’s: few people in the streets, those who went out wore masks, churches closed, theaters washed with detergent. Humanity has yet to invent another way of dealing with a pandemic other than waiting for the drug or the vaccine.

We have grown used to the discourse that the elderly will die almost inevitably if infected. What does that reveal about the way we treat older people?

It shows that we are a society that prizes youth, and what does it do with history and with the elderly? It turns everything into old junk. I personally do not think youth is a quality. It is a way of being in the world. You can be young in old age, or be a young old person. This construction of ours around youth does a great deal of harm.

And there is the question each of us must ask: does anyone have the right to say who may die and who may not? If we take better care of vulnerable populations, and that includes the elderly, we will be taking better care of ourselves, not only symbolically but practically.

What does it mean not to deal with old age? It is one of our ways of not dealing with death; we do not know how to speak of mourning. We do not see the president utter a word of solidarity with the families of those who have died; it is as if he refused to speak of death.

We keep stretching our timeline, people are not allowed to grow old, and at the same time we are destroying our subjective capacity. Old age is seen only as a moment of decrepitude. These are not values prized by the population or by our century.

It has to do with technology as well: the old person is the one who cannot handle it. So isolate him. And let him wait for death.

Are miracle cures also part of the history of pandemics?

We all always hope for a miracle. Our arrogance is more or less this: we think we are a very rational society, guided by technology, yet we all always hope for a miracle.

Everyone wants to hear the president say: “I have a drug that will end all this.” What magical thinking is that? Will the crisis change the world? That depends on how far people step out of magical thinking and reflect on their castles of certainties.

Does the pandemic bring any change to the history of women?

The question of women is also a question of gender and social class. Middle- and upper-class women have many resources and can deal with work more freely. It is very different for poor, black women, who experience this situation even more acutely. There are many black and brown nurses. The nurse’s position is also one of care, for patients and even for doctors; she performs in the health system the role she plays inside her own home.

And these women are vulnerable both because many of them are toiling in hospitals without the necessary protection and because they are toiling in their own homes.

The 20th and 21st centuries are the centuries of the feminist revolution, as is already becoming apparent. Women will not turn back. We will have a reality marked by a new position for women.

I hope people use this moment to rethink their truths, and among the many truths [that need rethinking] is this often invisible question of gender: women occupy the positions of care without being seen.

How will a history teacher explain the 2020 pandemic 100 years from now?

They will explain it the way the crash of the New York Stock Exchange is explained today. This pandemic will deserve a few classes. The crash also seemed unimaginable, and we are living through situations that are anomalies in this sense, because they are unimaginable.

The history teacher will have to grapple with the fact that the pandemic may mark the end of one century and the beginning of another, and also with how it managed to stop a world of such activity, such turnover, and such speed. We accelerated enormously, and now we have had to stop.

The title of the class will be: “The Day the Earth Stood Still.”

The threat of the pandemic has also amplified the voices of those who call attention to the precarious housing and health conditions of a significant share of Brazilians. Is the crisis also an opportunity for social change?

Brazil keeps climbing toward the top of the social-inequality rankings; there are social classes with vastly different access to the benefits of our much-proclaimed civilization. Brazil is the 6th most unequal country in the world. We tend to deny inequality too. I don’t think it will be worse for the lower classes than for the elderly; both are highly vulnerable groups [at risk of severe illness].

During the Spanish flu, the hardest-hit groups were the poor populations of the outskirts. The victims were between 20 and 40 years old, but many more died in the name of civilization, because the poor had been expelled [from the center]. And epidemics are merciless. When people say “Stay home, keep your distance,” we have to reflect on the conditions in which these populations live.

In a Brazil so multiple, with such different social conditions, the poorest will be the populations hardest hit. Brazil also has the world’s third-largest prison population. What will happen if the pandemic enters the prisons keeps me up at night. If it hasn’t already, without our knowing. If that happens, when it reaches the poorest, we will have to confront how perverse the correlation between pandemic and social inequality is.

In Brazil, where health care is divided between private and public, people with higher incomes do not even consider using the public system. Disease does this: it levels, because it strikes across social classes.

Can we already glimpse any lessons from the current crisis?

I think so. Several countries are already starting to think about the reserve army: how to build a structure that not only reacts to a pandemic but also anticipates it.

The problem is that in Brazil we are living under a government that does not believe in science. Let us see whether we finally learn to value what science produces. At times like this it becomes clearer: the way out will come from science, with the vaccine or the drug that brings the pandemic under control.

I would not be surprised if our next presidents were doctors. What we are learning, in country after country, is the importance of the health ministry, and of having genuine specialists in the ministries, counting not merely on a politician but on a specialist politician.


What major political change can we already say the pandemic has brought to Brazil?

It is happening. The president was overruled by his own health minister. Mandetta was going to be dismissed, but [the president] backed down under pressure. You can already see these figures growing in stature, as happened at the time of the Vaccine Revolt [1904], whose great hero was Oswaldo Cruz; during the Spanish flu, Carlos Chagas became a great national hero.

I hope these people, if they reach those positions, do not use them to amass more power; I very much hope they use that position generously.

Politics is like cachaça: once you’ve had a taste, you never give it up. So citizens should not lower their guard toward physician-politicians. Mandetta, who is filling his post well, was deeply ideological, his career tied to private health insurance, and, out of ideology, he did away with the Mais Médicos program.

People used to look at us academics and say: “You are parasites.” I hope people reflect and come to understand that the world of production has different temporalities.

One thing is the time of industry, of technology, which is a matter of seconds. Another is the time of the scientist, who works in a more extended temporality to discover new ways out. People will begin to understand, as they did at the time of the Spanish flu, why Carlos Chagas became more popular than singers and football players; the cartoons of the day said as much.

Science, once the villain, is today the great utopia.

An anthropologist and historian, Lilia Schwarcz is a full professor at the University of São Paulo and a visiting professor at Princeton University, in the U.S. She is the author of a series of books, among them “Sobre o autoritarismo brasileiro,” “Espetáculo das raças,” and “Brasil: Uma biografia.” She is an editor at Companhia das Letras, a columnist for the journal Nexo, and adjunct curator for histories at Masp.

For Decades, Our Coverage Was Racist. To Rise Above Our Past, We Must Acknowledge It (National Geographic)

We asked a preeminent historian to investigate our coverage of people of color in the U.S. and abroad. Here’s what he found.

In a full-issue article on Australia that ran in 1916, Aboriginal Australians were called “savages” who “rank lowest in intelligence of all human beings.” PHOTOGRAPHS BY C.P. SCOTT (MAN); H.E. GREGORY (WOMAN); NATIONAL GEOGRAPHIC CREATIVE (BOTH)

This story helps launch a series about racial, ethnic, and religious groups and their changing roles in 21st-century life. The series runs through 2018 and will include coverage of Muslims, Latinos, Asian Americans, and Native Americans.


 “Cards and clay pipes amuse guests in Fairfax House’s 18th-century parlor,” reads the caption in a 1956 article on Virginia history. Although slave labor built homes featured in the article, the writer contended that they “stand for a chapter of this country’s history every American is proud to remember.” PHOTOGRAPH BY ROBERT F. SISSON AND DONALD MCBAIN, NATIONAL GEOGRAPHIC CREATIVE (RIGHT)

I’m the tenth editor of National Geographic since its founding in 1888. I’m the first woman and the first Jewish person—a member of two groups that also once faced discrimination here. It hurts to share the appalling stories from the magazine’s past. But when we decided to devote our April magazine to the topic of race, we thought we should examine our own history before turning our reportorial gaze to others.

Race is not a biological construct, as writer Elizabeth Kolbert explains in this issue, but a social one that can have devastating effects. “So many of the horrors of the past few centuries can be traced to the idea that one race is inferior to another,” she writes. “Racial distinctions continue to shape our politics, our neighborhoods, and our sense of self.”

How we present race matters. I hear from readers that National Geographic provided their first look at the world. Our explorers, scientists, photographers, and writers have taken people to places they’d never even imagined; it’s a tradition that still drives our coverage and of which we’re rightly proud. And it means we have a duty, in every story, to present accurate and authentic depictions—a duty heightened when we cover fraught issues such as race.

Photographer Frank Schreider shows men from Timor island his camera in a 1962 issue. The magazine often ran photos of “uncivilized” native people seemingly fascinated by “civilized” Westerners’ technology. PHOTOGRAPH BY FRANK AND HELEN SCHREIDER, NATIONAL GEOGRAPHIC CREATIVE

We asked John Edwin Mason to help with this examination. Mason is well positioned for the task: He’s a University of Virginia professor specializing in the history of photography and the history of Africa, a frequent crossroads of our storytelling. He dived into our archives.

What Mason found, in short, was that until the 1970s National Geographic all but ignored people of color who lived in the United States, rarely acknowledging them beyond laborers or domestic workers. Meanwhile it pictured “natives” elsewhere as exotics, famously and frequently unclothed, happy hunters, noble savages—every type of cliché.

Unlike magazines such as Life, Mason said, National Geographic did little to push its readers beyond the stereotypes ingrained in white American culture.

National Geographic of the mid-20th century was known for its glamorous depictions of Pacific islanders. Tarita Teriipaia, from Bora-Bora, was pictured in July 1962—the same year she appeared opposite Marlon Brando in the movie Mutiny on the Bounty. PHOTOGRAPH BY LUIS MARDEN, NATIONAL GEOGRAPHIC CREATIVE (RIGHT)

“Americans got ideas about the world from Tarzan movies and crude racist caricatures,” he said. “Segregation was the way it was. National Geographic wasn’t teaching as much as reinforcing messages they already received and doing so in a magazine that had tremendous authority. National Geographic comes into existence at the height of colonialism, and the world was divided into the colonizers and the colonized. That was a color line, and National Geographic was reflecting that view of the world.”

Some of what you find in our archives leaves you speechless, like a 1916 story about Australia. Underneath photos of two Aboriginal people, the caption reads: “South Australian Blackfellows: These savages rank lowest in intelligence of all human beings.”

Questions arise not just from what’s in the magazine, but what isn’t. Mason compared two stories we did about South Africa, one in 1962, the other in 1977. The 1962 story was printed two and a half years after the massacre of 69 black South Africans by police in Sharpeville, many shot in the back as they fled. The brutality of the killings shocked the world.

An article reporting on apartheid South Africa in 1977 shows Winnie Mandela, a founder of the Black Parents’ Association and wife of Nelson. She was one of some 150 people the government prohibited from leaving their towns, speaking to the press, and talking to more than two people at a time. PHOTOGRAPH BY JAMES P. BLAIR, NATIONAL GEOGRAPHIC CREATIVE

“National Geographic’s story barely mentions any problems,” Mason said. “There are no voices of black South Africans. That absence is as important as what is in there. The only black people are doing exotic dances … servants or workers. It’s bizarre, actually, to consider what the editors, writers, and photographers had to consciously not see.”

Contrast that with the piece in 1977, in the wake of the U.S. civil rights era: “It’s not a perfect article, but it acknowledges the oppression,” Mason said. “Black people are pictured. Opposition leaders are pictured. It’s a very different article.”

Fast-forward to a 2015 story about Haiti, when we gave cameras to young Haitians and asked them to document the reality of their world. “The images by Haitians are really, really important,” Mason said, and would have been “unthinkable” in our past. So would our coverage now of ethnic and religious conflicts, evolving gender norms, the realities of today’s Africa, and much more.

“I buy bread from her every day,” Haitian photographer Smith Neuvieme said of fellow islander Manuela Clermont. He made her the center of this image, published in 2015. PHOTOGRAPH BY SMITH NEUVIEME, FOTOKONBIT

Mason also uncovered a string of oddities—photos of “the native person fascinated by Western technology. It really creates this us-and-them dichotomy between the civilized and the uncivilized.” And then there’s the excess of pictures of beautiful Pacific-island women.

“If I were talking to my students about the period until after the 1960s, I would say, ‘Be cautious about what you think you are learning here,’ ” he said. “At the same time, you acknowledge the strengths National Geographic had even in this period, to take people out into the world to see things we’ve never seen before. It’s possible to say that a magazine can open people’s eyes at the same time it closes them.”

April 4 marks the 50th anniversary of the assassination of Martin Luther King, Jr. It’s a worthy moment to step back, to take stock of where we are on race. It’s also a conversation that is changing in real time: In two years, for the first time in U.S. history, less than half the children in the nation will be white. So let’s talk about what’s working when it comes to race, and what isn’t. Let’s examine why we continue to segregate along racial lines and how we can build inclusive communities. Let’s confront today’s shameful use of racism as a political strategy and prove we are better than this.

For us this issue also provided an important opportunity to look at our own efforts to illuminate the human journey, a core part of our mission for 130 years. I want a future editor of National Geographic to look back at our coverage with pride—not only about the stories we decided to tell and how we told them but about the diverse group of writers, editors, and photographers behind the work.

We hope you will join us in this exploration of race, beginning this month and continuing throughout the year. Sometimes these stories, like parts of our own history, are not easy to read. But as Michele Norris writes in this issue, “It’s hard for an individual—or a country—to evolve past discomfort if the source of the anxiety is only discussed in hushed tones.”

Climate Change – Catastrophic or Linear Slow Progression? (Armstrong Economics)

Indeed, science was turned on its head after the discovery in 1772 near Vilui, Siberia, of an intact frozen woolly rhinoceros, followed by the more famous discovery of a frozen mammoth in 1787. You may be shocked, but these discoveries of frozen animals with grass still in their stomachs set in motion the two schools of thought, catastrophic versus slow linear change, since the evidence implied you could be eating lunch and suddenly find yourself frozen, only to be discovered by posterity.


The discovery of the woolly rhinoceros in 1772, and then of frozen mammoths, sparked the imagination that things were not linear after all. These major discoveries contributed to the “Age of Enlightenment,” when a burst of knowledge erupted in every field of inquiry. Finds of frozen mammoths in Siberia continue to this day and have challenged theorists on both sides of the debate to explain such catastrophic events. These frozen animals of Siberia recall the casts of the victims buried alive at Pompeii in the volcanic eruption of AD 79 in Roman Italy: animals could be grazing one moment and be frozen abruptly the next. That climate change came long before man invented the combustion engine.

Even the field of geology began to host great debates over whether the earth periodically burst into catastrophic convulsions, that is, whether the planet was cyclical rather than linear. This view of sequential destructive upheavals at irregular intervals, or cycles, emerged during the 1700s. It was perhaps best expressed by a forgotten contributor to human knowledge, George Hoggart Toulmin, in his rare 1785 book, The Eternity of the World:

“… convulsions and revolutions violent beyond our experience or conception, yet unequal to the destruction of the globe, or the whole of the human species, have both existed and will again exist … [terminating] … an astonishing succession of ages.”

Id., pp. 3, 110


In 1832, Professor A. Bernhardi argued that the North Polar ice cap had once extended into the plains of Germany. To support this theory, he pointed to the huge boulders known as “erratics,” which he suggested had been pushed along by the advancing ice. This was a shocking theory, for it was certainly a nonlinear view of natural history; Bernhardi was thinking outside the box. In natural science, however, people listen to and review such theories, unlike in social science, where theories are ignored if they challenge what people want to believe. In 1834, Johann von Charpentier (1786-1855) argued that deep grooves cut into the Alpine rock led to the conclusion, which Karl Schimper shared, that they had been caused by an advancing Ice Age.

This body of knowledge has been completely ignored by the global warming/climate change religious cult. They know nothing about nature or cycles, and they are ignorant of history, even of the fact that it was the discovery of these ancient creatures, frozen with food still in their mouths, that set the whole debate in motion. They can explain neither these events nor the vast body of knowledge written by people who actually did research, instead of trying to cloak an agenda in pretend science.

Glaciologists have their own word, jökulhlaup (from Icelandic), for the spectacular outbursts that occur when water builds up behind a glacier and then breaks loose. An example was the 1922 jökulhlaup in Iceland, when some seven cubic kilometers of water, melted by a volcano under a glacier, rushed out in a few days. Grander still, almost unimaginably so, were the floods that swept across Washington state toward the end of the last ice age, when a vast lake dammed behind a glacier broke loose. Catastrophic geologic events are not generally part of the uniformitarian geologist's thinking; the normal view tends to be linear, encompassing events that are local or regional in size.

One example of a regional event is the 15,000 square miles of the Channeled Scablands in eastern Washington. Initially, this spectacular erosion was thought to be the product of slow, gradual processes. In 1923, J Harlen Bretz presented a paper to the Geological Society of America suggesting that the Scablands had been eroded catastrophically. During the 1940s, after decades of argument, geologists admitted that the high ridges in the Scablands were the equivalent of the little ripples one sees in mud on a streambed, magnified ten thousand times. Finally, by the 1950s, glaciologists had become accustomed to thinking about catastrophic regional floods. The Scablands are now accepted to have been catastrophically eroded by the “Spokane Flood,” the result of the breaching of an ice dam that had created glacial Lake Missoula. The United States Geological Survey estimates that the flood released 500 cubic miles of water, which drained in as little as 48 hours and gouged out millions of tons of solid rock.
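To get a feel for the scale those figures imply, here is a minimal back-of-the-envelope calculation in Python using only the numbers quoted above (500 cubic miles draining in roughly 48 hours); the comparison with the Amazon's average discharge is an added reference point, not from the article.

```python
# Back-of-the-envelope: what "500 cubic miles in as little as 48 hours" implies.
MILE_M = 1609.344                     # meters per mile
volume_m3 = 500 * MILE_M**3           # ~2.08e12 cubic meters
seconds = 48 * 3600                   # 48 hours in seconds

discharge = volume_m3 / seconds       # average flow in cubic meters per second
print(f"volume: {volume_m3:.2e} m^3")               # ~2.08e12 m^3
print(f"average discharge: {discharge:.2e} m^3/s")  # ~1.2e7 m^3/s

# Reference point (not from the article): the Amazon, the world's largest
# river, averages roughly 2.1e5 m^3/s, so this mean flow is on the order
# of 55-60 Amazons running at once.
print(f"Amazon equivalents: {discharge / 2.1e5:.0f}")
```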

When Mount St. Helens erupted in 1980, it too produced a catastrophic process: two hundred million cubic yards of material was deposited by volcanic flows at the base of the mountain in a matter of hours. Less than two years later, a minor eruption created a mudflow that carved channels through the recently deposited material. These channels, 1/40th the size of the Grand Canyon, exposed flat segments between the catastrophically deposited layers, which is what we see between the layers exposed in the walls of the Grand Canyon. What is clear is that these events were relatively minor compared to a global flood. The eruption of Mount St. Helens involved only 0.27 cubic miles of material, whereas other eruptions have produced as much as 950 cubic miles – roughly 3,500 times the size of Mount St. Helens!

As for the Grand Canyon, the specific geologic processes and timing of its formation have always sparked lively debate among geologists. The general scientific consensus, updated at a 2010 conference, holds that the Colorado River carved the Grand Canyon beginning 5 million to 6 million years ago. This thinking is still linear and by no means catastrophic: the canyon is believed to have been gradually eroded. However, there is an example of cyclical behavior in nature demonstrating that water can very rapidly erode even solid rock. It took place in the Grand Canyon region on June 28th, 1983, when an overflow of Lake Powell required the first-ever use of the Glen Canyon Dam's 40-foot-diameter spillway tunnels. As the volume of water increased, the entire dam started to vibrate and large boulders spewed from one of the spillways. The spillway was immediately shut down, and an inspection revealed that catastrophic erosion had cut through the three-foot-thick reinforced concrete walls and carved a hole 40 feet wide, 32 feet deep, and 150 feet long in the sandstone beneath the dam. Nobody had thought such rapid catastrophic erosion was even possible.
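For scale, treating the eroded hole as a simple rectangular box (an approximation used here only to gauge magnitude), the quoted dimensions work out as follows:

```python
# Rough scale of the 1983 Glen Canyon spillway erosion, from the dimensions
# quoted above; the box shape is an approximation for magnitude only.
width_ft, depth_ft, length_ft = 40, 32, 150
volume_ft3 = width_ft * depth_ft * length_ft   # 192,000 cubic feet
volume_m3 = volume_ft3 * 0.0283168             # cubic feet -> cubic meters
print(f"{volume_ft3:,} ft^3 (~{volume_m3:,.0f} m^3) of sandstone removed in days")
```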

Some have speculated that the end of the Ice Age released a flood of water that had been contained by an ice dam, like that of the Scablands, and that a sudden catastrophic release of water originally carved the Grand Canyon. Both the formation of the Scablands and the way the Mount St. Helens episode unfolded may support the catastrophic formation of such features rather than nice, slow, linear processes.

Then there is the biblical account of the Great Flood and Noah. Noah is also considered a prophet of Islam. Darren Aronofsky's film Noah was based on the flood story in Genesis. Some Christians were angry because the film strayed from biblical Scripture, while several Muslim-majority countries banned Noah from their theaters because Noah is a prophet of God in the Koran, and they considered it blasphemous to make a film about a prophet.

The story of Noah predates the Bible. There exists a legend of the Great Flood rooted in the ancient civilizations of Mesopotamia. The Sumerian Epic of Gilgamesh, which dates back nearly 5,000 years, is believed to be perhaps the oldest written tale on Earth. Here too we find an account of the great sage Utnapishtim, who is warned of an imminent flood to be unleashed by wrathful gods. He builds a vast circular boat, reinforced with tar and pitch, and loads it with his relatives, grain, and animals. After enduring days of storms, Utnapishtim, like Noah in Genesis, releases a bird in search of dry land. Since there is evidence of survivors in different parts of the world, it is only logical that there should be more than one such account.

Archaeologists generally agree that there was a historical deluge between 5,000 and 7,000 years ago, striking lands ranging from the Black Sea to what many call the cradle of civilization: the floodplain between the Tigris and Euphrates rivers. The translation of ancient cuneiform tablets in the 19th century confirmed the Mesopotamian Great Flood myth as an antecedent of the Noah story in the Bible.

The problem that remained was the question of just how “great” the Great Flood was. Was it regional or worldwide? The stories of the Great Flood in Western culture clearly date back before the Bible. The region implicated has long been considered to be the Black Sea. It has been suggested that the water broke through the land near Istanbul and flooded a fertile valley on the other side, much as in the Scablands. Robert Ballard, one of the world's best-known underwater archaeologists and the man who found the Titanic, set out to test that theory by searching for an underwater civilization. He discovered an ancient shoreline some four hundred feet below the surface – proof that a catastrophic event did happen in the Black Sea. By carbon dating shells found along the underwater shoreline, Ballard dated the event to around 5,000 BC, which may match the time when Noah's flood could have occurred.

Given that it is impossible for enough water to submerge the entire Earth for 40 days and 40 nights and then simply vanish, we are probably looking at a Great Flood that was, at the very least, regional. However, tales of the Great Flood spring from many other sources. Various ancient cultures have their own legends of a Great Flood and salvation. According to Vedic lore, a fish tells the mythic Indian king Manu of a Great Flood that will wipe out humanity; Manu too builds a ship to withstand the epic rains and is later led to a mountaintop by the same fish.

We also find an Aztec story that tells of a devout couple hiding in the hollow of a vast tree with two ears of corn as divine storms drown the wicked of the land. Creation myths from Egypt to Scandinavia likewise involve tidal floods of all sorts of substances purging and remaking the earth. That we have Great Flood stories from India is no real surprise, since there was contact between the Middle East and India throughout recorded history. The Aztec story lacks the ship, but it retains the punishment of the wicked, and here there was certainly no direct contact – although traces of cocaine found in Egyptian mummies imply that some trade route existed, probably by island-hopping across the Pacific to the shores of India and on to Egypt. Obviously, we cannot rule out that the story of the Great Flood made it even to South America.

Then again, there is the story of Atlantis – the island that sank beneath the sea. The Atlantic Ocean covers approximately one-fifth of Earth's surface and is second in size only to the Pacific Ocean. The ocean's name, derived from Greek mythology, means the “Sea of Atlas.” The origins of names often provide interesting clues. For example, New Jersey is the English rendering of the Latin Nova Caesarea, which appeared even on the colonial coins of the 18th century. Hence the state of New Jersey is named after the Island of Jersey, which in turn was named in honor of Julius Caesar. So we actually have an American state named after the man who changed the world on a par with Alexander the Great.

So here the Atlantic Ocean is named after Atlas and the story of Atlantis. The original story of Atlantis comes to us from two Socratic dialogues called Timaeus and Critias, both written about 360 BC by the Greek philosopher Plato. According to the dialogues, Socrates asked three men to meet him: Timaeus of Locri, Hermocrates of Syracuse, and Critias of Athens. Socrates asked the men to tell him stories about how ancient Athens interacted with other states. Critias was the first to tell the story. Critias explained how his grandfather had met with the Athenian lawgiver Solon, who had been to Egypt where priests told the Egyptian story about Atlantis. According to the Egyptians, Solon was told that there was a mighty power based on an island in the Atlantic Ocean. This empire was called Atlantis and it ruled over several other islands and parts of the continents of Africa and Europe.

Atlantis was arranged in concentric rings of alternating water and land. The soil was rich and the engineers were technically advanced. The architecture was said to be extravagant with baths, harbor installations, and barracks. The central plain outside the city was constructed with canals and an elaborate irrigation system. Atlantis was ruled by kings but also had a civil administration. Its military was well organized. Their religious rituals were similar to that of Athens with bull-baiting, sacrifice, and prayer.

Plato told us about the metals found in Atlantis: gold, silver, copper, tin, and the mysterious Orichalcum. Plato said that the city walls were plated with Orichalcum (brass), a rare alloy in those days, found both in Crete and in the Andes of South America. An ancient shipwreck discovered off the coast of Sicily in 2015 contained 39 ingots of Orichalcum, and many claimed this proved the story of Atlantis. Orichalcum was believed to have been a gold/copper alloy that was cheaper than gold but twice the value of copper. In reality, Orichalcum was a copper-tin or copper-zinc brass. In Virgil's Aeneid, the breastplate of Turnus is described as “stiff with gold and white orichalc.”

The monetary reform of Augustus in 23 BC reintroduced bronze coinage, which had vanished after 84 BC. Here we see the introduction of Orichalcum for the Roman sestertius and dupondius, while the Roman as was struck in nearly pure copper. So about 300 years after Plato, we do see Orichalcum introduced into the monetary system of Rome. It is clear that Orichalcum was rare at the time Plato wrote; his tale of walls plated with it reads like the later stories of America, where there was supposedly so much gold that the streets were paved with it.

As the story is told, Atlantis was located in the Atlantic Ocean. Bronze Age anchors have been discovered at the Pillars of Hercules (Strait of Gibraltar), and many people proclaimed this proved Atlantis was real. What these proponents fail to take into account is the Minoans, who ran perhaps the first international economy. They traded far and wide, even with Britain, seeking tin to make bronze – hence the Bronze Age. Theirs was a Bronze Age civilization that arose on the island of Crete and flourished from approximately the 27th century BC to the 15th century BC – nearly 1,200 years. Their trading range and colonization extended to Spain, Egypt, Israel (Canaan), Syria (the Levant), Greece, Rhodes, and of course Turkey (Anatolia). Many other cultures referred to them as the people from the islands in the middle of the sea. Yet the Minoans had no mineral deposits. They lacked gold and silver, and even the capacity for large-scale copper mining; they appear to have had copper mines in colonized cities in Anatolia (Turkey). What survive are examples of copper ingots that served as MONEY in trade. Keep in mind that gold at this point was too rare to truly serve as MONEY; it is found largely as jewelry in the tombs of royal dignitaries.

The Bronze Age emerged at different times around the globe, appearing in Greece and China around 3,000 BC but coming late to Britain, which it reached only about 1900 BC. Copper emerged as a valuable material in Anatolia (Turkey) as early as 6,500 BC, where it began to replace stone in the making of tools, and the development of copper casting appears to have aided the urbanization of man in Mesopotamia. By 3,000 BC, copper was in wide use throughout the Middle East and starting to move up into Europe. Copper in its pure state came first; tin was eventually added to create true bronze – a bronze sword would break a copper one. It was this addition of tin that really propelled the transition from copper to bronze, and the tin came from England, where vast deposits existed in Cornwall. We know that the Minoans traveled into the Atlantic for trade; anchors are not conclusive evidence of Atlantis.

As the legend unfolds, Atlantis waged an unprovoked imperialistic war on the rest of Asia and Europe. When Atlantis attacked, Athens showed its excellence as the leader of the Greeks, the much smaller city-state being the only power to stand against Atlantis. Alone, Athens triumphed over the invading Atlantean forces, defeating the enemy, preventing the free from being enslaved, and freeing those who had been enslaved. This part may certainly be embellished and remains doubtful at best. Following the battle, however, there were violent earthquakes and floods; Atlantis sank into the sea, and all the Athenian warriors were swallowed up by the earth. This is almost certainly a fiction built on ancient political realities, though some have argued that the explosive disappearance of an island refers to the Minoan eruption of Santorini. The story of Atlantis also closely tracks the concern of Plato's Republic with the deteriorating cycle of life in a state.


There have been theories that Atlantis was the Azores, and still others argue it was actually South America, which would go some way toward explaining the cocaine mummies in Egypt. Whatever the theory, when an ancient story survives, however embellished, there is often a grain of truth hidden deep within. In this case, Atlantis may not have submerged completely; it could have partially submerged in an earthquake that at least some people survived. Survivors could have made it either to the Americas or to Africa and Europe. What is clear is that a sudden event could have sent a tsunami into the Mediterranean, which then broke through the land mass at Istanbul and flooded the valley beyond, transforming the region into the Black Sea and giving rise to the story of Noah.

We also have evidence that the Earth was struck by a comet around 12,800 years ago. Scientific American has reported that sediments from six sites across North America – Murray Springs, Ariz.; Bull Creek, Okla.; Gainey, Mich.; Topper, S.C.; Lake Hind, Manitoba; and Chobot, Alberta – have yielded tiny diamonds, which only occur in sediment exposed to extreme temperatures and pressures. The evidence implies that the Earth then moved into an ice age, killing off the large mammals and setting the course for Global Cooling over the next 1,300 years. This may indeed explain the catastrophic freezing of the woolly mammoths in Siberia. Such an event could also lie behind the legend of Atlantis, with the survivors migrating and taking their stories with them.

There is also evidence surfacing from stone carvings at one of the oldest sites on record, located in Anatolia (Turkey). Using a computer program to show where the constellations would have appeared above Turkey thousands of years ago, researchers were able to pinpoint a comet strike to 10,950 BC, the onset of the Younger Dryas: the return to glacial conditions and Global Cooling that temporarily reversed the gradual warming underway since the Last Glacial Maximum began to recede around 20,000 BC, a chronology drawn from Greenland ice core data.

Now, there is a very big asteroid that passed by the Earth on September 16th, 2013. What is most disturbing is that its cycle is 19 years, so it will return in 2032, and astronomers have not been able to swear that it will not hit the Earth on that next pass. It was discovered by Ukrainian astronomers in 2013 with only about 10 days' notice, and the 2013 pass came at a distance of just 4.2 million miles (6.7 million kilometers). If anything alters its orbit, it will come closer and closer. It happens to line up on a cyclical basis, which suggests we should begin looking at how to deflect asteroids, and soon.

It definitely appears that catastrophic cooling may be linked to the Earth being struck by meteors, asteroids, or comets. We are clearly headed into a period of Global Cooling, and this will get worse as we head into 2032. The question becomes: is our model also reflecting that it is once again time for an Earth change caused by an asteroid encounter? Such events are not DOOMSDAY and the end of the world; they seem to be regional. Still, a comet striking North America could have abruptly changed the climate, freezing the animals in Siberia.

If there is a tiny element of truth in the story of Atlantis, one thing it certainly proves is clear: there are ALWAYS survivors. Based upon a review of the history of civilization as well as of climate, what resonates profoundly is that events follow a cyclical model of catastrophic occurrences rather than a linear, steady, slow progression of evolution.

Should Brazil change the way it deals with the memory of slavery? (BBC Brasil)

29 October 2016

The 'Mãe Preta' exhibition, in Rio de Janeiro

Brazil received the majority of the enslaved Africans sent to the Americas

A room holding pieces of a ship that was carrying 500 enslaved women, children, and men to Brazil is the main attraction of the new museum of black American history in Washington.

On a Monday in October, there was a 15-minute wait in line to enter the room with objects from the São José – Paquete de África, in the basement of the National Museum of African American History and Culture.

Opened in September by the Smithsonian Institution, the museum cost the equivalent of R$ 1.7 billion and has become the most sought-after museum in the American capital: tickets are sold out until March 2017.

In 1794, the São José left the Island of Mozambique, in East Africa, loaded with people to be sold as slaves in São Luís do Maranhão. The Portuguese vessel sank off the coast of South Africa, and 223 captives died.

Visitors – most of them black Americans – walked in silence through the room, which simulates the hold of a slave ship, past iron ballast from the São José and shackles used on other vessels (one pair, smaller in circumference, was meant for women or children).

“We had 12 blacks who drowned themselves voluntarily and others who fasted to death, because they believe that when they die they return to their country and their friends,” reads the account of another ship's captain, posted on the wall.

Proof of existence

Displaying pieces of a slave ship was an obsession of the museum's director, Lonnie Bunch. In an interview with The Washington Post, he said he had traveled the world in search of the objects, “the only tangible proof that these people really existed.”

Wreckage of the São José was discovered in 1980, but only in 2010-2011 did researchers locate documents in Lisbon that made it possible to identify the ship. An agreement between South African marine archaeologists and the Smithsonian sealed the transfer of the pieces to Washington.

The National Museum of African American History and Culture

Opened in September, the National Museum of African American History and Culture cost the equivalent of R$ 1.7 billion. HANDOUT/SMITHSONIAN

That the São José's destination was Brazil is no coincidence, says Luiz Felipe de Alencastro, professor emeritus at the University of Paris-Sorbonne and one of the leading specialists in the history of transatlantic slavery.

He tells BBC Brasil that Brazil was the final destination of 43% of the enslaved Africans sent to the Americas, while the United States took in only 0.5%.

According to a study by Emory University (USA), 4.8 million Africans entered Brazilian ports over the course of slavery, the highest figure of any country in the hemisphere.

That contingent, eight times the number of Portuguese who entered Brazil up to 1850, is why Alencastro likes to say that Brazil “is not a country of European colonization, but of African and European colonization.”

The influx of Africans also explains why Brazil is the country with the most people of African descent outside Africa (according to the IBGE, 53% of Brazilians consider themselves black or brown).

Why, then, does Brazil have no museums or monuments on slavery comparable to the new African American museum in Washington?

Apartheid and the plundering of Africa

For Alencastro, one must consider the different ways in which Brazil and the US dealt with slavery and its aftermath.

He says that in the US there was greater exploitation of blacks born in the country, which would eventually result in “a radical form of legal racism, of apartheid.”

Children play at the National Museum of African American History and Culture

The museum has become one of the most sought-after in the American capital, with tickets sold out until March. BBC BRASIL / JOÃO FELLET

Until the 1960s, laws in parts of the US segregated blacks and whites in public spaces, buses, bathrooms, and restaurants. Until 1967, interracial marriage was illegal in some American states.

In Brazil, Alencastro says, slavery “was concentrated far more on the exploitation of Africans and the plundering of Africa,” though Brazilians avoid taking responsibility for those processes.

He says many in the country blame the Portuguese for slavery, but Brazilians played a central role in the expansion of the Atlantic slave trade.

Alencastro recounts that the Kingdom of Kongo, in West Africa, was toppled in 1665 in a battle ordered by the government of the then captaincy of Paraíba.

“The front ranks of the troops were mulattos from Pernambuco who went to wreak havoc in Africa and bring down an independent kingdom,” he says.

Neighboring Kongo, Angola was also invaded by militiamen from Brazil and spent several years under Brazilian control, becoming the main point of departure for slaves bound for the country.

“These histories are largely hidden and do not surface in Brazil,” he says.

Historical reparations

For Brazilian scholar Ana Lucia Araújo, a professor at Howard University in Washington, “Brazil is still far behind the US” in how it treats the history of slavery.

“Here (in the US) it is recognized that the money made on the backs of slaves helped build the country, whereas in Brazil that is denied,” she says.

The author of several studies on slavery in the Americas, Araújo notes that until the dictatorship (1964-1985) the “ideology of racial democracy” – the notion that whites and blacks lived together in harmony – remained strong in Brazil.

Policies to mitigate the effects of slavery, such as quotas for black students at public universities and the demarcation of quilombola territories, are recent in Brazil.

Hardware used on slave ships

Displaying pieces of a slave ship was an obsession of the museum's director. HANDOUT/SMITHSONIAN

She says that few museums in Brazil yet address slavery, “and when they do, they portray the Afro-Brazilian population in a negative, demeaning way.”

According to the professor, one of the few spaces celebrating Afro-Brazilian culture and history is the Museu Afro Brasil, in São Paulo, but that institution owes its existence mainly to the personal initiative of its founder, the artist Emanoel Araújo.

And only in recent years has Rio de Janeiro begun to discuss what to do with the Cais do Valongo, the largest slave-receiving port in the world. Maintained by volunteers for several years, the site this year became a candidate for UNESCO World Heritage status.

For the professor, museums and monuments about slavery “do not improve people's lives, but they promote a kind of symbolic reparation by having the history of these populations recognized in public space.”

Visibility and representation

For Mozambican journalist and researcher Rogério Ba-Senga, slavery and other chapters of the shared history of Brazil and Africa have little visibility in the country because “in Brazil, whites still hold a monopoly on the social representation of blacks.”

“There are many black people thinking about and researching black culture in Brazil, but the decision-making center is still white,” says Ba-Senga, who has lived in São Paulo since 2003.

For him, the picture will change when blacks become more numerous in the Brazilian media – “so that they put these subjects on the agenda” – and in public institutions.

For Alencastro, even if the Brazilian state avoids dealing with slavery, the subject will surface through the initiative of other groups.

“African nations that were plundered have become independent. In those countries there are people studying the subject, and there is a potentially growing immigration of Africans to Brazil,” he says.

On another front, the professor says that Brazilian movements in the urban peripheries and quilombola groups are pressing for these subjects to gain ground.

“Today there is a disconnect between academia and the debate within the popular movement, but soon enough it will all come together, not least because the majority of the Brazilian population is of African descent. Black people are the majority here.”

Catalog brings to light Indigenous History and Black Slavery in Brazil (Nossa Ciência)

RESEARCH Monday, 31 October 2016

Mônica Costa

Credit: personal archive

The research examined documents on colonial-era Brazil held at the Arquivo Histórico Ultramarino in Lisbon, Portugal

“We have an immense, fantastic body of documentation linked to the Quilombo dos Palmares and to the slave trade. On the indigenous question, we have countless peoples and ethnic groups who knew how to negotiate with the Portuguese Crown, and indigenous leaders who knew how to fight for their own people.” So says Professor Juciene Ricarte, of the Universidade Federal de Campina Grande (UFCG), about the contents of the General Catalogs of Loose Manuscripts and Codices on Indigenous History and Black Slavery in Brazil Held at the Arquivo Histórico Ultramarino, recently launched in Campina Grande (PB).

Over four years, the project analyzed 136,506 entries/documents dated between 1581 and 1843 held at the Arquivo Histórico Ultramarino (AHU) in Lisbon, Portugal. The team examined letters, reports, petitions, royal letters, charters, provisions, consultations, and travel accounts, among other documents produced by the Portuguese administrative bureaucracy dealing with indigenous affairs and slavery in colonial Brazil. According to the project's coordinator, the documents speak of everyday life in the Colony. “There are situations and narratives of struggle in mocambos, mocambos of indigenous people and blacks, uprisings in mission villages, the lives of these protagonists,” she explains.

The War of the Barbarians

In the end, the project cataloged 6,009 entries: 3,052 on Indigenous History and 2,957 on black slavery, covering every region of the country. A professor in the Graduate Program in History, Juciene is confident that many monographs, dissertations, theses, and books will grow out of these two collections, which also include digital copies of all the documents. “Researchers, anthropologists, historians, and memorialists will be able to use these documents because there is more here than catalog volumes with summaries. For example, if someone wants to research the Potiguara and the Jandui in the War of the Barbarians, they search the DVD and, using the entry number, find the digitized image of the manuscript. They can print it, save it on their own computer, enlarge or reduce it, and there is accessibility for the blind as well.”

Besides producing the catalog volumes, the project recorded the entries in audio to make them accessible to the visually impaired. “This format is extremely necessary and important for our accessibility goals with respect to historical sources. No other information system dealing with documentation on ethnic questions has been produced that allows not only reading the entries but also listening to the summaries,” she noted.

Exterminated peoples

Having completed a postdoctorate at the Universidade Nova de Lisboa in 2015, Juciene says the researchers found histories and memories of the everyday life of many indigenous groups that no longer exist: struggles and resistance by many who gave their own lives, ethnocides, peoples exterminated while fighting for their traditional territories, and others adapting to the new colonial reality in order to survive.

She explains that it was in the colonial period that the first interethnic relations formed between indigenous and non-indigenous people, blacks and non-blacks, masters and slaves, missionaries and indigenous peoples: the first relations of what would become the Brazilian people. The ethnohistorian argues that the documentation shows men and women who were subjects of their own time, and that both indigenous people and black men and women managed to build relationships and sociability in a world that was often one of domination. “When the indigenous mission villages were set up under the tutelage of missionaries and the Church, and later, under the Marquis of Pombal's law, when the Jesuits were expelled, these indigenous people knew very well how to carve out new spaces in that colonial world. Even when they stopped speaking their own language, because they were forced to, they remained indigenous; they resisted, they fought, and at the same time they tried to survive the reality imposed on them. So too the black men and women. The idea of black slavery was that they would simply submit. Absolutely not! They built negotiations or committed acts of violence in an attempt to survive in a world of slavery,” she recounts.

Northeast

As for what is now the Northeast, the project coordinator explains that important documentation was cataloged on ethnic groups that no longer exist in Rio Grande do Norte or Paraíba: the indigenous peoples of the sertão, of the so-called War of the Barbarians, along with other ethnic groups that lived along the São Francisco River, and many mocambos and quilombos. These documents can even be used in anthropological reports for the identification of traditional territories, both indigenous and black.

Explaining the work's importance, the professor says it does not speak only of the past, because blacks and indigenous people still exist today, fighting for ethnic and racial inclusion. “In history textbooks themselves, it is as if Indigenous History and the History of Black People in Brazil did not matter. With a research instrument like this, we change the national historiography and help these protagonists, the real historical subjects of Brazil, find their place in the writing of History,” she says.

The cultural products – catalog volumes and DVDs – were produced by a team of 43 researchers from the Universidade Federal de Campina Grande (UFCG) in partnership with the Fundação Parque Tecnológico da Paraíba (PaqTcPB), through the 2010 Petrobras Cultural Program. Funding came to R$ 500,000. Each catalog has a print run of two thousand copies, to be distributed to all Brazilian universities, historical archives, and the Black and Indigenous movements. Distribution will be free of charge and restricted to institutions.

Indigenist policy and the failed 19th-century indigenous settlement project (Pesquisa Fapesp)

30 June 2016

José Tadeu Arantes | Agência FAPESP – There is a substantial historiography on the first 250 years of contact between indigenous peoples and the European conquerors of what is now Brazilian territory. The barter between outsiders and natives, the repeated attempts to enslave the Indians, Jesuit catechesis, and indigenous protagonism in major episodes such as the Tamoio War are reasonably well known. But after the defeat of the Guarani of the Jesuit Missions in the mid-18th century, the accounts thin out, reappearing only in the 20th century with the intensified push into the interior. The 19th century, in particular, seems devoid of Indians. Present in the poetry and prose of Romantic literature, the indigenous person is the great absentee in the pages of history.

With the help of Italian Capuchin friars, the Empire sought to bring the Indians into line geographically and culturally, but their veiled resistance reshaped the whole undertaking (image: Cacique Pahi Kaiowá, Aldeamento de Santo Inácio do Paranapanema. Franz Keller, 1865 / Carneiro, Newton: Iconografia Paranaense, Curitiba, Impressora Paranaense, 1950)

Yet the 19th century was the stage for the Brazilian state's first indigenist policy, and that policy is the subject of the book Terra de índio: imagens em aldeamentos do Império, by Marta Amoroso, published with support from FAPESP. The book grew out of Amoroso's doctoral and postdoctoral research, both funded by FAPESP.

“This state policy, based on the 'Program of Catechesis and Civilization of the Indians' and instituted by decree of Emperor Pedro II, consisted of settling the indigenous populations in aldeamentos. It served two main purposes: on the one hand, to integrate the Indian, as a rural worker, into the young Brazilian nation; on the other, to free up lands previously used by the indigenous people for the European immigrants then beginning to arrive in the colonies of the country's Southeast,” the researcher told Agência FAPESP.

Pedro II was only 19 when, on 24 June 1845, he signed the decree creating the aldeamentos, which lasted until the end of the Second Reign in 1889. Aldeamentos were established in every Brazilian province. To administer and run them, the Empire asked the Vatican's Propaganda Fide, forerunner of today's Congregation for the Evangelization of Peoples, to send Italian friars of the Capuchin Order to Brazil. About a hundred Capuchin missionaries disembarked in the country and were soon dispatched to the four corners of the Empire to meet the indigenous populations.

“Unlike the Jesuits, who were active in the first centuries of colonization, the Capuchins had no project of autonomy for the Indians. They were pragmatic and bureaucratic, most of them of rural origin, barely speaking Portuguese. And they were hired as salaried government employees. They were enlisted in the program of creating the Brazilian nation, of building a people out of mixture. It was a program of erasure of indigenous identity, and the Capuchins did their utmost to put it into practice,” Amoroso said.

Most of the documentation used in the book came from an archive of this religious order in Rio de Janeiro. “The Capuchins left minutely detailed reports and letters, full of ultra-fine administrative detail. Along with the accounts of 19th-century travelers, it was these documents, religious and official at the same time, that provided the database for my work,” she said.

The researcher began by mapping the aldeamentos of the Empire. She then narrowed her focus to the aldeamento system of Paraná, especially its central hub, São Pedro de Alcântara, on the banks of the Tibagi River, for which the documentation was very rich. “This aldeamento, near the town of Castro, gathered some 4,000 Indians from four ethnic groups: the Kaingang, of the Macro-Jê linguistic stock, and the Kaiowá, Nhandeva, and Mbyá, who speak Guarani. Considered docile farmers, the Guarani-Kaiowá, who today suffer brutal violence in land conflicts, were brought from Mato Grosso to Paraná in the expectation that they would populate the government's aldeamentos and produce provisions to supply the Brazilian army in the so-called Paraguayan War [1864-1870],” Amoroso recounted.

According to the researcher, several attempts were made to make the São Pedro de Alcântara aldeamento economically productive: food crops, coffee, tobacco, and so on. All of them failed, until the enterprise finally prospered with the installation of a distillery for aguardente (cane liquor). “There was a whole effort by the Capuchins, very well documented, to set up that distillery. It is incredible that one of the greatest calamities ever visited on indigenous populations, alcoholism, was officially promoted,” she commented.

The Catechesis Program

The 'Program of Catechesis and Civilization of the Indians' drew on an idea of tutelage over indigenous populations that went back to the Apontamentos para a civilização dos índios bravos do Império do Brasil, written in 1823 by José Bonifácio de Andrada e Silva, and, before that, to the directives laid down by the Marquis of Pombal after the expulsion of the Jesuits from the Portuguese Empire in the second half of the 18th century.

As Amoroso writes, the Pombaline indigenist model, revived in the Empire's aldeamentos, contrasted in its very conception with the ideal of autonomy pursued by the Jesuit missions. Hence the emphasis on mixing the Indians with the other inhabitants of towns and villages, on settler migration into regions traditionally inhabited by indigenous people, and on the forced displacement of the Indians, as well as on attempts to ban the use of indigenous languages and of Nheengatu, the so-called 'general language' that grew out of the mixture of indigenous languages with Portuguese.

Partly for this reason, the Empire's aldeamentos were not areas of confinement, and the Indians were not held inside them. “The aldeamentos were conceived as agricultural colonies, whose headquarters housed the missionaries and hired staff along with the most important production units. These headquarters administered indigenous villages located relatively nearby. Although the ideology behind the aldeamentos sanctioned population transfers, in most cases these forced removals never actually took place. The friars held tutelage over villages that already existed and went on existing,” the researcher added.

The very idea of tutelage seems to have been regarded as a provisional solution. A letter sent by the Imperial Palace to the president of the Province of São Paulo in 1847, two years after the decree creating the aldeamentos, set out their guiding principle: 'to wrest from the wandering life the multitude of savages who roam our forests, to gather them into society, to instill in them the love of work, and to provide them with the comforts of civil life, until they can appreciate its advantages and live from some trade or industry.' The same letter ordered the provincial president to prevent the aldeamento from taking in indigenous people and their descendants who were already integrated into society, 'merged into the general mass of the population.'

One point the researcher singled out is that, alongside each aldeamento, the Empire also installed a military garrison. “The Military Colonies are evidence of a policy of war on the Empire's internal frontiers, in counterpoint to the 'gentleness toward the Indians' of imperial propaganda,” she said.

The book highlights the strategies the indigenous peoples deployed against the 'Program of Catechesis and Civilization of the Indians,' staging an undeclared resistance in the territories then administered by the government. “Taking Guarani participation in one of these aldeamentos as an example, one glimpses interethnic conflicts and the great mobility of individuals and family groups around the installed facilities. The Guarani repeatedly abandoned São Pedro de Alcântara, often because they found it impossible to share the aldeamento space with the staff and clergy and with the other indigenous collectives settled there. The Kaingang, for their part, not only imposed their presence in aldeamentos originally designed for the Guarani-Kaiowá but remained in some of the units even after the public authorities had abandoned them,” Amoroso wrote.

Bargaining chips

In general, though, what the research brought out was the great mobility of the indigenous groups, who stayed in their own villages and visited the aldeamentos only occasionally.

“This was favored by the fact that, in the 19th century, there was still a large area available for movement. Soon after the Land Law of 1850, private ranches were being established and European settlers were arriving, but the Indians could still range across vast expanses, and they visited civilization only when it suited them. Individuals who had already been baptized in one aldeamento would present themselves at another as 'savages' seeking help, and they used the goods supplied by the missionaries as 'currency' in dealings with other indigenous groups. The friars grumbled and bristled at this mobility, but there was nothing they could do: without ever saying 'no,' often playing dumb, the Indians mounted a veiled resistance that ended up prevailing,” Amoroso argued.

Thus, despite the Capuchins' managerial zeal, the aldeamento policy failed. Productive activities did not prosper, funds dwindled, facilities decayed, and indigenous politics triumphed over bureaucratic regimentation. “Not even the directive that the Indians communicate only in Portuguese worked,” the researcher added.

Something the research took pains to highlight was the 'other side' of this history, that is, how the indigenous groups themselves experienced the process. “Working through the documentation to compose an ethnography allowed me to see that there was every kind of outcome: whole groups were exterminated, like the Guarani-Kaiowá of São Pedro de Alcântara, who died in a cholera epidemic in the 1860s; groups remained, like the Kaingang, who inhabit the region to this day; and groups moved through the aldeamentos, with great mobility of the Nhandeva and Mbyá recorded across the four decades,” she said.

Many of the Empire's aldeamentos are now indigenous lands. One example is the São Jerônimo aldeamento, today the São Jerônimo da Serra Indigenous Post, on the banks of the Tigre River, a tributary of the Tibagi, in Paraná. Created in 1859 out of the donation of the São Jerônimo ranch by the Baron of Antonina, it saw its original area of 33,880 hectares drastically reduced to just over 1,339 hectares. Yet its modest population has grown steadily: 133 people in 1945; 285 in 1975; 380 in 2005, according to the Portal Kaingang.

The Guarani population of the state of São Paulo, put at 3,593 people of the Nhandeva and Mbyá ethnic groups in 2013, has likewise been growing at 4.5% a year, far above the Brazilian average (0.9% in 2013 and 0.8% in 2016) (source: www.rau.ufscar.br/wp-content/uploads/2015/05/vol5no1_03.Juracilda.pdf).
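To make those growth rates concrete, here is a minimal sketch of the compound-growth arithmetic they imply, assuming the rates were to hold constant (an illustration, not a demographic projection):

```python
# What the stated annual growth rates imply under constant compound growth.
import math

def doubling_time(rate):
    """Years for a population to double at a fixed annual growth rate."""
    return math.log(2) / math.log(1 + rate)

guarani_rate = 0.045   # 4.5% per year (stated for São Paulo's Guarani)
brazil_rate = 0.008    # 0.8% per year (stated national average for 2016)

print(f"Guarani doubling time: {doubling_time(guarani_rate):.0f} years")  # ~16
print(f"national doubling time: {doubling_time(brazil_rate):.0f} years")  # ~87

# Projecting the stated 2013 figure of 3,593 people ten years ahead:
print(f"implied 2023 population: {3593 * (1 + guarani_rate) ** 10:.0f}")  # ~5,580
```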

This tendency toward demographic recovery among heavily devastated populations is a known phenomenon, documented today in Africa. It is also underway in Brazil, whose indigenous population fell from an estimated 5 million in 1500 to some 400,000 today (spread across roughly 200 ethnic groups and 170 languages).

“Even more persistent than physical survival has been the survival of cultural practices. The thread that seems to vanish at one point reappears farther along, often in surprising ways. I discovered, for example, that the Guarani group that welcomed the celebrated German ethnologist Curt Unckel (1883-1945) to São Paulo in 1906 came precisely from the aldeamento I had studied in Paraná. It was they who gave him the name Nimuendajú, which the German adopted in place of his surname Unckel upon becoming a naturalized Brazilian. In his fieldwork, Curt Nimuendajú interviewed great Guarani shamans who had lived through the aldeamentos. They told him of the exhausting labor they had to perform under the direction of the Capuchins, and of their first contacts with the drugs of civilization: sugar and cachaça.”

Curt Nimuendajú became a landmark of ethnology, almost as legendary in his own lifetime as the Greenlandic ethnologist Knud Rasmussen (1879-1933). It is revealing that the search for an indigenous perspective on the aldeamento policy should converge on this ethnologist's trajectory, and that an anthropological study of the attempt to regiment the indigenous peoples institutionally should lead to the figure of the man who witnessed with his own eyes, in the 1920s, the most impressive manifestation of the Guarani people's fidelity to their cultural roots: the last great migration to the East, in search of the mythical 'Land without Evil' (Yvy marã e'ỹ).

50 years of calamities in South America (Pesquisa Fapesp)

Earthquakes and volcanoes kill more, but droughts and floods affect far more people

MARCOS PIVETTA | ISSUE 241 | MARCH 2016

A study of the impacts of 863 natural disasters recorded over the last five decades in South America indicates that relatively rare geological phenomena, such as earthquakes and volcanism, produced nearly twice as many deaths as the more frequent climatic and meteorological events, such as floods, landslides, storms, and droughts. Of the roughly 180,000 deaths caused by these disasters, 60% were due to earthquakes and volcanic activity, types of occurrence concentrated in the Andean countries, such as Peru, Chile, Ecuador, and Colombia. Earthquakes and volcanism accounted for, respectively, 11% and 3% of the events tallied in the study.

Approximately 32% of the deaths stemmed from meteorological or climatic events, a category covering four out of five natural disasters recorded in the region between 1960 and 2009. Epidemics – a type of biological disaster for which regional data are scarce, according to the survey – claimed 15,000 lives, 8% of the total. In Brazil, 10,225 people died from natural disasters over those five decades, slightly more than 5% of the total, most of them in floods and landslides during storms.

Drought in the Northeast...

The study was carried out by geographer Lucí Hidalgo Nunes, a professor at the Institute of Geosciences of the University of Campinas (IG-Unicamp), for her habilitation thesis, and resulted in the book Urbanização e desastres naturais – Abrangência América do Sul (Oficina de Textos), released in the middle of last year. “Since the 1960s, South America's urban population has been larger than its rural one,” says Lucí. “The main stage for natural calamities has been urban space, which keeps growing both in the area cities occupy and in the number of inhabitants.”

The picture reverses when the parameter analyzed is the number of people affected rather than the number killed. Of the 138 million non-fatal victims of these events, 1% were struck by epidemics, 11% by earthquakes and volcanism, and 88% by climatic or meteorological phenomena. Droughts and floods affected the most people: the great droughts hit 57 million (41% of all those affected) and the floods 52.5 million (38%). Brazil accounted for about 85% of the non-fatal victims of droughts, essentially residents of the Northeast, and for a third of those affected by floods, mostly inhabitants of the big cities of the South and Southeast.
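A quick sanity check of the shares reported above, assuming the stated totals:

```python
# Verifying the reported shares: 138 million affected in total,
# 57 million by droughts and 52.5 million by floods.
total_affected = 138e6
drought, flood = 57e6, 52.5e6

print(f"droughts: {drought / total_affected:.0%}")  # ~41%
print(f"floods:   {flood / total_affected:.0%}")    # ~38%
```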

...and flooding in Caracas, Venezuela: these two types of disaster affect the largest numbers of people

Estimated at US$ 44 billion over the five decades, the material losses associated with the nearly 900 disasters tallied stemmed, in 80% of cases, from climatic or meteorological phenomena. “Brazil has almost 50% of South America's territory and more than half its population, yet it was the stage for only 20% of the disasters, 5% of the deaths, and 30% of the economic losses associated with these events,” says Lucí. “The number of people affected here, however, was high: 53% of everyone hit by disasters in South America. We still have vulnerabilities, but not as many as countries like Peru, Colombia, and Ecuador.”

To write the study, the geographer compiled, organized, and analyzed the records of natural disasters over the last five decades in the countries of South America, plus French Guiana (an overseas department of France), stored in EM-DAT, the International Disaster Database. This database holds information on more than 21,000 natural disasters worldwide from 1900 to the present. It is maintained by the Centre for Research on the Epidemiology of Disasters (CRED), based at the School of Public Health of the Université catholique de Louvain in Brussels, Belgium. “No database is perfect,” Lucí concedes. “EM-DAT is weak, for example, on biological disasters.” Its advantage is that it gathers information from different sources – non-governmental agencies, United Nations bodies, insurance companies, research institutes, and the media – and archives it all under a single methodology, an approach that makes comparative studies possible.

What counts as a disaster
Events are recorded in EM-DAT as natural disasters if they meet at least one of four conditions: they kill at least 10 people; affect 100 or more individuals; prompt the declaration of a state of emergency; or trigger an appeal for international assistance. In her study of South America, Lucí organized the disasters into three broad categories subdivided into 10 types of occurrence. Geophysical phenomena comprise earthquakes, volcanic eruptions, and dry mass movements (such as a rock tumbling down a hillside on a rainless day). Meteorological or climatic events cover storms, floods, landslides, temperature extremes (abnormal heat or cold), droughts, and fires. Epidemics are the only type of biological disaster counted (see table).
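The entry criteria described above amount to a simple disjunction, sketched below; the field names are illustrative, not EM-DAT's actual schema:

```python
# Minimal sketch of the EM-DAT entry rule: an event counts as a disaster
# if it meets at least one of the four stated conditions.
from dataclasses import dataclass

@dataclass
class Event:
    deaths: int
    affected: int
    emergency_declared: bool
    international_aid_requested: bool

def qualifies_as_disaster(e: Event) -> bool:
    return (
        e.deaths >= 10
        or e.affected >= 100
        or e.emergency_declared
        or e.international_aid_requested
    )

# Example: a flood with 3 deaths but 5,000 people affected still qualifies.
print(qualifies_as_disaster(Event(3, 5000, False, False)))  # True
```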

Climatologist José Marengo, head of the research division of the National Center for Monitoring and Early Warning of Natural Disasters (Cemaden), in Cachoeira Paulista, in the interior of São Paulo State, points out that besides natural events there are disasters classified as technological, as well as hybrid cases. The rupture last November of a tailings dam belonging to the mining company Samarco in Mariana (MG), which killed 19 people and released tons of toxic mud into the Rio Doce watershed, had no connection with natural events. It can be classified as a technological disaster, one whose causes are tied to human action. In 2011, a magnitude 9.0 earthquake followed by tsunamis, the largest in Japan's history, killed nearly 16,000 people, injured 6,000 and left 2,500 missing. It also destroyed some 138,000 buildings. Among the structures affected was the Fukushima nuclear power plant, whose reactors leaked radioactivity. "In that case, a technological disaster was caused by a natural disaster," Marengo says.

Decade after decade, records of natural disasters on the continent have increased, following what appears to be a global trend. "The quality of information about natural disasters has improved greatly in recent decades, and that swells the statistics," Nunes says. "But there also seems to be a real increase in the number of events." According to the study, much of the escalation is due to the growing number of high-intensity meteorological and climatic phenomena striking South America: in the 1960s there were 51 events of this type; in the 2000s the number rose to 257. Over the five decades, the incidence of geophysical disasters, which cause many deaths, remained more or less stable, and cases of epidemics declined.

Urban risk
The number of deaths from extreme events appears to be falling after peaking at 75,000 in the 1970s. In the past decade, natural disasters caused just over 6,000 deaths in South America, according to Nunes's survey. Historically, fatalities have been concentrated in a few occurrences of enormous proportions, especially earthquakes and volcanic eruptions. The 20 deadliest events (eight of them in Peru and five in Colombia) accounted for 83% of all deaths linked to natural phenomena between 1960 and 2009. The worst disaster was an earthquake in Peru in May 1970, with 66,000 deaths, followed by a flood in Venezuela in December 1999 (30,000 deaths) and a volcanic eruption in Colombia in November 1985 (20,000 deaths). Brazil accounts for the 9th deadliest event (the meningitis epidemic of 1974, with 1,500 deaths) and the 19th (a landslide triggered by heavy rains that killed 436 people in March 1967 in Caraguatatuba, on the coast of São Paulo State).

The number of people affected has also declined in recent years, though the figures remain high. In the 1980s, disasters produced some 50 million non-fatal victims in South America. In each of the past two decades, the number fell to around 20 million.

Seven in ten Latin Americans now live in cities, where haphazard land occupation and certain geoclimatic characteristics tend to increase the local population's vulnerability to natural disasters. Nunes compared 56 South American urban agglomerations with more than 750,000 inhabitants against five factors that raise the risk of calamity: drought, earthquake, flood, landslide and volcanism. Quito, the capital of Ecuador, was the only metropolis exposed to all five. Four Colombian cities (Bogotá, Cali, Cúcuta and Medellín) and La Paz, in Bolivia, came next, with four vulnerabilities each. Brazilian state capitals showed at most two risk factors, drought and flood (see box). "Disasters result from the combination of natural hazards and the vulnerabilities of the occupied areas," says Victor Marchezini, a sociologist at Cemaden who studies the long-term impacts of these extreme phenomena. "They are a socio-environmental event."
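The exposure comparison amounts to counting, for each city, how many of the five hazard factors apply. A minimal sketch follows; only Quito's factor set comes from the study's findings, and the other entries are placeholders for illustration.

```python
# Sketch of ranking cities by exposure to the five hazard factors.
# Only Quito's set reflects the study's finding (exposed to all five);
# the other entries are hypothetical placeholders.
RISK_FACTORS = {"drought", "earthquake", "flood", "landslide", "volcanism"}

exposure = {
    "Quito":               {"drought", "earthquake", "flood", "landslide", "volcanism"},
    "Hypothetical city A": {"flood", "landslide"},
    "Hypothetical city B": {"drought", "flood"},
}

def rank_by_exposure(exposure: dict[str, set[str]]) -> list[tuple[str, int]]:
    """Return (city, number of applicable factors), most exposed first."""
    return sorted(
        ((city, len(factors & RISK_FACTORS)) for city, factors in exposure.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

for city, n in rank_by_exposure(exposure):
    print(f"{city}: {n} of {len(RISK_FACTORS)} risk factors")
```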

It is hard to measure the cost of a disaster. But drawing on the 2013 edition of the Atlas brasileiro de desastres naturais, which uses a methodology different from the one the Unicamp geographer employed to count calamities in South America, the group of Carlos Eduardo Young, of the Institute of Economics of the Universidade Federal do Rio de Janeiro (UFRJ), produced a study late last year. Based on World Bank estimates of disaster losses in several Brazilian states, Young calculated that the flash floods, floods and mass movements that occurred between 2002 and 2012 caused economic losses of at least R$180 billion for the country. In general, the poorest states, such as those of the Northeast, suffered the largest economic losses relative to the size of their GDP. "Vulnerability to disasters may be inversely proportional to a state's degree of economic development," the economist says. "Climate change could sharpen the issue of regional inequality in Brazil."