Monthly archive: April 2016

Xenotext Genetic Poetry

synthetic zerØ

Genetics and Poetics

“Words on a page — that’s usually how we conceive of poetry. But Christian Bök, at the University of Calgary, has done something no other writer has ever done: as part of his recent project, The Xenotext, he’s enciphered a poem into a micro-organism, which then “rewrote” that poem as part of its biological response. His eventual hope is to encode a poem inside a near-indestructible bacterium (Deinococcus radiodurans) which may actually outlast human civilization.”

View original post

Weasel Apparently Shuts Down World’s Most Powerful Particle Collider (NPR)

April 29, 2016, 11:04 AM ET

GEOFF BRUMFIEL

The Large Hadron Collider uses superconducting magnets to smash sub-atomic particles together at enormous energies. CERN

A small mammal has sabotaged the world’s most powerful scientific instrument.

The Large Hadron Collider, a 17-mile superconducting machine designed to smash protons together at close to the speed of light, went offline overnight. Engineers investigating the mishap found the charred remains of a furry creature near a gnawed-through power cable.

A small mammal, possibly a weasel, gnawed through a power cable at the Large Hadron Collider. Ashley Buttle/Flickr

“We had electrical problems, and we are pretty sure this was caused by a small animal,” says Arnaud Marsollier, head of press for CERN, the organization that runs the $7 billion particle collider in Switzerland. Although they had not conducted a thorough analysis of the remains, Marsollier says they believe the creature was “a weasel, probably.” (Update: An official briefing document from CERN indicates the creature may have been a marten.)

The shutdown comes as the LHC was preparing to collect new data on the Higgs Boson, a fundamental particle it discovered in 2012. The Higgs is believed to endow other particles with mass, and it is considered to be a cornerstone of the modern theory of particle physics.

Researchers have seen some hints in recent data that other, yet-undiscovered particles might also be generated inside the LHC. If those other particles exist, they could revolutionize researchers’ understanding of everything from the laws of gravity to quantum mechanics.

Unfortunately, Marsollier says, scientists will have to wait while workers bring the machine back online. Repairs will take a few days, but getting the machine fully ready to smash might take another week or two. “It may be mid-May,” he says.

These sorts of mishaps are not unheard of, says Marsollier. The LHC is located outside of Geneva. “We are in the countryside, and of course we have wild animals everywhere.” There have been previous incidents, including one in 2009, when a bird is believed to have dropped a baguette onto critical electrical systems.

Nor are the problems exclusive to the LHC: In 2006, raccoons conducted a “coordinated” attack on a particle accelerator in Illinois.

It is unclear whether the animals are trying to stop humanity from unlocking the secrets of the universe.

Of course, small mammals cause problems in all sorts of organizations. Yesterday, a group of children took National Public Radio off the air for over a minute before engineers could restore the broadcast.

Argentine football club Tigre launches implantable microchip for die-hard fans (AFP)

April 26, 2016, 6:59pm

Tigre players hugging after a goal

PHOTO: Tigre fans won’t need hard copy tickets or ID to enter their stadium with the implanted microchip. (Reuters: Enrique Marcarian)

For football lovers so passionate that joining a fan club just isn’t enough, Argentine side Tigre has launched the “Passion Ticket”: a microchip that die-hards can have implanted in their skin.

In football-mad Argentina, fans are known for belting out an almost amorous chant to their favourite clubs: “I carry you inside me!”

First-division side Tigre said it had decided to take that to the next level and is offering fans implantable microchips that will open the stadium turnstiles on match days, no ticket or ID required.

“Carrying the club inside you won’t just be a metaphor,” the club wrote on its Twitter account.

Tigre secretary general Ezequiel Rocino kicked things off by getting one of the microchips implanted in his arm, under an already existing tattoo in the blue and red of the club.

The chips are similar to the ones dog and cat owners can have implanted in their pets in case they get lost.

Rocino showed off the technology for journalists, placing his arm near a scanner to open the turnstile to the club’s stadium 30 kilometres north of the capital, Buenos Aires.

“The scanner will read the data on the implanted chip, and if the club member is up-to-date on his payments, will immediately open the security turnstile,” the club said.

Rocino said getting a chip would be completely voluntary.

“We’re not doing anything invasive, just accelerating access. There’s no GPS tracker, just the member’s data,” he said.

AFP

Reptiles show brain activity typical of human dreams, study reveals (Folha de S.Paulo)

Sleeping dragon (Pogona vitticeps). [Credit: Dr. Stephan Junek, Max Planck Institute for Brain Research]
Study shows that lizards reach a sleep stage that, in humans, allows dreams to emerge

REINALDO JOSÉ LOPES
CONTRIBUTING TO FOLHA

April 28, 2016, 2:56 pm

Do lizards dream of scaly sheep? No one has yet been able to see in enough detail what happens in the brains of these animals to answer that question, but a new study reveals that the pattern of brain activity typical of human dreams also appears in these reptiles when they sleep.

This is the so-called REM sleep (the acronym for “rapid eye movement”), which until now seemed to be exclusive to mammals like us and to birds. However, analysis of the brain activity of an Australian lizard, the bearded dragon (Pogona vitticeps), indicates that over the course of the night the animal’s brain alternates between REM sleep and slow-wave sleep (roughly speaking, deep, dreamless sleep), in a pattern similar, though not identical, to the one observed in humans.

Led by Gilles Laurent of the Max Planck Institute for Brain Research, in Germany, the study is being published in the journal Science. “Laurent doesn’t mess around,” says Sidarta Ribeiro, a researcher at UFRN (Federal University of Rio Grande do Norte) and one of the world’s leading specialists in the neurobiology of sleep and dreams. “It is a very clear demonstration of the phenomenon.”

The methodology used to find out what was happening in the reptilian brain was not exactly rocket science. Five specimens of the species received electrode implants in the brain and, at bedtime, their behavior was monitored with infrared cameras, ideal for “seeing in the dark”. The animals usually slept between six and ten hours a night, in a cycle that could be more or less controlled by the Max Planck scientists, since they were the ones who switched the lights on and off and regulated the temperature of the enclosure.

What the researchers were measuring was the variation in electrical activity in the bearded dragons’ brains during the night. These oscillations are what produce the wave patterns already known from sleep in humans and other mammals, for example.

The findings reported in the new study were only possible because of its level of detail, says Suzana Herculano-Houzel, a neuroscientist at UFRJ (Federal University of Rio de Janeiro) and a Folha columnist. “Earlier, less fine-grained studies could not detect REM sleep because, in these animals, the alternation between the two types of sleep is extremely fast, every 80 seconds,” explains the researcher, who had already seen Laurent present the data at a scientific conference. In humans, the cycles are much slower, lasting on average 90 minutes.

Besides the similarity in the pattern of brain activity, the reptiles’ REM sleep also correlates clearly with the eye movements that give it its name (which vaguely resemble the way an awake person moves their eyes), as shown by the infrared images.

TO SLEEP, PERCHANCE TO DREAM

The first implication of the findings is evolutionary. Although sleeping appears to be a universal behavior in the animal kingdom, REM sleep (and perhaps dreaming) seemed to be the preserve of species with supposedly more complex brains. “For anyone studying the mechanisms of sleep, this is a fundamental study,” says Suzana.

As it happens, both mammals and birds descend from early groups related to reptiles, but at very different moments in the planet’s history: mammals had already been walking the Earth for tens of millions of years when a group of small carnivorous dinosaurs gave rise to birds. In other words, in theory, mammals and birds would have had to “learn to dream” entirely independently. The finding “resolves this paradox”, says Ribeiro: REM sleep would already have been present in the common ancestor of all these vertebrates.

The work of the Brazilian researcher and of other specialists around the world has shown that both types of sleep are essential for “sculpting” memories in the brain, simultaneously strengthening what is relevant and discarding what is not important. Without the alternating cycles of brain activity, the learning capacity of animals and humans would be seriously impaired.

Both Ribeiro and Suzana, however, say it is still not possible to state that lizards or other animals dream the way we do. “Perhaps one day someone will run magnetic resonance imaging on sleeping lizards and see whether they show the same reactivation of sensory areas seen in humans during REM sleep,” she says. “Of course dog owners are certain that their pets dream, but the ideal would be to decode the neural signal,” a technique that makes it possible to know what a person imagines seeing while dreaming and that has already been applied successfully by Japanese scientists.

Paranormal beliefs can increase number of déjà vu experiences (Science Daily)

Date:
April 27, 2016
Source:
British Psychological Society (BPS)
Summary:
A belief in the paranormal can mean an individual experiences more déjà vu moments in their life.

A belief in the paranormal can mean an individual experiences more déjà vu moments in their life.

This is one of the findings of a study by 3rd year undergraduate student Chloe Pickles and Dr Mark Moss, of Northumbria University, who will present their poster today, Thursday 28 April 2016, at the British Psychological Society’s annual conference in Nottingham. Over 100 participants completed surveys relating to perceived stress, belief in paranormal experiences and beliefs about déjà vu. Analysis of the results showed a strong link between belief in paranormal experiences and the frequency, pleasantness and intensity of déjà vu experiences. Stress was linked significantly to intensity and duration only.

Chloe Pickles said: “Our study calls into question whether stress increases the number of déjà vu moments for an individual. Previous research had not considered the impact of belief when experiencing the feeling that this moment has happened before. Déjà vu might be a normal experience for those more open to it as well as (or instead of) a consequence of negative life events.”

Why E O Wilson is wrong about how to save the Earth (AEON)

01 March, 2016

Robert Fletcher is an associate professor at the Sociology of Development and Change Group at Wageningen University in the Netherlands. His most recent book is Romancing the Wild: Cultural Dimensions of Ecotourism (2014).

Bram Büscher is a professor and Chair at the Sociology of Development and Change Group at Wageningen University in the Netherlands. His most recent book is Transforming the Frontier: Peace Parks and the Politics of Neoliberal Conservation in Southern Africa (2013).

Edited by Brigid Hains

A member of the military-style Special Ranger Patrol talks to a suspected rhino poacher on 7 November 2014 at the Kruger National Park, South Africa. Photo by James Oatway/Sunday Times/Getty

Edward O Wilson is one of the world’s most revered, reviled and referenced conservation biologists. In his new book (and Aeon essay) Half-Earth, he comes out with all guns blazing, proclaiming the terrible fate of biodiversity, the need for radical conservation, and humanity’s centrality in both. His basic message is simple: desperate times call for desperate measures, ‘only by setting aside half the planet in reserve, or more, can we save the living part of the environment and achieve the stabilisation required for our own survival’. Asserting that ‘humanity’ behaves like a destructive juggernaut, Wilson is deeply concerned that the current ‘sixth extinction’ is destroying many species before scientists have even been able to identify them.

Turning half of the Earth into a series of nature parks is a grand utopian vision for conservation, perhaps even a hyperbolic one, yet Wilson seems deadly serious about it. Some environmental thinkers have been arguing the exact opposite, namely that conservation should give up its infatuation with parks and focus on ‘mixing’ people and nature in mutually conducive ways. Wilson defends a traditional view that nature needs more protection, and attacks them for being ‘unconcerned with what the consequences will be if their beliefs are played out’. As social scientists who study the impact of international conservation on peoples around the world, we would argue that it is Wilson himself who has fallen into this trap: the world he imagines in Half-Earth would be a profoundly inhumane one if ever his beliefs were ‘played out’.

The ‘nature needs half’ idea is not entirely new – it is an extreme version of a more widespread ‘land sparing’ conservation strategy. This is not about setting aside half the Earth as a whole but expanding the world’s current network of protected areas to create a patchwork grid encompassing at least half the world’s surface (and the ocean) and hence ‘about 85 per cent’ of remaining biodiversity. The plan is staggering in scale: protected areas, according to the International Union for the Conservation of Nature, currently incorporate around 10-15 per cent of the Earth’s terrain, so would need to more than triple in extent.

Wilson identifies a number of causes of the current ecological crisis, but is particularly concerned by overpopulation. ‘Our population,’ he argues, ‘is too large for safety and comfort… Earth’s more than 7 billion people are collectively ravenous consumers of all the planet’s inadequate bounty.’ But can we talk about the whole of humanity in such generalised terms? In reality, the world is riven by dramatic inequality, and different segments of humanity have vastly different impacts on the world’s environments. The blame for our ecological problems therefore cannot be spread across some notion of a generalised ‘humanity’.

Although Wilson is careful to qualify that it is the combination of population growth and ‘per-capita consumption’ that causes environmental degradation, he is particularly concerned about places he identifies as the remaining high-fertility problem spots – ‘Patagonia, the Middle East, Pakistan, and Afghanistan, plus all of sub-Saharan Africa exclusive of South Africa’. These are countries with some of the world’s lowest incomes. Paradoxically, then, it is those consuming the least that are considered the greatest problem. ‘Overpopulation’, it seems, is the same racialised bogeyman as ever, and the poor the greatest threat to an environmentally-sound future.

Wilson’s Half-Earth vision is offered as an explicit counterpoint to so-called ‘new’ or ‘Anthropocene’ conservationists, who are loosely organised around the controversial Breakthrough Institute. For Wilson, these ‘Anthropocene ideologists’ have given up on nature altogether. In her book, Rambunctious Garden (2011), Emma Marris characteristically argues that there is no wilderness left on the Earth, which is everywhere completely transformed by the human presence. According to Anthropocene thinking, we are in charge of the Earth and must manage it closely whether we like it or not. Wilson disagrees, insisting that ‘areas of wilderness… are real entities’. He contends that an area need not be ‘pristine’ or uninhabited to be wilderness, and ‘[w]ildernesses have often contained sparse populations of people, especially those indigenous for centuries or millennia, without losing their essential character’.

Research across the globe has shown that many protected areas once contained not merely ‘sparse’ inhabitants but often quite dense populations – clearly incompatible with the US Wilderness Act’s classic definition of wilderness as an area ‘where man himself is a visitor who does not remain’. Most existing ‘wilderness’ parks have required the removal or severe restriction of human beings within their bounds. Indeed, one of Wilson’s models for conservation success – Gorongosa National Park in Mozambique – sidelined local people despite their unified opposition. In his book Conservation Refugees (2009), Mark Dowie estimates that 20-50 million people have been displaced by previous waves of protected-area creation. To extend protected areas to half of the Earth’s surface would require a relocation of human populations on a scale that could dwarf all previous conservation refugee crises.

Would these people include Montana cattle ranchers? Or Australian wheat growers? Or Florida retirees? The answer, most likely, is no, for the burden of conservation has never been shared equitably across the world. Those who both take the blame and pay the greatest cost of environmental degradation are, almost always, those who do not have power to influence either their own governments or international politics. It is the hill tribes of Thailand, the pastoralists of Tanzania, and the forest peoples of Indonesia who are invariably expected to relocate, often at gunpoint, as Dowie and many scholars, including Dan Brockington in his book Fortress Conservation (2002), have demonstrated.

How will human society withstand the shock of removing so much land and ocean from food-growing and other uses? Wilson criticises the Anthropocene worldview’s faith that technological innovation can solve environmental problems or find substitutes for depleted resources, but he simultaneously promotes his own techno-fix in a vision of ‘intensified economic evolution’ in which ‘the free market, and the way it is increasingly shaped by high technology’ will solve the problem seemingly automatically. According to Wilson, ‘products that win competition today… are those that cost less to manufacture and advertise, need less frequent repair and replacement, and give highest performance with a minimum amount of energy’. He thus invokes a biological version of Adam Smith’s invisible hand in maintaining that ‘[j]ust as natural selection drives organic evolution by competition among genes to produce more copies of themselves per unit cost in the next generation, raising benefit-to-cost of production drives the evolution of the economy’ and asserting, without any evidence, that ‘[a]lmost all of the competition in a free market, other than in military technology, raises the average quality of life’.

Remarkably, this utopian optimism about technology and the workings of the free market leads Wilson to converge on a position rather like that of the Anthropocene conservationists he so dislikes, advocating a vision of ‘decoupling economic activity from material and environmental throughputs’ in order to create sustainable livelihoods for a population herded into urban areas to free space for self-willed nature. The Breakthrough Institute has recently promoted its own, quite similar, manifesto for land sparing and decoupling to increase terrain for conservation.

In this vision, science and technology can compensate for some of humanity’s status as the world’s ‘most destructive species’. And at the pinnacle of science stands (conservation) biology, according to Wilson. He argues: ‘If people are to live long and healthy lives in the sustainable Eden of our dreams, and our minds are to break free and dwell in the far more interesting universe of reason triumphant over superstition, it will be through advances in biology.’ How exactly humans are to ‘break free’ is not explained and is, in fact, impossible according to Wilson himself, given ‘the Darwinian propensity in our brain’s machinery to favour short-term decisions over long-range planning’. As far as Wilson is concerned, any worldview that does not favour protected-area expansion as the highest goal is by definition an irrational one. In this way, the world’s poor are blamed not only for overpopulating biodiversity hotspots but also for succumbing to the ‘religious belief and inept philosophical thought’ standing in the way of environmental Enlightenment.

Let us finish by making a broader point, drawing on Wilson’s approving quotation of Alexander von Humboldt, the 19th-century German naturalist who claimed that ‘the most dangerous worldview is the worldview of those who have not viewed the world’. In viewing the world, we also construct it, and the world Wilson offers us in Half-Earth is a truly bizarre one. For all his zeal, (misplaced) righteousness and passion, his vision is disturbing and dangerous, and would have profoundly negative ‘consequences if played out’. It would entail forcibly herding a drastically reduced human population into increasingly crowded urban areas to be managed in oppressively technocratic ways. How such a global programme of conservation Lebensraum would be accomplished is left to the reader’s imagination. We therefore hope readers will not take Wilson’s proposal seriously. Addressing biodiversity loss and other environmental problems must proceed by confronting the world’s obscene inequality, not by blaming the poor and trusting the ‘free market’ to save them.


Half-Earth (AEON)

29 February, 2016

Half of the Earth’s surface and seas must be dedicated to the conservation of nature, or humanity will have no future

by Edward O Wilson

The Serengeti National Park. Photo by Medford Taylor/National Geographic

Edward O Wilson is a professor emeritus in entomology at Harvard. Half-Earth concludes Wilson’s trilogy begun by The Social Conquest of Earth and The Meaning of Human Existence, a National Book Award finalist. 

Edited by Pam Weintraub

Unstanched haemorrhaging has only one end in all biological systems: death for an organism, extinction for a species. Researchers who study the trajectory of biodiversity loss are alarmed that, within the century, an exponentially rising extinction rate might easily wipe out most of the species still surviving at the present time.

The crucial factor in the life and death of species is the amount of suitable habitat left to them. When, for example, 90 per cent of the area is removed, the number that can persist sustainably will descend to about a half. Such is the actual condition of many of the most species-rich localities around the world, including Madagascar, the Mediterranean perimeter, parts of continental southwestern Asia, Polynesia, and many of the islands of the Philippines and the West Indies. If 10 per cent of the remaining natural habitat were then also removed – a team of lumbermen might do it in a month – most or all of the surviving resident species would disappear.
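
Wilson does not spell out the arithmetic here, but his rule of thumb is usually read through the species-area relationship of island biogeography; the reconstruction below is an illustrative assumption, not a formula given in the essay.

\[ S = cA^{z}, \qquad z \approx 0.25 \]
\[ \frac{S_{\text{remaining}}}{S_{\text{original}}} = \left(\frac{A_{\text{remaining}}}{A_{\text{original}}}\right)^{z} = 0.1^{0.25} \approx 0.56 \]

With typical exponents z between about 0.15 and 0.35, losing 90 per cent of an area leaves roughly 45-70 per cent of its species able to persist, which is where the “about a half” figure comes from.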

Today, every sovereign nation in the world has a protected-area system of some kind. All together the reserves number about 161,000 on land and 6,500 over marine waters. According to the World Database on Protected Areas, a joint project of the United Nations Environmental Program and the International Union for Conservation of Nature, they occupied by 2015 a little less than 15 per cent of Earth’s land area and 2.8 per cent of Earth’s ocean area. The coverage is increasing gradually. This trend is encouraging. To have reached the existing level is a tribute to those who have led and participated in the global conservation effort.

But is the level enough to halt the acceleration of species extinction? Unfortunately, it is in fact nowhere close to enough. The declining world of biodiversity cannot be saved by the piecemeal operations in current use alone. The extinction rate our behaviour is now imposing on the rest of life, and seems destined to continue, is more correctly viewed as the equivalent of a Chicxulub-sized asteroid strike played out over several human generations.

The only hope for the species still living is a human effort commensurate with the magnitude of the problem. The ongoing mass extinction of species, and with it the extinction of genes and ecosystems, ranks with pandemics, world war, and climate change as among the deadliest threats that humanity has imposed on itself. To those who feel content to let the Anthropocene evolve toward whatever destiny it mindlessly drifts, I say please take time to reconsider. To those who are steering the growth of reserves worldwide, let me make an earnest request: don’t stop, just aim a lot higher.

I see just one way to make this 11th-hour save: committing half of the planet’s surface to nature to save the immensity of life-forms that compose it. Why one-half? Why not one-quarter or one-third? Because large plots, whether they already stand or can be created from corridors connecting smaller plots, harbour many more ecosystems and the species composing them at a sustainable level. As reserves grow in size, the diversity of life surviving within them also grows. As reserves are reduced in area, the diversity within them declines to a mathematically predictable degree swiftly – often immediately and, for a large fraction, forever. A biogeographic scan of Earth’s principal habitats shows that a full representation of its ecosystems and the vast majority of its species can be saved within half the planet’s surface. At one-half and above, life on Earth enters the safe zone. Within half, existing calculations from existing ecosystems indicate that more than 80 per cent of the species would be stabilised.

There is a second, psychological argument for protecting half of Earth. The current conservation movement has not been able to go the distance because it is a process. It targets the most endangered habitats and species and works forward from there. Knowing that the conservation window is closing fast, it strives to add increasing amounts of protected space, faster and faster, saving as much as time and opportunity will allow.

The key is the ecological footprint, defined as the amount of space required to meet the needs of an average person

Half-Earth is different. It is a goal. People understand and prefer goals. They need a victory, not just news that progress is being made. It is human nature to yearn for finality, something achieved by which their anxieties and fears are put to rest.

The Half-Earth solution does not mean dividing the planet into hemispheric halves or any other large pieces the size of continents or nation-states. Nor does it require changing ownership of any of the pieces, but instead only the stipulation that they be allowed to exist unharmed. It does, on the other hand, mean setting aside the largest reserves possible for nature, hence for the millions of other species still alive.

The key to saving one-half of the planet is the ecological footprint, defined as the amount of space required to meet all of the needs of an average person. It comprises the land used for habitation, fresh water, food production and delivery, personal transportation, communication, governance, other public functions, medical support, burial, and entertainment. In the same way the ecological footprint is scattered in pieces around the world, so are Earth’s surviving wildlands on the land and in the sea. The pieces range in size from the major desert and forest wildernesses to pockets of restored habitats as small as a few hectares.

But, you may ask, doesn’t a rising population and per-capita consumption doom the Half-Earth prospect? In this aspect of its biology, humanity appears to have won a throw of the demographic dice. Its population growth has begun to decelerate autonomously, without pressure one way or the other from law or custom. In every country where women have gained some degree of social and financial independence, their average fertility has dropped by a corresponding amount through individual personal choice.

There won’t be an immediate drop in the total world population. An overshoot still exists due to the longevity of the more numerous offspring of earlier, more fertile generations. There also remain high-fertility countries, with an average of more than three surviving children born to each woman, thus higher than the 2.1 children per woman that yields zero population growth. Even as it decelerates toward zero growth, population will reach between 9.6 billion and 12.3 billion, up from the 7.2 billion existing in 2014. That is a heavy burden for an already overpopulated planet to bear, but unless women worldwide switch back from the negative population trend of fewer than 2.1 children per woman, a turn downward in the early 22nd century is inevitable.

And what of per-capita consumption? The footprint will evolve, not to claim more and more space, as you might at first suppose, but less. The reason lies in the evolution of the free market system, and the way it is increasingly shaped by high technology. The products that win are those that cost less to manufacture and advertise, need less frequent repair and replacement, and give highest performance with a minimum amount of energy. Just as natural selection drives organic evolution by competition among genes to produce more copies of themselves per unit cost in the next generation, raising benefit-to-cost of production drives the evolution of the economy. Teleconferencing, online purchase and trade, ebook personal libraries, access on the Internet to all literature and scientific data, online diagnosis and medical practice, food production per hectare sharply raised by indoor vertical gardens with LED lighting, genetically engineered crops and microorganisms, long-distance business conferences and social visits by life-sized images, and not least the best available education in the world free online to anyone, anytime, and anywhere. All of these amenities will yield more and better results with less per-capita material and energy, and thereby will reduce the size of the ecological footprint.

In viewing the future this way, I wish to suggest a means to achieve almost free enjoyment of the world’s best places in the biosphere that I and my fellow naturalists have identified. The cost-benefit ratio would be extremely small. It requires only a thousand or so high-resolution cameras that broadcast live around the clock from sites within reserves. People would still visit any reserve in the world physically, but they could also travel there virtually and in continuing real time with no more than a few keystrokes in their homes, schools, and lecture halls. Perhaps a Serengeti water hole at dawn? Or a teeming Amazon canopy? There would also be available streaming video of summer daytime on the coast in the shallow offshore waters of Antarctica, and cameras that continuously travel through the great coral triangle of Indonesia and New Guinea. With species identifications and brief expert commentaries unobtrusively added, the adventure would be forever changing, and safe.

The spearhead of this intensive economic evolution, with its hope for biodiversity, is contained in the linkage of biology, nanotechnology, and robotics. Two ongoing enterprises within it, the creation of artificial life and artificial minds, seem destined to preoccupy a large part of science and high technology for the rest of the present century.

The creation of artificial life forms is already a reality. On 20 May 2010, a team of researchers at the J Craig Venter Institute in California announced the second genesis of life, this time by human rather than divine command. They had built live cells from the ground up. With simple chemical reagents off the shelf, they assembled the entire genetic code of a bacterial species, Mycoplasma mycoides, a double helix of 1.08 million DNA base pairs. During the process they modified the code sequence slightly, implanting a statement made by the late theoretical physicist Richard Feynman, ‘What I cannot create, I do not understand,’ in order to detect daughters of the altered mother cells in future tests.

If our minds are to break free and dwell in the far more interesting universe of reason triumphant over superstition, it will be through advances in biology

The textbook example of elementary artificial selection of the past 10 millennia is the transformation of teosinte, a species of wild grass with three races in Mexico and Central America, into maize (corn). The food found in the ancestor was a meagre packet of hard kernels. Over centuries of selective breeding it was altered into its modern form. Today maize, after further selection and widespread hybridisation of inbred strains that display ‘hybrid vigour’, is the principal food of hundreds of millions.

The first decade of the present century thus saw the beginning of the next new major phase of genetic modification beyond hybridisation: artificial selection and even direct substitution in single organisms of one gene for another. If we use the trajectory of progress in molecular biology during the previous half century as a historical guide, it appears inevitable that scientists will begin routinely to build cells of wide variety from the ground up, then induce them to multiply into synthetic tissues, organs, and eventually entire independent organisms of considerable complexity.

If people are to live long and healthy lives in the sustainable Eden of our dreams, and our minds are to break free and dwell in the far more interesting universe of reason triumphant over superstition, it will be through advances in biology. The goal is practicable because scientists, being scientists, live with one uncompromising mandate: press discovery to the limit. There has already emerged a term for the manufacture of organisms and parts of organisms: synthetic biology. Its potential benefits, easily visualised as spreading through medicine and agriculture, are limited only by imagination. Synthetic biology will also bring onto centre stage the microbe-based increase of food and energy.

Each passing year sees advances in artificial intelligence and their multitudinous applications – advances that would have been thought distantly futuristic a decade earlier. Robots roll over the surface of Mars. They travel around boulders and up and down slopes while photographing, measuring minutiae of topography, analysing the chemical composition of soil and rocks, and scrutinising everything for signs of life.

In the early period of the digital revolution, innovators relied on machine design of computers without reference to the human brain, much as the earliest aeronautical engineers used mechanical principles and intuition to design aircraft instead of imitating the flight of birds. But with the swift growth of both fields, one-on-one comparisons are multiplying. The alliance of computer technology and brain science has given birth to whole brain emulation as one of the ultimate goals of science.

From the time of the ancient human-destined line of amphibians, then reptiles, then mammals, the neural pathways of every part of the brain were repeatedly altered by natural selection to adapt the organism to the environment in which it lived. Step-by-step, from the Paleozoic amphibians to the Cenozoic primates, the ancient centres were augmented by newer centres, chiefly in the growing cortex, that added to learning ability. All things being equal, the ability of organisms to function through seasons and across different habitats gave them an edge in the constant struggle to survive and reproduce.

Little wonder, then, that neurobiologists have found the human brain to be densely sprinkled with partially independent centres of unconscious operations, along with all of the operators of rational thought. Located through the cortex in what might look at first like random arrays are the headquarters of processes, variously, for numbers, attention, face-recognition, meanings, reading, sounds, fears, values, and error detection. Decisions tend to be made by the brute force of unconscious choice in these centres prior to conscious comprehension.

Next in evolution came consciousness, a function of the human brain that, among other things, reduces an immense stream of sense data to a small set of carefully selected bite-size symbols. The sampled information can then be routed to another processing stage, allowing us to perform what are fully controlled chains of operations, much like a serial computer. This broadcasting function of consciousness is essential. In humans, it is greatly enhanced by language, which lets us distribute our conscious thoughts across the social network.

What has brain science to do with biodiversity? At first, human nature evolved along a zigzag path as a continually changing ensemble of genetic traits while the biosphere continued to evolve on its own. But the explosive growth of digital technology transformed every aspect of our lives and changed our self-perception, bringing the ‘bnr’ industries (biology, nanotechnology, robotics) to the forefront of the modern economy. These three have the potential either to favour biodiversity or to destroy it.

I believe they will favour it, by moving the economy away from fossil fuels to energy sources that are clean and sustainable, by radically improving agriculture with new crop species and ways to grow them, and by reducing the need or even the desire for distant travel. All are primary goals of the digital revolution. Through them the size of the ecological footprint will also be reduced. The average person can expect to enjoy a longer, healthier life of high quality yet with less energy extraction and raw demand put on the land and sea. If we are lucky (and smart), world population will peak at a little more than 10 billion people by the end of the century followed by the ecological footprint soon thereafter. The reason is that we are thinking organisms trying to understand how the world works. We will come awake.

Silicon Valley dreamers of a digitised humanity have failed to give much thought at all to the biosphere

That process is already under way, albeit still far too slowly – with the end in sight in the 23rd century. We and the rest of life with us are in the middle of a bottleneck of rising population, shrinking resources, and disappearing species. As its stewards we need to think of our species as being in a race to save the living environment. The primary goal is to make it through the bottleneck to a better, less perilous existence while carrying through as much of the rest of life as possible. If global biodiversity is given space and security, most of the large fraction of species now endangered will regain sustainability on their own. Furthermore, advances made in synthetic biology, artificial intelligence, whole brain emulation, and other similar, mathematically based disciplines can be imported to create an authentic, predictive science of ecology. In it, the interrelations of species will be explored as fervently as we now search through our own bodies for health and longevity.

It is often said that the human brain is the most complex system known to us in the universe. That is incorrect. The most complex is the individual natural ecosystem, and the collectivity of ecosystems comprising Earth’s species-level biodiversity. Each species of plant, animal, fungus, and microorganism is guided by sophisticated decision devices. Each is intricately programmed in its own way to pass with precision through its respective life cycle. It is instructed on when to grow, when to mate, when to disperse, and when to shy away from enemies. Even the single-celled Escherichia coli, living in the bacterial paradise of our intestines, moves toward food and away from toxins by spinning its tail cilium one way, then the other way, in response to chemosensory molecules within its microscopic body.

How minds and decision-making devices evolve, and how they interact with ecosystems, is a vast area of biology that remains mostly uncharted – and still even undreamed of by those scientists who devote their lives to it. The analytic techniques coming to bear on neuroscience, on Big Data theory, on simulations with robot avatars, and on other comparable enterprises will find applications in biodiversity studies. They are ecology’s sister disciplines.

It is past time to broaden the discussion of the human future and connect it to the rest of life. The Silicon Valley dreamers of a digitised humanity have not done that, not yet. They have failed to give much thought at all to the biosphere. With the human condition changing so swiftly, we are losing or degrading to uselessness ever more quickly the millions of species that have run the world independently of us and free of cost. If humanity continues its suicidal ways to change the global climate, eliminate ecosystems, and exhaust Earth’s natural resources, our species will very soon find itself forced into making a choice, this time engaging the conscious part of our brain. It is as follows: shall we be existential conservatives, keeping our genetically-based human nature while tapering off the activities inimical to ourselves and the rest of the biosphere? Or shall we use our new technology to accommodate the changes important solely to our own species, while letting the rest of life slip away? We have only a short time to decide.

The beautiful world our species inherited took the biosphere 3.8 billion years to build. The intricacy of its species we know only in part, and the way they work together to create a sustainable balance we have only recently begun to grasp. Like it or not, and prepared or not, we are the mind and stewards of the living world. Our own ultimate future depends upon that understanding. We have come a very long way through the barbaric period in which we still live, and now I believe we’ve learned enough to adopt a transcendent moral precept concerning the rest of life.

Reprinted from ‘Half-Earth: Our Planet’s Fight for Life’ by Edward O Wilson. Copyright © 2016 by Edward O Wilson. With permission of the publisher, Liveright Publishing Corporation. All rights reserved.

Excessive empathy can impair understanding of others (Science Daily)

Date:
April 28, 2016
Source:
Julius-Maximilians-Universität Würzburg, JMU
Summary:
People who empathize easily with others do not necessarily understand them well. To the contrary: Excessive empathy can even impair understanding as a new study conducted by psychologists has established.

Excessive empathy can impair understanding as a new study conducted by psychologists from Würzburg and Leipzig has established. Credit: © ibreakstock / Fotolia

People who empathize easily with others do not necessarily understand them well. To the contrary: Excessive empathy can even impair understanding as a new study conducted by psychologists from Würzburg and Leipzig has established.

Imagine your best friend tells you that his girlfriend has just proposed “staying friends.” Now you have to accomplish two things: Firstly, you have to grasp that this nice sounding proposition actually means that she wants to break up with him and secondly, you should feel with your friend and comfort him.

Whether empathy and understanding other people’s mental states (mentalising) — i.e. the ability to understand what others know, plan and want — are interrelated has recently been examined by the psychologists Anne Böckler, Philipp Kanske, Mathis Trautwein, Franca Parianen-Lesemann and Tania Singer.

Anne Böckler has been a junior professor at the University of Würzburg’s Institute of Psychology since October 2015. Previously, the post-doc had worked in the Department of Social Neurosciences at the Max Planck Institute of Human Cognitive and Brain Sciences in Leipzig where she conducted the study together with her co-workers. In the scientific journal Social Cognitive and Affective Neuroscience, the scientists present the results of their work.

“Successful social interaction is based on our ability to feel with others and to understand their thoughts and intentions,” Anne Böckler explains. She says that it had been unclear previously whether and to what extent these two skills were interrelated — that is, whether people who empathise easily with others are also capable of grasping their thoughts and intentions. According to the junior professor, the scientists also looked into the question of whether the neuronal networks responsible for these abilities interact.

Answers can be gleaned from the study conducted by Anne Böckler, Philipp Kanske and their colleagues at the Max Planck Institute in Leipzig within the scope of a large-scale study led by Tania Singer which included some 200 participants. The study enabled the scientists to prove that people who tend to be empathic do not necessarily understand other people well at a cognitive level. Hence, social skills seem to be based on multiple abilities that are rather independent of one another.

The study also delivered new insight as to how the different networks in the brain are orchestrated, revealing that networks crucial for empathy and cognitive perspective-taking interact with one another. In highly emotional moments — for example when somebody talks about the death of a close person — activation of the insula, which forms part of the empathy-relevant network, can have an inhibiting effect in some people on brain areas important for taking someone else’s perspective. And this in turn can cause excessive empathy to impair social understanding.

The participants in the study watched a number of video sequences in which the narrator was more or less emotional. Afterwards, they had to rate how they felt and how much compassion they felt for the person in the film. Then they had to answer questions about the video — for example, what the persons could have thought, known or intended. Having thus identified persons with a high level of empathy, the psychologists looked at how those persons were represented among the test participants who had done well or poorly in the test of cognitive perspective-taking — and vice versa.

Using functional magnetic resonance imaging, the scientists observed which areas of the brain were active at what time.

The authors believe that the results of this study are important both for neuroscience and clinical applications. For example, they suggest that training aimed at improving social skills, the willingness to empathise and the ability to understand others at the cognitive level and take their perspective should be promoted selectively and separately from one another. The group in the Department of Social Neurosciences in Leipzig is currently working on exactly this topic within the scope of the ReSource project, namely how to specifically train different social skills.


Journal Reference:

  1. Artyom Zinchenko, Philipp Kanske, Christian Obermeier, Erich Schröger, Sonja A. Kotz. Emotion and goal-directed behavior: ERP evidence on cognitive and emotional conflict. Social Cognitive and Affective Neuroscience, 2015; 10 (11): 1577 DOI: 10.1093/scan/nsv050

A single-celled organism capable of learning (Science Daily)

Date:
April 27, 2016
Source:
CNRS
Summary:
For the first time, scientists have demonstrated that an organism devoid of a nervous system is capable of learning. Biologists have succeeded in showing that a single-celled organism, the protist, is capable of a type of learning called habituation. This discovery throws light on the origins of learning ability during evolution, even before the appearance of a nervous system and brain. It may also raise questions as to the learning capacities of other extremely simple organisms such as viruses and bacteria.

The slime mold Physarum polycephalum (diameter: around 10 centimeters), made up of a single cell, was here cultivated in the laboratory on agar gel. Credit: Audrey Dussutour (CNRS)

For the first time, scientists have demonstrated that an organism devoid of a nervous system is capable of learning. A team from the Centre de Recherches sur la Cognition Animale (CNRS/Université Toulouse III — Paul Sabatier) has succeeded in showing that a single-celled organism, the protist Physarum polycephalum, is capable of a type of learning called habituation. This discovery throws light on the origins of learning ability during evolution, even before the appearance of a nervous system and brain. It may also raise questions as to the learning capacities of other extremely simple organisms such as viruses and bacteria. These findings are published in the Proceedings of the Royal Society B on 27 April 2016.

An ability to learn, and memory are key elements in the animal world. Learning from experiences and adapting behavior accordingly are vital for an animal living in a fluctuating and potentially dangerous environment. This faculty is generally considered to be the prerogative of organisms endowed with a brain and nervous system. However, single-celled organisms also need to adapt to change. Do they display an ability to learn? Bacteria certainly show adaptability, but it takes several generations to develop and is more a result of evolution. A team of biologists thus sought to find proof that a single-celled organism could learn. They chose to study the protist, or slime mold, Physarum polycephalum, a giant cell that inhabits shady, cool areas[1] and has proved to be endowed with some astonishing abilities, such as solving a maze, avoiding traps or optimizing its nutrition[2]. But until now very little was known about its ability to learn.

During a nine-day experiment, the scientists thus challenged different groups of this mold with bitter but harmless substances that they needed to pass through in order to reach a food source. Two groups were confronted either by a “bridge” impregnated with quinine, or with caffeine, while the control group only needed to cross a non-impregnated bridge. Initially reluctant to travel through the bitter substances, the molds gradually realized that they were harmless, and crossed them increasingly rapidly — behaving after six days in the same way as the control group. The cell thus learned not to fear a harmless substance after being confronted with it on several occasions, a phenomenon that the scientists refer to as habituation. After two days without contact with the bitter substance, the mold returned to its initial behavior of distrust. Furthermore, a protist habituated to caffeine displayed distrustful behavior towards quinine, and vice versa. Habituation was therefore clearly specific to a given substance.

Habituation is a form of rudimentary learning, which has been characterized in Aplysia (an invertebrate also called sea hare)[3]. This form of learning exists in all animals, but had never previously been observed in a non-neural organism. This discovery in a slime mold, a distant cousin of plants, fungi and animals that appeared on Earth some 500 million years before humans, improves existing understanding of the origins of learning, which markedly preceded those of nervous systems. It also offers an opportunity to study learning types in other very simple organisms, such as viruses or bacteria.

[1] This single cell, which contains thousands of nuclei, can cover an area of around a square meter and moves within its environment at speeds that can reach 5 cm per hour.

[2] See “Even single-celled organisms feed themselves in a ‘smart’ manner.” https://www.sciencedaily.com/releases/2010/02/100210164712.htm

[3] Mild tactile stimulation of the animal’s siphon normally causes the defensive reflex of withdrawing the branchiae. If the harmless tactile stimulation is repeated, this reflex diminishes and finally disappears, thus indicating habituation.


Journal Reference:

  1. Romain P. Boisseau, David Vogel, Audrey Dussutour. Habituation in non-neural organisms: evidence from slime moulds. Proceedings of the Royal Society B: Biological Sciences, 2016; 283 (1829): 20160446 DOI: 10.1098/rspb.2016.0446

Senate committee approves constitutional amendment that overturns environmental licensing for public works (Estadão)

André Borges, April 27, 2016

BRASÍLIA – Amid the political earthquake engulfing Brasília, the Senate’s Constitution, Justice and Citizenship Committee (CCJ) approved on Wednesday, without fanfare, a Proposed Constitutional Amendment (PEC) that simply tears up the environmental legislation currently applied to the licensing of public works.

PEC 65/2012, authored by Senator Acir Gurgacz (PDT-RO) and with Senator Blairo Maggi (PR-MT) as rapporteur, establishes that, once the developer has simply submitted an Environmental Impact Study (EIA), no public work may be suspended or cancelled. In practice, this means that the environmental licensing process, which assesses whether or not a project is viable in light of the socio-environmental impacts it may generate, ceases to exist.

In a document of just three pages, the lawmakers state that “the proposal innovates in the legal order” by not allowing “the suspension of a work or its cancellation after the submission of the prior environmental impact study (EIA), except on account of supervening facts”. The change, the lawmakers maintain, “is intended to guarantee speed and savings of resources in public works subject to environmental licensing, by making it impossible to suspend or cancel their execution after the licence has been granted”.

Environmental licensing, whether carried out by Ibama or by state agencies, requires every project to pass through three stages of technical assessment. To verify a project’s viability, impact studies must be carried out and a prior environmental licence requested. This document also establishes the compensatory measures the company will have to implement in order to carry out the project. After obtaining the prior licence, the developer must then obtain an installation licence, which allows construction to actually begin, a process that is also monitored and may result in new conditions being imposed. In the third stage, the operating licence is issued, authorising the use of the project, be it a road, a hydroelectric plant or an oil platform. What PEC 65 does, basically, is ignore these three stages.

“We are stunned by this proposal. If merely submitting an EIA becomes sufficient to push a project ahead, regardless of whether that document is analysed and approved beforehand, environmental legislation is done away with. It is a flagrant disrespect of the Constitution, which becomes a dead letter in everything concerning the environment,” Sandra Cureau, coordinator of the Federal Prosecution Service’s 4th Chamber for the environment and cultural heritage, told Estado.

The Federal Prosecution Service and its state counterparts, according to Sandra, will take a forceful stance against the proposal. “We have to show lawmakers the absurdity they are committing. Brazil is a signatory to several international pacts on environmental protection. The Constitution has to be harmonious, not contradictory in its provisions,” she said.

The PEC follows a special legislative procedure. It must be debated and voted on in each house of the National Congress, in two rounds. To pass, it needs three fifths of the votes (60%) of the members of the Senate and of the Chamber of Deputies. A constitutional amendment is promulgated by the presiding boards of the two houses and does not require presidential sanction.

In his analysis, Senator Blairo Maggi argued that the PEC “seeks to guarantee legal certainty for the execution of public works” subject to environmental licensing. “It is certain that there are cases in which works essential to national development and strategic for the country are interrupted because of court decisions of a precautionary or injunctive nature, often merely dilatory,” he declared.

According to Maggi, “it can clearly be seen that the proposal does not aim to remove the requirement of environmental licensing or of submitting one of its main impact-assessment instruments, the EIA. It therefore does not affect the right to an ecologically balanced environment, and it enshrines constitutional principles of public administration, such as efficiency and economy.”

Mathematical model helps plan the operation of water reservoirs (Fapesp)

Computational system developed by researchers at USP and Unicamp establishes rationing rules for the water supply during droughts

Researchers at the Polytechnic School of the University of São Paulo (Poli-USP) and the School of Civil Engineering, Architecture and Urban Design of the University of Campinas (FEC-Unicamp) have developed new mathematical and computational models aimed at optimizing the management and operation of complex water-supply and electricity systems, such as those found in Brazil.

The models, whose development began in the early 2000s, were refined through the Thematic Project “HidroRisco: Risk management technologies applied to water-supply and electric-power systems,” carried out with Fapesp support.

“The idea is that the mathematical and computational models we have developed can help the managers of water and electricity distribution and supply systems make decisions that have enormous social and economic impacts, such as declaring rationing,” Paulo Sérgio Franco Barbosa, a professor at FEC-Unicamp and the project's coordinator, told Agência Fapesp.

According to Barbosa, many of the technologies used in Brazil's water and energy sectors today to manage supply, demand and the risk of shortages during extreme climate events, such as severe drought, were developed in the 1970s, when Brazilian cities were smaller and the country did not have a water and hydropower system as complex as today's.

For these reasons, he says, these management systems have shortcomings, such as failing to take into account the connections between different basins and failing to consider climate events more extreme than those already on record when planning the operation of a reservoir and water-distribution system.

“There was a failure in sizing the supply capacity of the Cantareira reservoir, for example, because no one imagined a drought worse than the one that hit the basin in 1953, considered the driest year in the reservoir's history before 2014,” said Barbosa.

To improve the risk-management systems in use today, the researchers developed new mathematical and computational models that simulate the operation of a water-supply or energy system in an integrated way and under different scenarios of increasing water supply and demand.

“Using statistical and computational techniques, the models we developed are able to run better simulations and give a water-supply or electricity system greater protection against climate risks,” said Barbosa.

Sisagua

One of the models developed by the researchers, in collaboration with colleagues at the University of California, Los Angeles, in the United States, is Sisagua, a modelling platform for the optimization and simulation of water-supply systems.

The computational platform integrates and represents all the supply sources of a reservoir and water-distribution system for large cities such as São Paulo, including reservoirs, canals, pipelines, and treatment and pumping stations.

“Sisagua makes it possible to plan operations, study supply capacity and evaluate alternatives for expanding or reducing the delivery of a water-supply system in an integrated way,” Barbosa noted.

One of the model's distinguishing features, according to the researcher, is that it establishes rationing rules for a large reservoir and water-distribution system during droughts, such as the one São Paulo went through in 2014, so as to minimize the damage to the population and the economy caused by any rationing.

When one of the system's reservoirs falls below normal levels and approaches its minimum operating volume, the computational model indicates a first stage of rationing, reducing the supply of stored water by 10%, for example.

If the reservoir's supply crisis drags on, the mathematical model indicates alternatives to minimize the severity of the rationing, distributing the water cuts more evenly over the period of scarcity and among the system's other reservoirs.

“Sisagua has a computational intelligence that indicates where and when to cut the water delivery of a supply system, so as to minimize the damage to the system and to a city's population and economy,” said Barbosa.
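
The staged rule described in the last few paragraphs can be sketched in a few lines of code. This is only an illustration of the idea: the thresholds, the number of stages and the cut sizes below are hypothetical placeholders, not values taken from Sisagua, which (as the article describes) derives its rules from an integrated optimization of the whole supply network.

```python
# Illustrative sketch (not the Sisagua model itself): a staged rationing rule
# that maps the current storage of a reservoir to a supply cut, in the spirit
# of the stages described above. Thresholds and cut fractions are hypothetical
# placeholders chosen for the example.

def rationing_cut(storage, capacity, min_operational,
                  stage_thresholds=(0.40, 0.30, 0.20),
                  stage_cuts=(0.10, 0.20, 0.30)):
    """Return the fraction by which supply should be cut (0.0 = no rationing).

    storage          -- current stored volume (same unit as capacity)
    capacity         -- maximum useful volume of the reservoir
    min_operational  -- volume below which the reservoir cannot operate
    stage_thresholds -- storage fractions that trigger each rationing stage
    stage_cuts       -- supply cut applied at each stage (0.10 means 10%)
    """
    usable_fraction = (storage - min_operational) / (capacity - min_operational)
    cut = 0.0
    for threshold, stage_cut in zip(stage_thresholds, stage_cuts):
        if usable_fraction <= threshold:
            cut = stage_cut  # deeper stages override milder ones
    return cut

# Example: a reservoir at 28% of its usable volume triggers the second stage (20% cut).
print(rationing_cut(storage=280.0, capacity=1000.0, min_operational=0.0))
```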

The Cantareira system

The researchers applied Sisagua to simulate the operation and management of the water-distribution system of the São Paulo metropolitan region, which supplies about 18 million people and is considered one of the largest in the world, with an average flow of 67 cubic metres per second (m³/s).

São Paulo's water-distribution system comprises eight supply subsystems, the largest of which is the Cantareira, which provides water to 5.3 million people with an average flow of 33 m³/s.

To assess the Cantareira's supply capacity in a scenario of water scarcity combined with growing demand for the resource, the researchers used Sisagua to run a planning simulation of the subsystem's use over a ten-year period.

For this, they used inflow data (water entering the system) for the Cantareira from 1950 to 1960, provided by the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp).

“This period was chosen as the basis for Sisagua's projections because it recorded severe droughts, when inflows stayed significantly below average for four consecutive years, between 1952 and 1956,” Barbosa explained.

Using the inflow data from this historical series, the mathematical and computational model analysed scenarios with Cantareira water demand varying between 30 and 40 m³/s.

Among the model's findings: the Cantareira can meet a demand of up to 34 m³/s under a water-scarcity scenario like that of 1950-1960 with a negligible risk of shortage. Above that value, scarcity, and consequently the risk of water rationing in the reservoir, increases exponentially.

For the Cantareira to meet a demand of 38 m³/s during a period of scarcity, the model indicated that rationing would have to begin 40 months (3 years and 4 months) before the basin's level reached the critical point, below normal volume and close to the minimum operating limit.

In this way it would be possible to meet between 85% and 90% of the reservoir's water demand during the drought until it recovered its ideal volume, avoiding a more severe rationing than would occur if full supply were maintained.

“The earlier rationing begins in a water-supply system, the better the loss is spread over time,” said Barbosa. “The population can prepare itself better for a 15% water cut over a two-year period, for example, than for a 40% cut over just two months,” he compared.
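
A minimal sketch of this kind of planning simulation is shown below: a monthly mass balance for a single reservoir in which rationing is triggered once storage drops below a chosen fraction of capacity, so that a moderate cut applied early replaces a severe cut applied late. All numbers (inflow series, volumes, trigger and cut) are invented for illustration; they are not the Sabesp series or the Sisagua algorithm.

```python
# Toy monthly mass balance for one reservoir, with supply cut to a fraction of
# demand whenever storage falls below a trigger level. Inflows, volumes and
# the trigger rule are invented placeholders.

def simulate_reservoir(inflows, capacity, initial_storage, demand,
                       trigger_fraction=0.5, rationed_fraction=0.85):
    """Return the volume actually supplied in each month."""
    storage = initial_storage
    supplied = []
    for inflow in inflows:
        if storage >= trigger_fraction * capacity:
            target = demand
        else:
            target = rationed_fraction * demand   # early, moderate rationing
        release = min(target, storage + inflow)   # cannot supply more than is available
        storage = min(storage + inflow - release, capacity)
        supplied.append(release)
    return supplied

# Toy dry spell: average inflow below a demand of 38 units per month.
dry_inflows = [30, 25, 20, 22, 28, 35, 40, 30, 25, 20, 22, 30]
served = simulate_reservoir(dry_inflows, capacity=1000, initial_storage=600, demand=38)
print("months with rationing:", sum(1 for s in served if s < 38))
```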

Integrated systems

In another study, the researchers used Sisagua to assess the capacity of the Cantareira, Guarapiranga, Alto Tietê and Alto Cotia subsystems to meet current water demand in a scarcity scenario.

For this, they also used inflow data for the four subsystems from 1950 to 1960.

The results of the analyses produced by the mathematical and computational method indicated that the Cotia subsystem reached a critical rationing threshold several times during the simulated ten-year period.

By contrast, the Alto Tietê subsystem frequently held water volumes above its target.

Based on these findings, the researchers suggest new interconnections for transfers among these four supply subsystems.

Part of the Cotia subsystem's demand could be supplied by the Guarapiranga and Cantareira subsystems. In turn, these two subsystems could also receive water from the Alto Tietê subsystem, Sisagua's projections indicated.

“Transferring water between the subsystems would provide greater flexibility and result in better distribution, efficiency and reliability for the water-supply system of the São Paulo metropolitan region,” Barbosa concluded.

According to the researcher, Sisagua's projections also pointed to the need for investment in new sources of water supply for the São Paulo metropolitan region.

He notes that the main basins supplying São Paulo suffer from problems such as urban concentration.

Around the Alto Tietê basin, for example, which occupies only 2.7% of São Paulo state's territory, nearly 50% of the state's population is concentrated, five times the population density of countries such as Japan, Korea and the Netherlands.

The Piracicaba, Paraíba do Sul, Sorocaba and Baixada Santista basins, which account for 20% of São Paulo's area, hold 73% of the state's population, with a population density higher than that of countries such as Japan, the Netherlands and the United Kingdom, the researchers note.

“It will be unavoidable to consider other sources of water supply for the São Paulo metropolitan region, such as the Juquiá system, in the interior of the state, which has water of excellent quality and in large volumes,” said Barbosa.

“Because of the distance, this project will be expensive, and it has been postponed. But it can no longer be put off,” he said.

Besides São Paulo, Sisagua has also been used to model the water-supply systems of Los Angeles, in the United States, and of Taiwan.

The article “Planning and operation of large-scale water distribution systems with preemptive priorities” (doi: 10.1061/(ASCE)0733-9496(2008)134:3(247)), by Barros and others, is available to subscribers of the Journal of Water Resources Planning and Management at ascelibrary.org/doi/abs/10.1061/%28ASCE%290733-9496%282008%29134%3A3%28247%29.

Agência Fapesp

Aciesp board speaks out against Governor Geraldo Alckmin's statements on the funding of scientific research in the state

JC 5405, April 28, 2016

According to the São Paulo State Academy of Sciences, the article published in Veja magazine “shows the partial and distorted view the Governor appears to hold of the close relationship between basic and applied science, and even of science in São Paulo, which is a source of pride for the state and for the country”

Read the full text below:

The São Paulo State Academy of Sciences (Aciesp) views with great concern the note published on April 26, 2016 by columnist Vera Magalhães of Veja magazine about Governor Alckmin's criticism of the São Paulo Research Foundation (Fapesp). Although a second note, published on the 26th, denied the use of the term “mafia of researchers” and denied that there had been an argument about the matter during the cabinet meeting, the rest of the article shows the partial and distorted view the Governor appears to hold of the close relationship between basic and applied science, and even of science in São Paulo, which is a source of pride for the state and for the country, considering that the impact of Brazilian academic output on the world stage is due in large part to what is produced in São Paulo, thanks to Fapesp's financial support.

Fapesp has been regarded as a national and international model for the funding of science, technology and innovation, and has been praised in many quarters, serving as an example for all Brazilian states, which copied the model and have thereby been directing more state funding toward local science in each state.

Fapesp's effort to foster interaction between the academic and productive sectors, public and private, has been enormous. In particular, there are specific programs dealing with interaction with the productive sector (PIPE, PITE and PAPPE) aimed at directly funding initiatives with industry and/or at creating new companies in São Paulo. The effort put into these programs is comparable in quality to that of countries such as the United States and Germany, and there is no comparable initiative in Latin America.

Besides these more specific programs, Fapesp's four large programs (Bioen, Biota, Climate Change and eScience) channel millions of reais into solving real, practical problems that matter not only to São Paulo but to all of Brazil and to the world. The focus on renewable energy, notably ethanol, brought together under Bioen, has advanced scientific knowledge about sugarcane and ethanol in an unprecedented way. After only a few years of support, Brazilian renewable-energy science is ready to be applied and to change the paradigm of second-generation ethanol. Even with the great crisis that hit the sugar and ethanol sector, Fapesp never stopped funding research in the area, supporting projects and keeping its focus. This is the kind of attitude Brazil needs: consistency in its convictions, building an identity based on what we do best. In the case of Biota, with more than 20 years of existence, the advances in knowledge of São Paulo's and Brazil's biodiversity, with unquestionable international repercussions, help our society understand and preserve the environment. Beyond preservation there is also the sustainable use of biodiversity: the potential of discovered compounds that may become new drugs, cosmetics and food additives, for example, is enormous. The Climate Change program, Biota's younger sibling, addresses what has been regarded as the most important problem humanity has ever faced: global climate change. The program has been generating not only climate models, which are the basis for deciding what to do to avoid the extremely serious effects that climate change will produce, but also assessments of its impacts on food production, industrial production in general and the population's health, among others. Fapesp's computing program (eScience), the newest of the four, was set up to prepare São Paulo society for the era of big data, in which we must learn to handle the immense production of information arising from advances in computing.

The apparent distortion in the Governor's view of Fapesp is even greater when he dismisses the funding of sociology. This is one of the main research strengths of the state of São Paulo, with the capital home to the largest group of researchers in this field in Brazil. These are the researchers who think about how to improve public policy, about what is happening and why poor populations exist, and who dedicate themselves to finding ways to solve these problems. If we abandon research in the social sciences, what will become of our population?

In the health sciences, Fapesp has indeed been investing in dengue research for many years. But it is important to remember that research alone cannot solve every problem. Fapesp's mission is not to fund factories that produce vaccines, for example. Those facilities must be maintained by the government. If the Butantan Institute has no money to produce vaccines, the fault lies not with Fapesp but with government planning, which failed to keep São Paulo's state research institutes running properly. Fapesp did fulfil its mission of funding the research into how to make the vaccines.

Scientific information needs to be taken up by politicians in as complete and faithful a form as possible. That is what reduces the probability of error in decision-making. In the case of the water crisis, for example, however much scientists (in hydrology and agriculture as well as in sociology) tried to warn the government of the danger ever since the first crisis in 2009, there was no science-based response in time; the response rested instead on beliefs and personal theories without scientific grounding, which led São Paulo to the critical situation in which it still finds itself.

More important still is the Governor's lack of vision about what basic science means: the science that apparently, and only apparently, has no applications yet. It must be understood that basic science is the applied science of the future, and the time separating the two has been shrinking over the years. Without understanding the fundamentals of natural phenomena, blind applications without a scientific basis lead to weak, uncompetitive technologies. Moreover, the very classification of science into basic and applied has been increasingly questioned.

Fapesp has been working tirelessly to shorten the path between basic discovery and application, especially over the last three decades. Applied and technological research only emerges after a researcher has worked, on average, ten years on a problem that usually has no apparent application. Only then do the possibilities for application arise. Fapesp has always been sensitive to this, sustaining basic research (our goose that lays the golden eggs) while creating ever more focused programs that seek to solve the most important problems of contemporary society.

Science is a slow process, and society must understand that it cannot be accelerated beyond what we are already doing, even with the excellent investments Fapesp has been maintaining in São Paulo. This is because São Paulo's scientific community was built not only with research funding but also with scholarships for undergraduate, graduate and postdoctoral training, which produce professionals at the highest level. All of this takes time. In São Paulo's case, it took decades to reach the level we are at today.

To think that an allocation of 1% is too much for research is a very dangerous view for a state that calls itself the locomotive of the country. What good is a locomotive without fuel?

Aciesp calls on the population to defend Fapesp not as the property of researchers, but as a heritage belonging to all the people of São Paulo and of Brazil. Without Fapesp, Brazil will sink into darkness and into dependence on science and technology produced in other countries. Is that what our society wants?

Board of Directors of the São Paulo State Academy of Sciences (Aciesp)

Jornal da Ciência

Read also:

Anpocs – Note from the Anpocs executive board on Governor Geraldo Alckmin's statements about the funding of scientific research

Anthropologies #21: Weather changes people: stretching to encompass material sky dynamics in our ethnography (Savage Minds)

See original text here.

September 24, 2015.

This entry is part 10 of 10 in the Anthropologies #21 series.

Heid Jerstad brings our climate change issue to a close with this thoughtful essay. Jerstad (BA Oxford, MRes SOAS) is writing up her PhD on the effects of weather on people's lives at the University of Edinburgh. Having done fieldwork in the western Indian Himalayas, she is particularly interested in the range of social and livelihood implications that weather (and thus climate change) has. She is on Twitter @entanglednotion –R.A.

For most people, the climate change issue is a bundle of scientific ideas, or maybe a chunk of guilt lurking behind that short haul flight. The words have fused together to form a single stone, immobile and heavy. Change is a bit of a nothing word anyway – anything can change, and who is to say if it is good or bad, drastic or practically unnoticeable?

But what about climate? It is a big science-y word, neither human nor particularly tangible. Climate is about a place – engrained, palimpsested, with time-depth. That big sky, those habits – the Frenchman advising wine and bed on a rainy day, the Croatian judge lenient because there was a hot wind from the Sahara that day. This is weather I am talking about, seasons, years, the heat, damp and sparkling frost.

People care about the weather. We consider ourselves used to this or good at observing that. My home has more weather than other places – it is colder in winter, the air is clearer and brighter – because it is mine. My sunsets – this is eastern Norway – are vibrant and fill the sky, my sky will snow in June with not a cloud, my nose can feel that special tingle when it gets to below -20˚c. The north is not gloomy in winter – the snow is bright white, the hydro-fuelled streetlights illuminate empty streets and windows seal the warmth in.

What is your weather? It would be safe to assume it is part of the climate and I would go out on a limb and say I think you care about it. Am I wrong?

When the weather matters to people, the task becomes one of bridging this caring and the climate change science and projections. Looking at the impact of these weather changes in different areas of life is, then, going to make up a steadily larger part of useful climate change research.

Mead famously convened a conference with Kellogg titled ‘The Atmosphere: Endangered and Endangering’ in 1975, and Douglas published Risk and Blame in 1992. In the new millennium Strauss and Orlove (2003), Crate and Nuttall (2009) and Hastrup and Rubow (2014) brought edited volumes to the debate. It seems to be fairly well established, then, that climate change is a matter for anthropologists, as phrased by the AAA statement on climate change: ‘Climate change is rooted in social institutions and cultural habits. … Climate change is not a natural problem, it is a human problem.’ What, then, can anthropologists do about this problem?

Anthropologists provide description. The mapping of people’s stories of how the weather is ‘going wrong’, stories of change, and of coping and consequences is underway (Crate 2008 described the effects of unusual winter melt on the Vilui Sakha in Siberia, Cruikshank 2005 explored the tendrils of meaning surrounding glaciers between Alaska, British Columbia and the Yukon territory). Linked to the description, of course, and not really disentanglable from it is the explanation. Explanations and understandings of weather and weather changes in the places where they are happening, whether Chesapeake Bay, the Marshall Islands, or Rajasthan, India, fill in the social significance of what had been an empty sky (Paolisso 2003, Rudiak-Gould 2013, Grodzins-Gold 1998). The weather changes, in fact, constitute one of those satisfying areas of inquiry which concern those asked as much as the anthropologist.

The question of knowledge, however, can still seem a barrier when climate scientists are those with a mandate to understand changing weather. Anna Tsing, in the Firth Lecture at the Association of Social Anthropologists of the UK and Commonwealth’s (ASA) 2015 conference in Exeter, brought the contextual ecological study of mushrooms and the trees with which they are mutual in the forests of Japan and China to illustrate the gains anthropology can make when we give up scepticism of natural science. Earlier in the year, Moore, at the launch of the Centre of the Anthropology of Sustainability (CAOS) at University College London, used microbial research to break down the bounded image of the body, where on the cellular level culture and biology shape each other – for instance when poor black women in the States eat fish which contains mercury and this affects the biological development of their children. Tsing and Moore brought together what might previously have been considered within the remit of ecology or biology to make important points about the capacity of anthropology, and to suggest where we might go next, expanding the vision of social science. When mushrooms and microbes are appropriate topics for anthropological research, then looking at the climate and its material as well as social effects (rotting, drying, illness (Jerstad 2014)) starts to look feasible.

The anthropocene is a term which has been shown to have considerable analytical purchase outside of geology, illuminating moral and political debates about blame, the north-south divide and the global movement of materials, people and plants (Chakrabarty 2014, Tsing 2013). These ideas have been applied in the study of climate scientists themselves (Simonetti 2015) as well as climate policy (Lahsen 2009). The anthropocene, i.e. the world as subject to the effects of human activities such as climate change, may be read as a set of material relationships, where the weather, bodies and landscapes meet, as Ingold showed (2010). This term allows the larger picture, where the world and all the people in it – those people for whom climate change matters – to be considered in a single conceptual space. In this space climate change can be seen as part of the encompassing extra-somatic human activity which defines our world as we are starting to understand it.

The anthropocene and climate change, however, both involve the challenge of how to follow the conceptual and material threads that lead from these global issues and into particular, ethnographically described lives:

A close examination of scientific practice makes clear that localizing is as much a problem for climate researchers as it is for ethnographers. This holds not only for the interconnectedness of the global and the local climate, but also for the separation of climate change as a ‘scientific fact’ on the one hand, and a ‘matter of concern’ on the other. Climate research offers an insight into a messy world of ramifications, surprising activities and unexpected “social” context (Krauss 2009:149–50).

Anthropological work has the reflexive capacity to deal with the messy world Krauss refers to here, where these ramifications, surprising activities and unexpected ‘social’ context are part of the particular places where we, as anthropologists, work, taking cues from events and observations around us. In my own fieldwork I found all kinds of unanticipated connections between weathers and other aspects of life. With a research proposal full of religion and ‘belief’ I ended up with far more material interests, guided by the sometimes patient and sometimes exasperated villagers with whom I lived in the western Indian Himalayas.

I was walking with Karishma to get green grass one day during the monsoon. She told me that our village (Gau) is famous for being misty, and therefore that the girls are known, both for working hard and for being beautiful, because even though they are outside the mistiness keeps them pale. So apparently on festival days people say that the girls from this village are gori (white) because there is so much mist here. But Karishma pointed out that this can’t be true because there is mist only in the rainy season. Then she said that the girls here wear sweaters to stay gori. Also, she said girls of this village have a reputation for being hard working so people ask for them in marriage when there is a household where work is to be done. This (I think) might be part of why quite a few of the new brides in Gau are not used to doing as much work as women do here. But then Karishma said fairly that it is not just the girls who work hard, everyone works hard in this village (well, most people). She said that when girls go away to study, like she did, then they come back more beautiful. That is to say pale from not being outside. She was saying how on the other hand I had become more black (kala) since being there in the village (this was true).

People, whether Himalayan villagers or Norwegian PhD students, live with weather on an ongoing basis, and consistently live in the weather, which is not always catastrophic but does always impinge (think food perishability, wardrobe choices, sitting in the shade). The considerations people have with regards to the weather, then, necessarily translate to potential climate change concerns. Climate change is a threat, it has potentially deadly dimensions, but weather is inherent to our world, and I would not want to pathologize it.

Weather relates in fundamental ways to sensation and the body, thermal infrastructure, agriculture and animal husbandry, health and illness, disasters and other areas of anthropology (that is to say life). Weather may be implicated in all kinds of ways with other areas of life – for instance the hot/cold symbolism in India which classifies illness, the body, food and even moods. I think that it can be surprisingly easy to forget or ignore weather precisely because it is so pervasive. And this resistance of the mind against focusing on it is a risk when it comes to climate change. It can be tiring to think about. How, after all, do you write about the wind? And people have (Parkin 1995, James 1972, Hsu and Low 2007), but personally I find it challenging just to make a start – capturing the sky with a few black marks on paper feels so unrealistic. In that sense it is a great stretching area for our minds, about the material and the social, about what we mean with words like ‘impact’ and ‘atmosphere’ and the connections between people and places.

Finally, there is the role of anthropology in clarifying the terms of the climate change debate. This is a new kind of challenge, it is a global one (hence the usefulness of Tsing’s work, which demonstrated the crucial part material relationships and meetings play in globalisation (2005)), it is to do with both technologies and nature (we can apply Latour, who shows in ‘we have never been modern’ (1993) how ‘modernity’ has not succeeded in cutting us off from the material and natural world around us), it is political, historical (hence Chakrabarty, whose work pushes us to think in new ways about how we are positioned in history and what place climate change has in this context), and there is something about it which is pushing at the edges in all these areas and others, in which new terms are required to even conceive of some of these problematics. Building on what we understand and moving further, in ways that might tread new neural pathways and enable new realities, simply from the newness of our thinking, feels like a worthwhile undertaking. I suggest that the orientation of research which maps out the weather-weight of social life can help bring the people back into climate change.

So the immovable stone of ‘climate change’ is being loosened up, pulled apart to reassemble in illuminating and constructive ways by people contributing to blow away the fog obstructing understanding, using the culminations of what we know so far and the ways in which we can think new thoughts. This effort rewards.

References

AAA statement on climate change. 29th January 2015. http://www.aaanet.org/cmtes/commissions/CCTF/upload/AAA-Statement-on-Humanity-and-Climate-Change.pdf Accessed 1st July 2015.

Chakrabarty, Dipesh 2014. Climate and Capital: On Conjoined Histories. Critical Inquiry 41(1):1-23.

Crate, Susan. 2008. Gone the Bull of Winter? Grappling with the Cultural Implications of and Anthropology’s Role(s) in Global Climate Change. Current Anthropology 49:569-595.

Crate, Susan and Mark Nuttall, eds. 2009. Anthropology and Climate Change: from Encounters to Actions. California: Left Coast Press.

Cruikshank, Julie. 2005. Do Glaciers Listen? Local Knowledge, Colonial Encounters, and Social Imagination. Toronto: University of British Columbia Press.

Douglas, Mary. 1992. Risk and Blame. London: Routledge

Grodzins-Gold, Ann. 1998. Sin and Rain: Moral Ecology in Rural North India. In Lance Nelson ed. Purifying the Earthly Body of God. New York: State University of New York Press.

Hsu, Elizabeth and Chris Low eds. 2007: Wind, Life, Health: Anthropological and Historical Perspectives. Special issue. Journal of the Royal Anthropological Institute 13:S1-S181.

Ingold, T. (2010), Footprints through the weather-world: walking, breathing, knowing. Journal of the Royal Anthropological Institute, 16: S121–S139.

James, Wendy. 1972. The politics of rain control among the Uduk. In Ian Cunnison and Wendy James eds. Essays on Sudan ethnography presented to Sir Edward Evans-Pritchard. London: C. Hurst.

Jerstad, Heid. 2014. Damp bodies and smoky firewood: material weather and livelihood in rural Himachal Pradesh. Forum for development studies 41(3):399-414.

Krauss, Werner. 2009. Localizing Climate Change: A Multi-sited Approach. In Marc-Anthony Falzon and Clair Hall eds. Multi-Sited Ethnography: Theory, Praxis and Locality in Contemporary Research, 149-165. Ashgate.

Lahsen, Myanna. 2009. A science-policy interface in the Global South: The politics of carbon sinks and science in Brazil. Climatic Change 97:339–372.

Paolisso, Michael. 2003. Chesapeake Bay watermen, weather and blue crabs: cultural models and fishery policies. In Sarah Strauss and Benjamin Orlove eds. Weather, Climate, Culture. Oxford: Berg.

Rudiak-Gould, Peter. 2013. Climate change and tradition in a small island state: the rising tide. Routledge.

Simonetti, Christian. 2015. The stratification of time. Time and Society.

Strauss, Sarah and Orlove, Benjamin eds. 2003. Weather, climate, culture. Oxford: Berg

Tsing, Anna. 2013. Dancing the Mushroom Forest. PAN: Philosophy, Activism, Nature vol 10.

Tsing, Anna. 2005. Friction. Princeton University Press.

Gut feeling: Research examines link between stomach bacteria, PTSD (Science Daily)

Date:
April 25, 2016
Source:
Office of Naval Research
Summary:
Could bacteria in your gut be used to cure or prevent neurological conditions such as post-traumatic stress disorder (PTSD), anxiety or even depression? Two researchers think that’s a strong possibility.

Dr. John Bienenstock (left) and Dr. Paul Forsythe in their lab. The researchers are studying whether bacteria in the gut can be used to cure or prevent neurological conditions such as post-traumatic stress disorder (PTSD), anxiety or depression. Credit: Photo courtesy of Dr. John Bienenstock and Dr. Paul Forsythe

Could bacteria in your gut be used to cure or prevent neurological conditions such as post-traumatic stress disorder (PTSD), anxiety or even depression? Two researchers sponsored by the Office of Naval Research (ONR) think that’s a strong possibility.

Dr. John Bienenstock and Dr. Paul Forsythe–who work in The Brain-Body Institute at McMaster University in Ontario, Canada–are investigating intestinal bacteria and their effect on the human brain and mood.

“This is extremely important work for U.S. warfighters because it suggests that gut microbes play a strong role in the body’s response to stressful situations, as well as in who might be susceptible to conditions like PTSD,” said Dr. Linda Chrisey, a program officer in ONR’s Warfighter Performance Department, which sponsors the research.

The trillions of microbes in the intestinal tract, collectively known as the gut microbiome, profoundly impact human biology–digesting food, regulating the immune system and even transmitting signals to the brain that alter mood and behavior. ONR is supporting research that’s anticipated to increase warfighters’ mental and physical resilience in situations involving dietary changes, sleep loss or disrupted circadian rhythms from shifting time zones or living in submarines.

Through research on laboratory mice, Bienenstock and Forsythe have shown that gut bacteria seriously affect mood and demeanor. They also were able to control the moods of anxious mice by feeding them healthy microbes from fecal material collected from calm mice.

Bienenstock and Forsythe used a “social defeat” scenario in which smaller mice were exposed to larger, more aggressive ones for a couple of minutes daily for 10 consecutive days. The smaller mice showed signs of heightened anxiety and stress–nervous shaking, diminished appetite and less social interaction with other mice. The researchers then collected fecal samples from the stressed mice and compared them to those from calm mice.

“What we found was an imbalance in the gut microbiota of the stressed mice,” said Forsythe. “There was less diversity in the types of bacteria present. The gut and bowels are a very complex ecology. The less diversity, the greater disruption to the body.”

Bienenstock and Forsythe then fed the stressed mice the same probiotics (live bacteria) found in the calm mice and examined the new fecal samples. Through magnetic resonance spectroscopy (MRS), a non-invasive analytical technique using powerful MRI technology, they also studied changes in brain chemistry.

“Not only did the behavior of the mice improve dramatically with the probiotic treatment,” said Bienenstock, “but it continued to get better for several weeks afterward. Also, the MRS technology enabled us to see certain chemical biomarkers in the brain when the mice were stressed and when they were taking the probiotics.”

Both researchers said stress biomarkers could potentially indicate if someone is suffering from PTSD or risks developing it, allowing for treatment or prevention with probiotics and antibiotics.

Later this year, Bienenstock and Forsythe will perform experiments involving fecal transplants from calm mice to stressed mice. They also hope to secure funding to conduct clinical trials to administer probiotics to human volunteers and use MRS to monitor brain reactions to different stress levels.

Gut microbiology is part of ONR’s program in warfighter performance. ONR also is looking at the use of synthetic biology to enhance the gut microbiome. Synthetic biology creates or re-engineers microbes or other organisms to perform specific tasks like improving health and physical performance. The field was identified as a top ONR priority because of its potential far-ranging impact on warfighter performance and fleet capabilities.


Journal Reference:

  1. S. Leclercq, P. Forsythe, J. Bienenstock. Posttraumatic Stress Disorder: Does the Gut Microbiome Hold the Key? The Canadian Journal of Psychiatry, 2016; 61 (4): 204 DOI: 10.1177/0706743716635535

Peter Sloterdijk, Spheres III: Foams – forthcoming in September 2016

Progressive Geographies

The third and final volume of Peter Sloterdijk's Spheres, Foams, is forthcoming in September 2016 from Semiotext(e).

Foams completes Peter Sloterdijk’s celebrated Spheres trilogy: his 2,500-page “grand narrative” retelling of the history of humanity, as related through the anthropological concept of the “Sphere.” For Sloterdijk, life is a matter of form, and in life, sphere formation and thought are two different labels for the same thing. The trilogy also together offers his corrective answer to Martin Heidegger’s Being and Time, reformulating it into a lengthy meditation of Being and Space—a shifting of the question of who we are to a more fundamental question of where we are.

In this final volume, Sloterdijk’s “plural spherology” moves from the historical perspective on humanity of the preceding two volumes to a philosophical theory of our contemporary era, offering a view of life through a multifocal lens. If Bubbles was…

Ver o post original 133 mais palavras

Brazil and 169 other countries sign climate change agreement (Estadão)

Cláudia Trevisan and Altamira Silva Junior – April 22, 2016

Dilma: 'The path we will have to travel now will be even more challenging: transforming our ambitious aspirations into concrete results'

Representatives of 170 countries signed the Paris Agreement on climate change on Friday the 22nd, setting a United Nations (UN) record for the number of countries joining an international treaty in a single day. But all of them heard the warning from the organization's Secretary-General, Ban Ki-moon, that good intentions will have little impact unless the convention is ratified by the countries as soon as possible. Without that, the treaty will not enter into force.

“We are in a race against time,” Ban said in his opening speech at the ceremony, in the UN plenary hall in New York. The urgency was emphasized by several heads of state, including the presidents of Brazil, Dilma Rousseff, and of France, François Hollande.

Dilma promised “the prompt entry into force” of the convention, but that decision rests with Congress. “The path we will have to travel now will be even more challenging: transforming our ambitious aspirations into concrete results,” the president said in her speech. She repeated the commitments Brazil made during the treaty negotiations, including the promise to reduce emissions of polluting gases by 37% by 2025, compared with the levels recorded in 2005.

Frustration. Carlos Rittl, executive secretary of the Observatório do Clima, said Dilma frustrated the expectations of environmental organizations that were hoping for a clear signal that Brazil will adopt more ambitious targets in 2018, when the agreement's results will be reviewed. “Brazil needs to recognize that it must do more than it promised last year,” he said. “Everyone must, because we are on a trajectory of 3°C of warming.”

Approved by representatives of 195 nations in December, the treaty provides for a series of national commitments aimed at limiting the rise in the planet's temperature to 2°C by the end of the century, relative to pre-industrial levels. To enter into force, the Paris Agreement must be ratified by at least 55 countries accounting for at least 55% of greenhouse-gas emissions.

“The era of consumption without consequences has come to an end. We must intensify efforts to ‘decarbonize’ our economies,” the UN Secretary-General stressed. Beyond its symbolic character, Friday's ceremony was intended to mobilize world leaders around ratifying the agreement, so that it enters into force next year rather than in 2020, as initially foreseen.

The first to speak, the French president recalled that Paris was living through a tragic moment in December, under the impact of the terrorist attacks that had killed 130 people the month before. Even so, he stressed, it was possible to reach the historic agreement on climate change.

La Ciudad Hidroespacial (Kosice)

La Ciudad Hidroespacial. Manifesto.

Gyula Kosice

Driven by its impulses and vital reactions, humanity has moved in uneven proportion to its own habitat. Architecture encompasses very dissimilar elementary needs, and it is not advisable to remain oppressed by the magnitude of its inert load.

Until now we have used only a minimal proportion of our mental faculties, adapted to modules that in some way derive from so-called modern or "functional" architecture: that is, the apartment, or cell, to live in, which a society imposes on us with its compulsive economy. Not to mention the firm rejection of the architects and engineers who will not admit that the entire nomenclature of building construction could one day be supplanted by another, markedly revolutionary architectural language, one that would shake rigid convictions and the logical ideas of academic teaching, which already tend toward collapse.

But social structures and mechanisms of behaviour, rupture, contestation, are symptoms of a change toward the disappearance of the state's omnipotent role and its replacement by an efficient administration.

In the magazine "Arturo", in 1944, I wrote: "Man is not to end on Earth," and the Madí Manifesto of 1946 affirmed that architecture should be "environment and forms movable in space." Although these concepts originated in an intuitive vision, they are marked by an imminent and implacable rationality.

My different stages in the visual arts never changed direction, and the proposal of La Ciudad Hidroespacial is a continuum without respite, even as I accept my own contradictions in creating hydrosculptures and "hydroluminic" and hydrokinetic reliefs for an architecture I am attacking at its foundations. The premise is to free the human being from every tie, from all ties. This transformation, anticipated by science and technology, leads us to think it is no audacity to probe and investigate the absolute through the possible, starting from a deliberate, chained imaginative interaction: a trans-individual imagination with no goals fixed in advance. Hence the first project, or statement, of a city suspended in space, published in "Arturo" and the Madí Manifesto, was not a hypothesis or supporting theory; it arose from a vision of a continuum and of another dimension.

Humanity's adventure does not stop before the unforeseeable. On the contrary, we are headed toward the unknown and the unprecedented, and when a change becomes a necessity, this disposition accelerates.

To be rooted on Earth, or, to be exact, on the water planet, even though its atmosphere, its food and its waters are contaminated; to stand defenceless before persistent geographical and geological depredation; to watch the ecological balance being slowly destroyed; to verify the constant growth of population: these are so many incentives for the sweeping changes we now announce as a biological necessity.

We propose, concretely, the construction of the human habitat, actually occupying space at an altitude of one thousand to fifteen hundred metres, in cities conceived ad hoc, with a prior sense of coexistence and another, differentiated "modus vivendi".

Architecture has depended on the ground and on the laws of gravity. Those laws can be used scientifically so that the hydrospatial dwelling becomes a reality, that is, viable from a technological point of view: to attempt the construction of a few dwellings as a preliminary trial, gradually arriving at the "Ciudad Hidroespacial" proper. Some astrophysicists and space engineers agree that by taking water from the clouds and decomposing it by electrolysis, the oxygen could be used for breathing, while the hydrogen, fed into a nuclear fission machine, would provide more than enough energy: energy capable of keeping the habitat suspended and of moving it. Other opinions point to the possibility of crystallizing water and deriving it toward a polymerization that would qualify it energetically. It is not, then, a matter of defeating the laws of gravity but of creating the energy of sustentation. For this reason I address all the scientists of NASA to seek their opinions.

The cost, of course, is very high, but merely by halting the world's production of weapons for twenty-four hours and investing those sums in this project, its realization becomes possible. Hydrospatial architecture is designed to remain suspended in space indefinitely.

The nomadic hydrospatial dwelling undermines the course of the present economy based on the valuation of land and opens unforeseeable sociological questions. It likewise points to an opening of art, for our civilization is entering the post-industrial stage. What is proposed, then, is an art of everyone, not an art for everyone. By overcoming all intermediaries, art is tacitly integrated into the habitat; it dissolves into it and into life; it is its presentation, its "modus vivendi".

Places created with a sense of synthesis and communal life are its extension. Why, then, painting, sculpture, in short the "object", if all of this is already contained in the dwelling occupying space, in the internal journey through that space, in volume, colour, movement?

More than 4 billion inhabitants of the Earth, and this future shock is lived by barely 30,000 people. There is no civilization by spontaneous generation. The Maya, the Inca, the culture of China and of the whole East, Gothic art, the Greco-Latin art of the Mediterranean, the Renaissance: all had their cultural cycles, and their parabola has been completed. Our civilization is the best because we are living it, but let us imagine for an instant the worldwide crash if automobiles ceased to be manufactured: the shock of a possible immediate future.

Art as "Song of History", "Currency of the Absolute", "direct apprehension of reality", "ideological superstructure" or "individual transcendence": these are definitions that will be surpassed by the visionary flashes of a new way of thinking and feeling, of an irreversible cultural flowering with access to the infinite, and not only to the terrestrial.

We must replace the rooms that have become an architectural and peripheral ritual, living room, dining room, bedroom, bathroom, kitchen, furniture, with serene or intense, but in every way differentiated, proposals of places to live.

Yes, within a space, but occupying space-time with all its attributes; and not as an alteration of the human adventure but as an explicable necessity emitted by our human condition.

Other conditions will probably appear, but in the hydrospatial city we propose to destroy anguish and disease, to revalue love, the recreations of intelligence, humour, playful leisure, sports, indefinite joys, mental possibilities so far unexplored, the abolition of geographical limits and of the limits of thought. Utopian idealism? Not at all. Those who do not believe in its feasibility remain clinging to the cave, to wars and floods. To dissolve art into the dwelling and into life itself is therefore to announce synthesis and integration.

The centres of power and of economic and political decision can, at most, idly delay this tendency that transforms man from the moment his body and mind occupy themselves with universal projects, thus becoming more universe.

He will have new languages, not only to communicate a message but to convey the complete form of a spirit: a language enriched by pure tensions and new presences steeped in poetry. Of course, being a hydrospatial inhabitant will at first have its disadvantages, until continuous practice develops all its possibilities, as a human condition and not as forced labour. Finally, daily life will be centred not only on the supposed conquest of space but on the conquest of its time, its activation, its leaven. The human being, in the end, does not want to die.

In the hydrospatial cell the hydrocitizen, in plurality, invents not only his architecture; he names and chooses sites and places to live, which may or may not be coupled to thousands of dwellings, platforms and accesses suspended in space.

To hydrospatialize, to land, to splash down, to land on the Moon, to land on Venus, later to lay galactic and interplanetary connections across the light-years: these will be multi-optional alternatives. There will be places for feeling desire, for not deserving the labours of day and night, for lengthening life and correcting improvisation, for forgetting forgetting, for dissolving the stupor of the why and the what for, and as many other places as our inexhaustible imagination may amplify and conceive.

Buenos Aires, 1971

http://www.kosice.com.ar/esp/la-ciudad-hidroespacial.php

How the introduction of farming changed the human genome (Science Daily)

Study tracks gene changes during the introduction of farming in Europe

Date:
November 23, 2015
Source:
Harvard Medical School
Summary:
Genomic analysis of ancient human remains identifies specific genes that changed during and after the transition in Europe from hunting and gathering to farming about 8,500 years ago. Many of the genes are associated with height, immunity, lactose digestion, light skin pigmentation, blue eye color and celiac disease risk.

Ancient DNA can provide insight into when humans acquired the adaptations seen in our genomes today. Credit: Image courtesy of Harvard Medical School

The introduction of agriculture into Europe about 8,500 years ago changed the way people lived right down to their DNA.

Until recently, scientists could try to understand the way humans adapted genetically to changes that occurred thousands of years ago only by looking at DNA variation in today’s populations. But our modern genomes contain mere echoes of the past that can’t be connected to specific events.

Now, an international team reports in Nature that researchers can see how natural selection happened by analyzing ancient human DNA.

“It allows us to put a time and date on selection and to directly associate selection with specific environmental changes, in this case the development of agriculture and the expansion of the first farmers into Europe,” said Iain Mathieson, a research fellow in genetics at Harvard Medical School and first author of the study.

By taking advantage of better DNA extraction techniques and amassing what is to date the largest collection of genome-wide datasets from ancient human remains, the team was able to identify specific genes that changed during and after the transition from hunting and gathering to farming.

Many of the variants occurred on or near genes that have been associated with height, the ability to digest lactose in adulthood, fatty acid metabolism, vitamin D levels, light skin pigmentation and blue eye color. Two variants appear on genes that have been linked to higher risk of celiac disease but that may have been important in adapting to an early agricultural diet.

Other variants were located on immune-associated genes, which made sense because “the Neolithic period involved an increase in population density, with people living close to one another and to domesticated animals,” said Wolfgang Haak, one of three senior authors of the study, a research fellow at the University of Adelaide and group leader in molecular anthropology at the Max Planck Institute for the Science of Human History.

“Although that finding did not come fully as a surprise,” he added, “it was great to see the selection happening in ‘real time.'”

The work also supports the idea that Europe’s first farmers came from ancient Anatolia, in what is now Turkey, and fills in more details about how ancient groups mixed and migrated.

“It’s a great mystery how present-day populations got to be the way we are today, both in terms of how our ancestors moved around and intermingled and how populations developed the adaptations that help us survive a bit better in the different environments in which we live,” said co-senior author David Reich, professor of genetics at HMS. “Now that ancient DNA is available at the genome-wide scale and in large sample sizes, we have an extraordinary new instrument for studying these questions.”

“From an archaeological perspective, it’s quite amazing,” said co-senior author Ron Pinhasi, associate professor of archaeology at University College Dublin. “The Neolithic revolution is perhaps the most important transition in human prehistory. We now have proof that people did actually go from Anatolia into Europe and brought farming with them. For more than 40 years, people thought it was impossible to answer that question.”

“Second,” he continued, “we now have evidence that genetic selection occurred along with the changes in lifestyle and demography, and that selection continued to happen following the transition.”

Prying more from the past

Members of the current team and others have used ancient DNA in the past few years to learn about Neanderthals and the genes they passed to humans, identify ancestors of present-day Europeans, trace migrations into the Americas and probe the roots of Indo-European languages. Studying natural selection, however, remained out of reach because it required more ancient genomes than were available.

“In the past year, we’ve had a super-exponential rise in the number of ancient samples we can study on a genome scale,” said Reich, who is also an associate member of the Broad Institute of Harvard and MIT and a Howard Hughes Medical Institute investigator. “In September 2014, we had 10 individuals. In this study, we have 230.”

The DNA came from the remains of people who lived between 3,000 and 8,500 years ago at different sites across what is now Europe, Siberia and Turkey. That time span provided snapshots of genetic variation before, during and after the agricultural revolution in Europe.

Among the 230 ancient individuals were 83 who hadn’t been sequenced before, including the first 26 to be gathered from the eastern Mediterranean, where warm conditions usually cause DNA to degrade.

Members of the team used several technological advances to obtain and analyze the new genetic material. For example, they exploited a method pioneered by Pinhasi’s laboratory to extract DNA from a remarkably rich source: a portion of the dense, pyramid-shaped petrous bone that houses the internal auditory organs. In some cases, the bone yielded 700 times more human DNA than could be obtained from other bones, including teeth.

“That changed everything,” said Pinhasi. “Higher-quality DNA meant we could analyze many more positions on the genome, perform more complex tests and simulations, and start systematically studying allele frequency across populations.”

What made the cut

Although the authors caution that sample size remains the biggest limitation of the study, comparing the ancient genomes to one another and to those of present-day people of European ancestry revealed 12 positions on the genome where natural selection related to the introduction of farming in northern latitudes appears to have happened.
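
The comparison behind that result is, at bottom, a search for allele-frequency shifts too large to be explained by genetic drift alone. The snippet below is only a toy sketch of that idea, not the statistical machinery used in the study (which models drift and population mixture far more carefully); the site labels and allele counts are invented for illustration.

```python
# Toy selection scan: flag sites whose allele frequency shifted between an
# ancient and a modern sample by more than drift alone would plausibly allow.
# All counts and labels are invented placeholders for illustration only.
from scipy.stats import chi2_contingency

# (site label, (ancient derived, ancient ancestral), (modern derived, modern ancestral))
sites = [
    ("LCT region (lactase persistence)", (4, 96), (72, 28)),
    ("SLC24A5 (light pigmentation)",     (30, 70), (95, 5)),
    ("neutral control site",             (48, 52), (51, 49)),
]

for label, (a_der, a_anc), (m_der, m_anc) in sites:
    chi2, p, _, _ = chi2_contingency([[a_der, a_anc], [m_der, m_anc]])
    a_freq = a_der / (a_der + a_anc)
    m_freq = m_der / (m_der + m_anc)
    verdict = "candidate for selection" if p < 1e-6 else "consistent with drift"
    print(f"{label}: {a_freq:.2f} -> {m_freq:.2f}  (p = {p:.1e}, {verdict})")
```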

“Some of those specific traits have been studied before,” said Reich. “This work with ancient DNA enriches our understanding of those traits and when they appeared.”

Besides the adaptations that appear to be related to diet, pigmentation, immunity and height, the possible selective pressure on other variants was less clear.

“We can guess by looking at the function of the gene, but our power is limited,” said Mathieson. “It’s quite frustrating.”

It’s too early to tell whether some of the variants were themselves selected for or whether they hitched a ride with a nearby beneficial gene. The question pertains especially to variants that seem to be disadvantageous, like increased disease risk.

Being able to look at numerous positions across the genome also allowed the team to examine complex traits for the first time in ancient DNA.

“We can see the evolution of height across time,” said Mathieson.

Researchers had noticed that people from southern Europe tend to be shorter than those from northern Europe. The new study suggests that the height differential arises both from people in the north having more ancestry from Eurasian steppe populations, who seem to have been taller, and people in the south having more ancestry from Neolithic and Chalcolithic groups from the Iberian peninsula, who seem to have been shorter.
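
A back-of-the-envelope way to see how ancestry proportions alone could generate that north-south gradient is to treat a population’s expected genetic height score as an ancestry-weighted average of its source populations’ scores. The numbers in the sketch below are made up purely for illustration; they are not estimates from the study.

```python
# Toy illustration: expected genetic height score of a present-day population
# as an ancestry-weighted average over its ancestral source populations.
# Every score and proportion here is a made-up placeholder, not a study estimate.

source_score = {                              # hypothetical height scores (arbitrary units)
    "steppe": 1.0,                            # assumed taller
    "Anatolian farmer": 0.0,
    "Iberian Neolithic/Chalcolithic": -0.5,   # assumed shorter
}

ancestry = {                                  # hypothetical ancestry proportions (sum to 1)
    "northern European": {"steppe": 0.5, "Anatolian farmer": 0.4,
                          "Iberian Neolithic/Chalcolithic": 0.1},
    "southern European": {"steppe": 0.2, "Anatolian farmer": 0.5,
                          "Iberian Neolithic/Chalcolithic": 0.3},
}

for population, mix in ancestry.items():
    score = sum(fraction * source_score[source] for source, fraction in mix.items())
    print(f"{population}: expected height score = {score:+.2f}")
```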

The team wasn’t able to draw conclusions about the other complex traits it investigated: body mass index, waist-hip ratio, type 2 diabetes, inflammatory bowel disease and lipid levels.

Reich, for one, hopes researchers will one day have thousands of ancient genomes to analyze. He would also like to see this type of study applied to non-European populations and even to other species.

“It will be interesting to study selection in domesticated animals and to see if there is coevolution between them and the people who were domesticating them,” said Mathieson.


Journal Reference:

  1. Iain Mathieson, Iosif Lazaridis, Nadin Rohland, Swapan Mallick, Nick Patterson, Songül Alpaslan Roodenberg, Eadaoin Harney, Kristin Stewardson, Daniel Fernandes, Mario Novak, Kendra Sirak, Cristina Gamba, Eppie R. Jones, Bastien Llamas, Stanislav Dryomov, Joseph Pickrell, Juan Luís Arsuaga, José María Bermúdez de Castro, Eudald Carbonell, Fokke Gerritsen, Aleksandr Khokhlov, Pavel Kuznetsov, Marina Lozano, Harald Meller, Oleg Mochalov, Vyacheslav Moiseyev, Manuel A. Rojo Guerra, Jacob Roodenberg, Josep Maria Vergès, Johannes Krause, Alan Cooper, Kurt W. Alt, Dorcas Brown, David Anthony, Carles Lalueza-Fox, Wolfgang Haak, Ron Pinhasi, David Reich. Genome-wide patterns of selection in 230 ancient Eurasians. Nature, 2015; DOI: 10.1038/nature16152

Visible impacts at sea (Pesquisa Fapesp)

Pollutants reach 200 km north and south of the mouth of the Doce River, strike conservation areas, alter the ecological balance and accumulate on the seafloor

CARLOS FIORAVANTI | ISSUE 242 | APRIL 2016

Pollution in sight: the residues that leaked from the Mariana reservoir form a brownish plume at the mouth of the Doce River.

In January of this year, while flying over the coast of Espírito Santo and southern Bahia, biologists, oceanographers and technicians from federal environmental agencies recognized the dark smudges on the sea surface formed by the accumulation of metallic residues that had leaked from the Samarco mining company’s reservoir in Mariana, Minas Gerais, in November 2015. The plume of residues was approaching the Abrolhos archipelago, one of the main marine wildlife reserves on the Brazilian coast.

The smudges were not just the unwanted remnants of iron-ore extraction in Minas Gerais, but one of its consequences, as was verified soon afterwards. Amid the dark-green patches were colonies of algae and other microscopic marine organisms – phytoplankton – stretching for tens of kilometers, much larger than those observed in previous years, according to analyses by researchers at the Federal University of Espírito Santo (Ufes).

Another peculiarity is that the organisms were growing and multiplying rapidly, as a result of the excess iron from the Mariana mine tailings spreading through the sea from the mouth of the Doce River, which they reached at the end of November. Since then, carried continuously to the sea by the river, the residues have formed a mobile plume that sweeps across 200 kilometers (km) north and south of the mouth of the Doce, altering the marine balance, as indicated by the phytoplankton mass, and reaching at least three marine conservation areas.

“Phytoplankton blooms are common in summer, but not like this,” explained Alex Bastos, a professor of oceanography at Ufes, at the end of February. Preliminary analyses indicated that the algal colonies are made up of organisms that form and die within a few days, faster than usual. The accelerated decomposition of these organisms consumes oxygen from the seawater, with unpredictable consequences for communities of marine organisms.

In addition, species diversity had been cut nearly in half. Camilo Dias Júnior and his oceanography team at Ufes found at most 40 phytoplankton species per sample analyzed; before the residues arrived, researchers recognized 50 to 70 species. The researchers’ and technicians’ hypothesis is that a selection of varieties better adapted to the excess iron carried by the discharge of residues into the sea may already have taken place.

During the overflights of the Espírito Santo and Bahia coasts, Claudio Dupas, coordinator of the Geoprocessing and Environmental Monitoring Center of the São Paulo office of the Brazilian Institute of the Environment and Renewable Natural Resources (Ibama), observed many fishing boats near the phytoplankton patches at the mouth of the Doce River. The large numbers of fish, drawn there by the abundance of food, had caught the fishermen’s attention.

In Governador Valadares, Minas Gerais: the mud filled the Doce River in November, disrupting the water supply for the city’s residents

Based on the preliminary water-quality analyses and on observation of the scene, the Ibama team produced a technical report warning of changes in water quality caused by the discharge of residues into the sea. On the basis of that document and of the precautionary principle – to keep the population from being harmed by eating contaminated fish – a federal judge in Vitória banned fishing in the region of the Doce River mouth for an indefinite period on February 22. “As soon as the judge’s decision came out, the Ibama superintendent in Vitória, Guanadir Gonçalves, asked me to make a map delimiting the prohibited area, which went onto the internet and to the cell phones of inspectors in the field the same day,” says Dupas.

Since January, the movements of the residue plume have been traceable through maps generated by Ibama from satellite images on the site governancapelodoce.com.br, maintained by Samarco. The site siscom.ibama.gov.br/mariana contains high-resolution satellite images from before and after the incident, from the dam to the river mouth. The maps indicate that the residues have already reached 50 km south of Vitória, the capital of Espírito Santo, and have struck three marine conservation areas: the Santa Cruz Wildlife Refuge, the Costa das Algas Environmental Protection Area (APA) and one of the main nesting areas of the loggerhead turtle (Caretta caretta), a 37-km stretch of beaches known as the Comboios Biological Reserve. “It is not yet possible to assess the impact on the environment, on the lives of marine organisms and on the region’s residents,” says Dupas.

From the time it leaked from the Fundão dam on November 5 until it reached the sea, the enormous mass of iron-ore tailings caused a profound transformation. It destroyed houses and forests along the banks of the Doce River, killing 18 people and tons of fish and other aquatic organisms. The biologist Flávia Bottino took part in the expeditions of the Independent Group for Environmental Impact Analysis (Giaia) along the Doce River in November and observed intense turbidity in the water, which hindered light penetration and the survival of organisms. The biologists found freshwater shrimp that had survived the disaster, but the benthic organisms that lived on the riverbed had been buried.

Uncertain limits
The high concentration of heat-absorbing solid particles may have raised the water temperature to around 30 degrees Celsius. “The river water was warm,” she noted. Analyses of water samples collected in December along a roughly 800-km stretch of the river, carried out at campuses of the University of São Paulo (USP) in Ribeirão Preto, the Federal University of São Carlos (UFSCar) in São Carlos and Sorocaba, São Paulo State University (Unesp) in São Vicente and the University of Brasília (UnB), indicated elevated concentrations of manganese, iron, arsenic and lead. Rains may worsen the situation by washing the residue-covered riverbanks and carrying the material to the sea.

Through collections made with the Navy ship Vital de Oliveira Moura, the Ufes team verified that, 25 km east of the mouth of the Doce River, the residues form a layer 1 to 2 centimeters thick over the mud of the seafloor, at a depth of 25 meters. “There is a rapid accumulation of tailings on the seafloor,” says Bastos of Ufes, based on collections made since November, shortly after the dam burst (see Pesquisa FAPESP issue no. 239). “Not even in the biggest floods was the accumulation of river sediments on the seafloor so high.”

In early February, at a meeting between the Ufes researchers and representatives of Ibama, the State Environmental Institute (Iema) and the Chico Mendes Institute for Biodiversity Conservation (ICMBio), Bastos reported that the iron concentration on the seafloor had increased 20-fold compared with pre-accident levels, aluminum 10-fold, and chromium and manganese five-fold. Another Ufes professor, Renato Rodrigues Neto, observed that the river’s flow had risen from 300 cubic meters per second (m³/s) before the dam burst to about 4,000 m³/s, increasing the amount of metal-laden mud dumped into the sea.

Satellite images indicate that the metallic residues may have reached the Abrolhos archipelago in early January, although, Dupas notes, it is not yet possible to distinguish the sediments coming from the Doce River, about 200 km away, from those of the Caravelas River, which empties into the region. According to him, the results of the ongoing analyses should be announced in April.

Several studies in other marine areas have indicated that industrial residues can travel far beyond the places where they were produced, mix with seafloor sediments, resurface when stirred up by fishing nets, or be absorbed by marine organisms. A team from the Oceanographic Institute (IO) of USP identified heavy metals (lead, copper and zinc) and petroleum-derived organic compounds produced in the industrial zone of Santos and the industrial complex of Cubatão, 15 km from the sea, mixed with the seafloor mud at a depth of 100 meters and a distance of 200 km from the coast. It had not been thought that pollution generated on land could travel so far.

Environmental conditions
These findings help in thinking about what could happen along the coast of Espírito Santo and neighboring states as the mining company’s mud spreads. “Strictly speaking, the events have no connection at first sight,” said Michel Mahiques, a professor of oceanography at IO-USP who coordinated the Santos studies. The Samarco spill at Mariana was an acute event, an intense discharge of residues, whereas Santos and other cases, such as Guanabara Bay, are chronic ones, with decades of continuous release of pollutants. “The common fact,” he says, “is that there are portions of the seafloor where environmental conditions allow the deposition of materials generated by human activity, even at great distances.”

In an earlier study off the coast of Santos, his group identified cesium-137 isotopes originating from atomic explosions or from nuclear power plants, where this type of material is generated. “The cesium was transported through the atmosphere and adhered to very small particles on the seafloor,” he says. “We can call these cases teleconnections, in which an event at one point on the planet can affect very distant regions.” According to him, the classic cases are the nuclear plant accidents at Chernobyl in 1986 and Fukushima in 2011.

A village in Mariana devastated by mud from the Fundão dam: effects felt more than 800 km away, on land, in the river and at sea

“We need to take another look at the potential for material to accumulate in the marine environment,” Mahiques comments. His studies indicated that pollutants accumulate mainly in mud belts, strips generally 3 to 4 km wide and tens of kilometers long on the so-called continental shelf, over ancient relief structures. “There is an effect at a distance. The sediments remain at points quite far from their origin. Two hundred kilometers was the limit we reached, but we still don’t know whether they could go farther.” Mahiques argues that two basic assumptions about how the continental shelf works should be revised. The first is that the amount of material from the continent reaching the sea is small. The second is that coastal environments retain the dirt. “The amount is not small, nor are estuaries a perfect filter for the residues generated on the continent.”

The researchers analyzed 21 sediment samples collected in 2005 and other, more recent ones gathered with the oceanographic vessel Alpha Crucis. The results indicated that levels of lead, zinc and copper at a depth of 100 meters, more than 100 km from the coast, were close to those found in Santos Bay, though lower than the highest values in the Santos estuary, an environment close to land where river and sea water mix. In the estuary, the lead concentration in marine sediment ranged from 9 milligrams per kilogram (mg/kg) in uncontaminated areas to 59 mg/kg in samples from the port bottom, indicating a five- to ten-fold increase over pre-industrialization values. The authors of that study noted that industrial pollutants mixed with the mud on the seafloor could easily return to circulation as a result of intense water movements or of human activities such as dredging to expand ports or fishing with heavy nets that churn up the seabed.

Earlier studies by IO-USP researchers had already shown that the continuous discharge of domestic sewage and industrial pollutants into Santos Bay was probably one of the causes of the reduced diversity of marine organisms in the region, compared with less polluted areas.

In parallel, a team from Unesp in São Vicente found levels above the legal limits of four heavy metals – cadmium, copper, lead and mercury – in samples of water, sediment and uçá mangrove crabs from the mangroves of the municipalities of Cubatão, Bertioga, Iguape, São Vicente and Cananeia. In the regions with the highest concentrations of these metals, the crabs showed a higher proportion of cells with genetic alterations associated with malformations (see Pesquisa FAPESP issue no. 225). A study by a team from the Federal University of Rio Grande, published in November 2015, pointed to metal contamination as a possible cause of fibropapillomatosis, a disease specific to sea turtles characterized by benign tumors on the skin, in green turtles (Chelonia mydas) from Ubatuba, São Paulo, since the animals examined had above-normal levels of copper, iron and lead compared with healthy animals.

“When we think about legislation and public policy, in projecting the impact of possible environmental accidents, we have to look farther and revisit the concept of area of influence, since the effect can be much greater than imagined,” said Mahiques. Bastos, of Ufes, noted that environmental damage can be intense as a consequence of small changes in the concentration of metals in seawater, even when the levels remain below the maximums established by environmental legislation.

Scientific articles
FIGUEIRA, R.C.L. et al. Distribution of 137Cs, 238Pu and 239+240Pu in sediments of the southeastern Brazilian shelf – SW Atlantic margin. Science of the Total Environment, v. 357, p. 146-59, 2006.
MAHIQUES, M.M. et al. Mud depocentres on the continental shelf: a neglected sink for anthropogenic contaminants from the coastal zone. Environmental Earth Sciences, v. 75, n. 1, p. 44-55, 2016.
SILVA, C.C. da et al. Metal contamination as a possible etiology of fibropapillomatosis in juvenile female green sea turtles Chelonia mydas from the southern Atlantic Ocean. Aquatic Toxicology, v. 170, p. 42-51, 2016.

The Boy Whose Brain Could Unlock Autism (Matter)

 

Autism changed Henry Markram’s family. Now his Intense World theory could transform our understanding of the condition.


SOMETHING WAS WRONG with Kai Markram. At five days old, he seemed like an unusually alert baby, picking his head up and looking around long before his sisters had done. By the time he could walk, he was always in motion and required constant attention just to ensure his safety.

“He was super active, batteries running nonstop,” says his sister, Kali. And it wasn’t just boyish energy: When his parents tried to set limits, there were tantrums—not just the usual kicking and screaming, but biting and spitting, with a disproportionate and uncontrollable ferocity; and not just at age two, but at three, four, five and beyond. Kai was also socially odd: Sometimes he was withdrawn, but at other times he would dash up to strangers and hug them.

Things only got more bizarre over time. No one in the Markram family can forget the 1999 trip to India, when they joined a crowd gathered around a snake charmer. Without warning, Kai, who was five at the time, darted out and tapped the deadly cobra on its head.

Coping with such a child would be difficult for any parent, but it was especially frustrating for his father, one of the world’s leading neuroscientists. Henry Markram is the man behind Europe’s $1.3 billion Human Brain Project, a gargantuan research endeavor to build a supercomputer model of the brain. Markram knows as much about the inner workings of our brains as anyone on the planet, yet he felt powerless to tackle Kai’s problems.

“As a father and a neuroscientist, you realize that you just don’t know what to do,” he says. In fact, Kai’s behavior—which was eventually diagnosed as autism—has transformed his father’s career, and helped him build a radical new theory of autism: one that upends the conventional wisdom. And, ironically, his sideline may pay off long before his brain model is even completed.

IMAGINE BEING BORN into a world of bewildering, inescapable sensory overload, like a visitor from a much darker, calmer, quieter planet. Your mother’s eyes: a strobe light. Your father’s voice: a growling jackhammer. That cute little onesie everyone thinks is so soft? Sandpaper with diamond grit. And what about all that cooing and affection? A barrage of chaotic, indecipherable input, a cacophony of raw, unfilterable data.

Just to survive, you’d need to be excellent at detecting any pattern you could find in the frightful and oppressive noise. To stay sane, you’d have to control as much as possible, developing a rigid focus on detail, routine and repetition. Systems in which specific inputs produce predictable outputs would be far more attractive than human beings, with their mystifying and inconsistent demands and their haphazard behavior.

This, Markram and his wife, Kamila, argue, is what it’s like to be autistic.

They call it the “intense world” syndrome.

The behavior that results is not due to cognitive deficits—the prevailing view in autism research circles today—but the opposite, they say. Rather than being oblivious, autistic people take in too much and learn too fast. While they may appear bereft of emotion, the Markrams insist they are actually overwhelmed not only by their own emotions, but by the emotions of others.

Consequently, the brain architecture of autism is not just defined by its weaknesses, but also by its inherent strengths. The developmental disorder now believed to affect around 1 percent of the population is not characterized by lack of empathy, the Markrams claim. Social difficulties and odd behavior result from trying to cope with a world that’s just too much.

After years of research, the couple came up with their label for the theory during a visit to the remote area where Henry Markram was born, in the South African part of the Kalahari desert. He says “intense world” was Kamila’s phrase; she says she can’t recall who hit upon it. But he remembers sitting in the rust-colored dunes, watching the unusual swaying yellow grasses while contemplating what it must be like to be inescapably flooded by sensation and emotion.

That, he thought, is what Kai experiences. The more he investigated the idea of autism not as a deficit of memory, emotion and sensation, but an excess, the more he realized how much he himself had in common with his seemingly alien son.


HENRY MARKRAM IS TALL, with intense blue eyes, sandy hair and the air of unmistakable authority that goes with the job of running a large, ambitious, well-funded research project. It’s hard to see what he might have in common with a troubled, autistic child. He rises most days at 4 a.m. and works for a few hours in his family’s spacious apartment in Lausanne before heading to the institute, where the Human Brain Project is based. “He sleeps about four or five hours,” says Kamila. “That’s perfect for him.”

As a small child, Markram says, he “wanted to know everything.” But his first few years of high school were mostly spent “at the bottom of the F class.” A Latin teacher inspired him to pay more attention to his studies, and when a beloved uncle became profoundly depressed and died young—he was only in his 30s, but “just went downhill and gave up”—Markram turned a corner. He’d recently been given an assignment about brain chemistry, which got him thinking. “If chemicals and the structure of the brain can change and then I change, who am I? It’s a profound question. So I went to medical school and wanted to become a psychiatrist.”

Markram attended the University of Cape Town, but in his fourth year of medical school, he took a fellowship in Israel. “It was like heaven,” he says, “It was all the toys that I ever could dream of to investigate the brain.” He never returned to med school, and married his first wife, Anat, an Israeli, when he was 26. Soon, they had their first daughter, Linoy, now 24, then a second, Kali, now 23. Kai came four years afterwards.

During graduate research at the Weizmann Institute in Israel, Markram made his first important discovery, elucidating a key relationship between two neurotransmitters involved in learning, acetylcholine and glutamate. The work was important and impressive—especially so early in a scientist’s career—but it was what he did next that really made his name.

During a postdoc with Nobel laureate Bert Sakmann at Germany’s Max Planck Institute, Markram showed how brain cells that “fire together” actually “wire together.” That had been a basic tenet of neuroscience since the 1940s—but no one had been able to figure out how the process actually worked.

By studying the precise timing of electrical signaling between neurons, Markram demonstrated that firing in specific patterns increases the strength of the synapses linking cells, while missing the beat weakens them. This simple mechanism allows the brain to learn, forging connections both literally and figuratively between various experiences and sensations—and between cause and effect.
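
A standard way to write that timing rule down is the pair-based spike-timing-dependent plasticity (STDP) model: the synapse strengthens when the presynaptic cell fires just before the postsynaptic cell, and weakens when the order is reversed, with the size of the change falling off exponentially as the two spikes move apart in time. The sketch below is a generic textbook-style illustration of that rule, not a reconstruction of Markram’s experimental protocol; the time constants and learning rates are placeholder values.

```python
# Minimal pair-based STDP sketch: pre-before-post strengthens a synapse,
# post-before-pre weakens it, with an exponential dependence on the timing gap.
# Parameters are generic illustrative values, not fitted to any experiment.
import math

TAU_PLUS = 20.0    # ms, decay constant for potentiation
TAU_MINUS = 20.0   # ms, decay constant for depression
A_PLUS = 0.010     # maximum fractional weight increase per spike pair
A_MINUS = 0.012    # maximum fractional weight decrease per spike pair

def stdp_change(dt_ms):
    """Fractional weight change for one spike pair.
    dt_ms = t_post - t_pre, so positive means the presynaptic cell fired first."""
    if dt_ms > 0:      # pre leads post: "fire together, wire together"
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    if dt_ms < 0:      # post leads pre: "missing the beat" weakens the link
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)
    return 0.0

weight = 0.5
for dt in (+5.0, +10.0, -5.0, -10.0):              # timing differences in ms
    weight = min(1.0, max(0.0, weight + stdp_change(dt) * weight))
    print(f"dt = {dt:+5.1f} ms -> weight = {weight:.4f}")
```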

Measuring these fine temporal distinctions was also a technical triumph. Sakmann won his 1991 Nobel for developing the required “patch clamp” technique, which measures the tiny changes in electrical activity inside nerve cells. To patch just one neuron, you first harvest a sliver of brain, about 1/3 of a millimeter thick and containing around 6 million neurons, typically from a freshly guillotined rat.

To keep the tissue alive, you bubble it in oxygen, and bathe the slice of brain in a laboratory substitute for cerebrospinal fluid. Under a microscope, using a minuscule glass pipette, you carefully pierce a single cell. The technique is similar to injecting a sperm into an egg for in vitro fertilization—except that neurons are hundreds of times smaller than eggs.

It requires steady hands and exquisite attention to detail. Markram’s ultimate innovation was to build a machine that could study 12 such carefully prepared cells simultaneously, measuring their electrical and chemical interactions. Researchers who have done it say you can sometimes go a whole day without getting one right—but Markram became a master.

Still, there was a problem. He seemed to go from one career peak to another—a Fulbright at the National Institutes of Health, tenure at Weizmann, publication in the most prestigious journals—but at the same time it was becoming clear that something was not right in his youngest child’s head. He studied the brain all day, but couldn’t figure out how to help Kai learn and cope. As he told a New York Times reporter earlier this year, “You know how powerless you feel. You have this child with autism and you, even as a neuroscientist, really don’t know what to do.”


AT FIRST, MARKRAM THOUGHT Kai had attention deficit/hyperactivity disorder (ADHD): Once Kai could move, he never wanted to be still. “He was running around, very difficult to control,” Markram says. As Kai grew, however, he began melting down frequently, often for no apparent reason. “He became more particular, and he started to become less hyperactive but more behaviorally difficult,” Markram says. “Situations were very unpredictable. He would have tantrums. He would be very resistant to learning and to any kind of instruction.”

Preventing Kai from harming himself by running into the street or following other capricious impulses was a constant challenge. Even just trying to go to the movies became an ordeal: Kai would refuse to enter the cinema or hold his hands tightly over his ears.

However, Kai also loved to hug people, even strangers, which is one reason it took years to get a diagnosis. That warmth made many experts rule out autism. Only after multiple evaluations was Kai finally diagnosed with Asperger syndrome, a type of autism that includes social difficulties and repetitive behaviors, but not lack of speech or profound intellectual disability.

“We went all over the world and had him tested, and everybody had a different interpretation,” Markram says. As a scientist who prizes rigor, this infuriated him. He’d left medical school to pursue neuroscience because he disliked psychiatry’s vagueness. “I was very disappointed in how psychiatry operates,” he says.

Over time, trying to understand Kai became Markram’s obsession.

It drove what he calls his “impatience” to model the brain: He felt neuroscience was too piecemeal and could not progress without bringing more data together. “I wasn’t satisfied with understanding fragments of things in the brain; we have to understand everything,” he says. “Every molecule, every gene, every cell. You can’t leave anything out.”

This impatience also made him decide to study autism, beginning by reading every study and book he could get his hands on. At the time, in the 1990s, the condition was getting increased attention. The diagnosis had only been introduced into the psychiatric bible, then the DSM III, in 1980. The 1988 Dustin Hoffman film Rain Man, about an autistic savant, brought the idea that autism was both a disability and a source of quirky intelligence into the popular imagination.

The dark days of the mid–20th century, when autism was thought to be caused by unloving “refrigerator mothers” who icily rejected their infants, were long past. However, while experts now agree that the condition is neurological, its causes remain unknown.

The most prominent theory suggests that autism results from problems with the brain’s social regions, which results in a deficit of empathy. This “theory of mind” concept was developed by Uta Frith, Alan Leslie, and Simon Baron-Cohen in the 1980s. They found that autistic children are late to develop the ability to distinguish between what they know themselves and what others know—something that other children learn early on.

In a now famous experiment, children watched two puppets, “Sally” and “Anne.” Sally has a marble, which she places in a basket and then leaves. While she’s gone, Anne moves Sally’s marble into a box. By age four or five, normal children can predict that Sally will look for the marble in the basket first because she doesn’t know that Anne moved it. But until they are much older, most autistic children say that Sally will look in the box because they know it’s there. While typical children automatically adopt Sally’s point of view and know she was out of the room when Anne hid the marble, autistic children have much more difficulty thinking this way.

The researchers linked this “mind blindness”—a failure of perspective-taking—to their observation that autistic children don’t engage in make-believe. Instead of pretending together, autistic children focus on objects or systems—spinning tops, arranging blocks, memorizing symbols, or becoming obsessively involved with mechanical items like trains and computers.

This apparent social indifference was viewed as central to the condition. Unfortunately, the theory also seemed to imply that autistic people are uncaring because they don’t easily recognize that other people exist as intentional agents who can be loved, thwarted or hurt. But while the Sally-Anne experiment shows that autistic people have difficulty knowing that other people have different perspectives—what researchers call cognitive empathy or “theory of mind”—it doesn’t show that they don’t care when someone is hurt or feeling pain, whether emotional or physical. In terms of caring—technically called affective empathy—autistic people aren’t necessarily impaired.

Sadly, however, the two different kinds of empathy are combined in one English word. And so, since the 1980s, this idea that autistic people “lack empathy” has taken hold.

“When we looked at the autism field we couldn’t believe it,” Markram says. “Everybody was looking at it as if they have no empathy, no theory of mind. And actually Kai, as awkward as he was, saw through you. He had a much deeper understanding of what really was your intention.” And he wanted social contact.

 The obvious thought was: Maybe Kai’s not really autistic? But by the time Markram was fully up to speed in the literature, he was convinced that Kai had been correctly diagnosed. He’d learned enough to know that the rest of his son’s behavior was too classically autistic to be dismissed as a misdiagnosis, and there was no alternative condition that explained as much of his behavior and tendencies. And accounts by unquestionably autistic people, like bestselling memoirist and animal scientist Temple Grandin, raised similar challenges to the notion that autistic people could never really see beyond themselves.

Markram began to do autism work himself as visiting professor at the University of California, San Francisco in 1999. Colleague Michael Merzenich, a neuroscientist, proposed that autism is caused by an imbalance between inhibitory and excitatory neurons. A failure of inhibitions that tamp down impulsive actions might explain behavior like Kai’s sudden move to pat the cobra. Markram started his research there.


MARKRAM MET HIS second wife, Kamila Senderek, at a neuroscience conference in Austria in 2000. He was already separated from Anat. “It was love at first sight,” Kamila says.

Her parents left communist Poland for West Germany when she was five. When she met Markram, she was pursuing a master’s in neuroscience at the Max Planck Institute. When Markram moved to Lausanne to start the Human Brain Project, she began studying there as well.

Tall like her husband, with straight blonde hair and green eyes, Kamila wears a navy twinset and jeans when we meet in her open-plan office overlooking Lake Geneva. There, in addition to autism research, she runs the world’s fourth largest open-access scientific publishing firm, Frontiers, with a network of over 35,000 scientists serving as editors and reviewers. She laughs when I observe a lizard tattoo on her ankle, a remnant of an adolescent infatuation with The Doors.

When asked whether she had ever worried about marrying a man whose child had severe behavioral problems, she responds as though the question never occurred to her. “I knew about the challenges with Kai,” she says, “Back then, he was quite impulsive and very difficult to steer.”

The first time they spent a day together, Kai was seven or eight. “I probably had some blue marks and bites on my arms because he was really quite something. He would just go off and do something dangerous, so obviously you would have to get in rescue mode,” she says, noting that he’d sometimes walk directly into traffic. “It was difficult to manage the behavior,” she shrugs, “But if you were nice with him then he was usually nice with you as well.”

“Kamila was amazing with Kai,” says Markram, “She was much more systematic and could lay out clear rules. She helped him a lot. We never had that thing that you see in the movies where they don’t like their stepmom.”

At the Swiss Federal Institute of Technology in Lausanne (EPFL), the couple soon began collaborating on autism research. “Kamila and I spoke about it a lot,” Markram says, adding that they were both “frustrated” by the state of the science and at not being able to help more. Their now-shared parental interest fused with their scientific drives.

They started by studying the brain at the circuitry level. Markram assigned a graduate student, Tania Rinaldi Barkat, to look for the best animal model, since such research cannot be done on humans.

Barkat happened to drop by Kamila’s office while I was there, a decade after she had moved on to other research. She greeted her former colleagues enthusiastically.

She started her graduate work with the Markrams by searching the literature for prospective animal models. They agreed that the one most like human autism involved rats prenatally exposed to an epilepsy drug called valproic acid (VPA; brand name, Depakote). Like other “autistic” rats, VPA rats show aberrant social behavior and increased repetitive behaviors like excessive self-grooming.

But more significant is that when pregnant women take high doses of VPA, which is sometimes necessary for seizure control, studies have found that the risk of autism in their children increases sevenfold. One 2005 study found that close to 9 percent of these children have autism.

Because VPA has a link to human autism, it seemed plausible that its cellular effects in animals would be similar. A neuroscientist who has studied VPA rats once told me, “I see it not as a model, but as a recapitulation of the disease in other species.”

Barkat got to work. Earlier research showed that the timing and dose of exposure was critical: Different timing could produce opposite symptoms, and large doses sometimes caused physical deformities. The “best” time to cause autistic symptoms in rats is embryonic day 12, so that’s when Barkat dosed them.

At first, the work was exasperating. For two years, Barkat studied inhibitory neurons from the VPA rat cortex, using the same laborious patch-clamping technique perfected by Markram years earlier. If these cells were less active, that would confirm the imbalance that Merzenich had theorized.

She went through the repetitious preparation, making delicate patches to study inhibitory networks. But after two years of this technically demanding, sometimes tedious, and time-consuming work, Barkat had nothing to show for it.

“I just found no difference at all,” she told me, “It looked completely normal.” She continued to patch cell after cell, going through the exacting procedure endlessly—but still saw no abnormalities. At least she was becoming proficient at the technique, she told herself.

Markram was ready to give up, but Barkat demurred, saying she would like to shift her focus from inhibitory to excitatory VPA cell networks. It was there that she struck gold.

 “There was a difference in the excitability of the whole network,” she says, reliving her enthusiasm. The networked VPA cells responded nearly twice as strongly as normal—and they were hyper-connected. If a normal cell had connections to ten other cells, a VPA cell connected with twenty. Nor were they under-responsive. Instead, they were hyperactive, which isn’t necessarily a defect: A more responsive, better-connected network learns faster.

But what did this mean for autistic people? While Barkat was investigating the cortex, Kamila Markram had been observing the rats’ behavior, noting high levels of anxiety as compared to normal rats. “It was pretty much a gold mine then,” Markram says. The difference was striking. “You could basically see it with the eye. The VPAs were different and they behaved differently,” Markram says. They were quicker to get frightened, and faster at learning what to fear, but slower to discover that a once-threatening situation was now safe.

While ordinary rats get scared of an electrified grid where they are shocked when a particular tone sounds, VPA rats come to fear not just that tone, but the whole grid and everything connected with it—like colors, smells, and other clearly distinguishable beeps.

“The fear conditioning was really hugely amplified,” Markram says. “We then looked at the cell response in the amygdala and again they were hyper-reactive, so it made a beautiful story.”


THE MARKRAMS RECOGNIZED the significance of their results. Hyper-responsive sensory, memory and emotional systems might explain both autistic talents and autistic handicaps, they realized. After all, the problem with VPA rats isn’t that they can’t learn—it’s that they learn too quickly, with too much fear, and irreversibly.

They thought back to Kai’s experiences: how he used to cover his ears and resist going to the movies, hating the loud sounds; his limited diet and apparent terror of trying new foods.

“He remembers exactly where he sat at exactly what restaurant one time when he tried for hours to get himself to eat a salad,” Kamila says, recalling that she’d promised him something he’d really wanted if he did so. Still, he couldn’t make himself try even the smallest piece of lettuce. That was clearly overgeneralization of fear.

The Markrams reconsidered Kai’s meltdowns, too, wondering if they’d been prompted by overwhelming experiences. They saw that identifying Kai’s specific sensitivities preemptively might prevent tantrums by allowing him to leave upsetting situations or by mitigating his distress before it became intolerable. The idea of an intense world had immediate practical implications.

 The amygdala.

The VPA data also suggested that autism isn’t limited to a single brain network. In VPA rat brains, both the amygdala and the cortex had proved hyper-responsive to external stimuli. So maybe, the Markrams decided, autistic social difficulties aren’t caused by social-processing defects; perhaps they are the result of total information overload.


CONSIDER WHAT IT MIGHT FEEL like to be a baby in a world of relentless and unpredictable sensation. An overwhelmed infant might, not surprisingly, attempt to escape. Kamila compares it to being sleepless, jetlagged, and hung over, all at once. “If you don’t sleep for a night or two, everything hurts. The lights hurt. The noises hurt. You withdraw,” she says.

Unlike adults, however, babies can’t flee. All they can do is cry and rock, and, later, try to avoid touch, eye contact, and other powerful experiences. Autistic children might revel in patterns and predictability just to make sense of the chaos.

At the same time, if infants withdraw to try to cope, they will miss what’s known as a “sensitive period”—a developmental phase when the brain is particularly responsive to, and rapidly assimilates, certain kinds of external stimulation. That can cause lifelong problems.

Language learning is a classic example: If babies aren’t exposed to speech during their first three years, their verbal abilities can be permanently stunted. Historically, this created a spurious link between deafness and intellectual disability: Before deaf babies were taught sign language at a young age, they would often have lasting language deficits. Their problem wasn’t defective “language areas,” though—it was that they had been denied linguistic stimuli at a critical time. (Incidentally, the same phenomenon accounts for why learning a second language is easy for small children and hard for virtually everyone else.)

This has profound implications for autism. If autistic babies tune out when overwhelmed, their social and language difficulties may arise not from damaged brain regions, but because critical data is drowned out by noise or missed due to attempts to escape at a time when the brain actually needs this input.

The intense world could also account for the tragic similarities between autistic children and abused and neglected infants. Severely maltreated children often rock, avoid eye contact, and have social problems—just like autistic children. These parallels led to decades of blaming the parents of autistic children, including the infamous “refrigerator mother.” But if those behaviors are coping mechanisms, autistic people might engage in them not because of maltreatment, but because ordinary experience is overwhelming or even traumatic.

The Markrams teased out further implications: Social problems may not be a defining or even fixed feature of autism. Early intervention to reduce or moderate the intensity of an autistic child’s environment might allow their talents to be protected while their autism-related disabilities are mitigated or, possibly, avoided.

The VPA model also captures other paradoxical autistic traits. For example, while oversensitivities are most common, autistic people are also frequently under-reactive to pain. The same is true of VPA rats. In addition, one of the most consistent findings in autism is abnormal brain growth, particularly in the cortex. There, studies find an excess of circuits called mini-columns, which can be seen as the brain’s microprocessors. VPA rats also exhibit this excess.

Moreover, extra minicolumns have been found in autopsies of scientists who were not known to be autistic, suggesting that this brain organization can appear without social problems and alongside exceptional intelligence.

Like a high-performance engine, the autistic brain may only work properly under specific conditions. But under those conditions, such machines can vastly outperform others—like a Ferrari compared to a Ford.


THE MARKRAMS’ FIRST PUBLICATION of their intense world research appeared in 2007: a paper on the VPA rat in the Proceedings of the National Academy of Sciences. This was followed by an overview in Frontiers in Neuroscience. The next year, at the Society for Neuroscience (SFN), the field’s biggest meeting, a symposium was held on the topic. In 2010, they updated and expanded their ideas in a second Frontiers paper.

Since then, more than three dozen papers have been published by other groups on VPA rodents, replicating and extending the Markrams’ findings. At this year’s SFN, at least five new studies were presented on VPA autism models. The sensory aspects of autism have long been neglected, but the intense world theory and the VPA rats are bringing them to the fore.

Nevertheless, reaction from colleagues in the field has been cautious. One exception is Laurent Mottron, professor of psychiatry and head of autism research at the University of Montreal. He was the first to highlight perceptual differences as critical in autism—even before the Markrams. Only a minority of researchers even studied sensory issues before him. Almost everyone else focused on social problems.

But when Mottron first proposed that autism is linked with what he calls “enhanced perceptual functioning,” he, like most experts, viewed this as the consequence of a deficit. The idea was that the apparently superior perception exhibited by some autistic people is caused by problems with higher level brain functioning—and it had historically been dismissed as mere “splinter skills,” not a sign of genuine intelligence. Autistic savants had earlier been known as “idiot savants,” the implication being that, unlike “real” geniuses, they didn’t have any creative control of their exceptional minds. Mottron described it this way in a review paper: “[A]utistics were not displaying atypical perceptual strengths but a failure to form global or high level representations.”

 However, Mottron’s research led him to see this view as incorrect. His own and other studies showed superior performance by autistic people not only in “low level” sensory tasks, like better detection of musical pitch and greater ability to perceive certain visual information, but also in cognitive tasks like pattern finding in visual IQ tests.

In fact, it has long been clear that detecting and manipulating complex systems is an autistic strength—so much so that the autistic genius has become a Silicon Valley stereotype. In May, for example, the German software firm SAP announced plans to hire 650 autistic people because of their exceptional abilities. Mathematics, musical virtuosity, and scientific achievement all require understanding and playing with systems, patterns, and structure. Both autistic people and their family members are over-represented in these fields, which suggests genetic influences.

“Our points of view are in different areas [of research], but we arrive at ideas that are really consistent,” says Mottron of the Markrams and their intense world theory. (He also notes that while they study cell physiology, he images actual human brains.)

Because Henry Markram came from outside the field and has an autistic son, Mottron adds, “He could have an original point of view and not be influenced by all the clichés,” particularly those that saw talents as defects. “I’m very much in sympathy with what they do,” he says, although he is not convinced that they have proven all the details.

Mottron’s support is unsurprising, of course, because the intense world dovetails with his own findings. But even one of the creators of the “theory of mind” concept finds much of it plausible.

Simon Baron-Cohen, who directs the Autism Research Centre at Cambridge University, told me, “I am open to the idea that the social deficits in autism—like problems with the cognitive aspects of empathy, which is also known as ‘theory of mind’—may be upstream from a more basic sensory abnormality.” In other words, the Markrams’ physiological model could be the cause, and the social deficits he studies, the effect. He adds that the VPA rat is an “interesting” model. However, he also notes that most autism is not caused by VPA and that it’s possible that sensory and social defects co-occur, rather than one causing the other.

His collaborator, Uta Frith, professor of cognitive development at University College London, is not convinced. “It just doesn’t do it for me,” she says of the intense world theory. “I don’t want to say it’s rubbish,” she says, “but I think they try to explain too much.”


AMONG AFFECTED FAMILIES, by contrast, the response has often been rapturous. “There are elements of the intense world theory that better match up with autistic experience than most of the previously discussed theories,” says Ari Ne’eman, president of the Autistic Self Advocacy Network, “The fact that there’s more emphasis on sensory issues is very true to life.” Ne’eman and other autistic people fought to get sensory problems added to the diagnosis in DSM-5 — the first time the symptoms have been so recognized, and another sign of the growing receptiveness to theories like intense world.

Steve Silberman, who is writing a history of autism titled NeuroTribes: Thinking Smarter About People Who Think Differently, says, “We had 70 years of autism research [based] on the notion that autistic people have brain deficits. Instead, the intense world postulates that autistic people feel too much and sense too much. That’s valuable, because I think the deficit model did tremendous injury to autistic people and their families, and also misled science.”

Priscilla Gilman, the mother of an autistic child, is also enthusiastic. Her memoir, The Anti-Romantic Child, describes her son’s diagnostic odyssey. Before Benjamin was in preschool, Gilman took him to the Yale Child Study Center for a full evaluation. At the time, he did not display any classic signs of autism, but he did seem to be a candidate for hyperlexia—at age two-and-a-half, he could read aloud from his mother’s doctoral dissertation with perfect intonation and fluency. Like other autistic talents, hyperlexia is often dismissed as a “splinter” strength.

At that time, Yale experts ruled autism out, telling Gilman that Benjamin “is not a candidate because he is too ‘warm’ and too ‘related,’” she recalls. Kai Markram’s hugs had similarly been seen as disqualifying. At twelve years of age, however, Benjamin was officially diagnosed with Autism Spectrum Disorder.

According to the intense world perspective, however, warmth isn’t incompatible with autism. What looks like antisocial behavior results from being too affected by others’ emotions—the opposite of indifference.

Indeed, research on typical children and adults finds that too much distress can dampen ordinary empathy as well. When someone else’s pain becomes too unbearable to witness, even typical people withdraw and try to soothe themselves first rather than helping—exactly like autistic people. It’s just that autistic people become distressed more easily, and so their reactions appear atypical.

“The overwhelmingness of understanding how people feel can lead to either what is perceived as inappropriate emotional response, or to what is perceived as shutting down, which people see as lack of empathy,” says Emily Willingham. Willingham is a biologist and the mother of an autistic child; she also suspects that she herself has Asperger syndrome. But rather than being unemotional, she says, autistic people are “taking it all in like a tsunami of emotion that they feel on behalf of others. Going internal is protective.”

At least one study supports this idea, showing that while autistic people score lower on cognitive tests of perspective-taking—recall Anne, Sally, and the missing marble—they are more affected than typical folks by other people’s feelings. “I have three children, and my autistic child is my most empathetic,” Priscilla Gilman says, adding that when her mother first read about the intense world, she said, “This explains Benjamin.”

Benjamin’s hypersensitivities are also clearly linked to his superior perception. “He’ll sometimes say, ‘Mommy, you’re speaking in the key of D, could you please speak in the key of C? It’s easier for me to understand you and pay attention.’”

Because he has musical training and a high IQ, Benjamin can use his own sense of “absolute pitch”—the ability to name a note without hearing another for comparison—to define the problem he’s having. But many autistic people can’t verbalize their needs like this. Kai, too, is highly sensitive to vocal intonation, preferring his favorite teacher because, he explains, she “speaks soft,” even when she’s displeased. But even at 19, he isn’t able to articulate the specifics any better than that.


ON A RECENT VISIT to Lausanne, Kai wears a sky blue hoodie, his gray Chuck Taylor–style sneakers carefully unlaced at the top. “My rapper sneakers,” he says, smiling. He speaks Hebrew and English and lives with his mother in Israel, attending a school for people with learning disabilities near Rehovot. His manner is unselfconscious, though sometimes he scowls abruptly without explanation. But when he speaks, it is obvious that he wants to connect, even when he can’t answer a question. Asked if he thinks he sees things differently than others do, he says, “I feel them different.”

He waits in the Markrams’ living room as they prepare to take him out for dinner. Henry’s aunt and uncle are here, too. They’ve been living with the family to help care for its newest additions: nine-month-old Charlotte and Olivia, who is one-and-a-half years old.

“It’s our big patchwork family,” says Kamila, noting that when they visit Israel, they typically stay with Henry’s ex-wife’s family, and that she stays with them in Lausanne. They all travel constantly, which has created a few problems now and then. None of them will ever forget a tantrum Kai had when he was younger, which got him barred from a KLM flight. A delay upset him so much that he kicked, screamed, and spat.

Now, however, he rarely melts down. A combination of family and school support, an antipsychotic medication that he’s been taking recently, and increased understanding of his sensitivities has mitigated the disabilities Kai associated with his autism.

 “I was a bad boy. I always was hitting and doing a lot of trouble,” Kai says of his past. “I was really bad because I didn’t know what to do. But I grew up.” His relatives nod in agreement. Kai has made tremendous strides, though his parents still think that his brain has far greater capacity than is evident in his speech and schoolwork.

As the Markrams see it, if autism results from a hyper-responsive brain, the most sensitive brains are actually the most likely to be disabled by our intense world. But if autistic people can learn to filter the blizzard of data, especially early in life, then those most vulnerable to the most severe autism might prove to be the most gifted of all.

Markram sees this in Kai. “It’s not a mental retardation,” he says, “He’s handicapped, absolutely, but something is going crazy in his brain. It’s a hyper disorder. It’s like he’s got an amplification of many of my quirks.”

One of these involves an insistence on timeliness. “If I say that something has to happen,” he says, “I can become quite difficult. It has to happen at that time.”

He adds, “For me it’s an asset, because it means that I deliver. If I say I’ll do something, I do it.” For Kai, however, anticipation and planning run wild. When he travels, he obsesses about every move, over and over, long in advance. “He will sit there and plan, okay, when he’s going to get up. He will execute. You know he will get on that plane come hell or high water,” Markram says. “But he actually loses the entire day. It’s like an extreme version of my quirks, where for me they are an asset and for him they become a handicap.”

If this is true, autistic people have incredible unrealized potential. If Kai's brain is even more finely tuned than his father's, it might give him the capacity to be even more brilliant. Consider Markram's visual skills. Like Temple Grandin, whose first autism memoir was titled Thinking in Pictures, he has stunning visual abilities. "I see what I think," he says, adding that when he considers a scientific or mathematical problem, "I can see how things are supposed to look. If it's not there, I can actually simulate it forward in time."

At the offices of Markram’s Human Brain Project, visitors are given a taste of what it might feel like to inhabit such a mind. In a small screening room furnished with sapphire-colored, tulip-shaped chairs, I’m handed 3-D glasses. The instant the lights dim, I’m zooming through a brightly colored forest of neurons so detailed and thick that they appear to be velvety, inviting to the touch.

The simulation feels so real and enveloping that it is hard to pay attention to the narration, which includes mind-blowing facts about the project. But it is also dizzying, overwhelming. If this is just a smidgen of what ordinary life is like for Kai, it's easier to see how hard his early life must have been. That's the paradox about autism and empathy. The problem may not be that autistic people can't understand typical people's points of view, but that typical people can't imagine autism.

Critics of the intense world theory are dismayed and put off by this idea of hidden talent in the most severely disabled. They see it as wishful thinking, offering false hope to parents who want to see their children in the best light and to autistic people who want to fight the stigma of autism. In some types of autism, they say, intellectual disability is just that.

“The maxim is, ‘If you’ve seen one person with autism, you’ve seen one person with autism,’” says Matthew Belmonte, an autism researcher affiliated with the Groden Center in Rhode Island. The assumption should be that autistic people have intelligence that may not be easily testable, he says, but it can still be highly variable.

He adds, “Biologically, autism is not a unitary condition. Asking at the biological level ‘What causes autism?’ makes about as much sense as asking a mechanic ‘Why does my car not start?’ There are many possible reasons.” Belmonte believes that the intense world may account for some forms of autism, but not others.

Kamila, however, insists that the data suggests that the most disabled are also the most gifted. “If you look from the physiological or connectivity point of view, those brains are the most amplified.”

The question, then, is how to unleash that potential.

“I hope we give hope to others,” she says, while acknowledging that intense-world adherents don’t yet know how or even if the right early intervention can reduce disability.

The secret-ability idea also worries autistic leaders like Ne’eman, who fear that it contains the seeds of a different stigma. “We agree that autistic people do have a number of cognitive advantages and it’s valuable to do research on that,” he says. But, he stresses, “People have worth regardless of whether they have special abilities. If society accepts us only because we can do cool things every so often, we’re not exactly accepted.”


THE MARKRAMS ARE NOW EXPLORING whether providing a calm, predictable early environment, one aimed at reducing overload and surprise, can help VPA rats, soothing social difficulties while nurturing enhanced learning. New research suggests that autism can be detected in two-month-old babies, so the treatment implications are tantalizing.

So far, Kamila says, the data looks promising. Unexpected novelty seems to make the rats worse—while the patterned, repetitive, and safe introduction of new material seems to cause improvement.

In humans, the idea would be to keep the brain’s circuitry calm when it is most vulnerable, during those critical periods in infancy and toddlerhood. “With this intensity, the circuits are going to lock down and become rigid,” says Markram. “You want to avoid that, because to undo it is very difficult.”

For autistic children, intervening early might mean improvements in learning language and socializing. While it’s already clear that early interventions can reduce autistic disability, they typically don’t integrate intense-world insights. The behavioral approach that is most popular—Applied Behavior Analysis—rewards compliance with “normal” behavior, rather than seeking to understand what drives autistic actions and attacking the disabilities at their inception.

Research shows, in fact, that everyone learns best when receiving just the right dose of challenge—not so little that they’re bored, not so much that they’re overwhelmed; not in the comfort zone, and not in the panic zone, either. That sweet spot may be different in autism. But according to the Markrams, it is different in degree, not kind.

Markram suggests providing a gentle, predictable environment. “It’s almost like the fourth trimester,” he says.

"To prevent the circuits from becoming locked into fearful states or behavioral patterns you need a filtered environment from as early as possible," Markram explains. "I think that if you can avoid that, then those circuits would get locked into having the flexibility that comes with security."

Creating this special cocoon could involve using things like headphones to block excess noise, gradually increasing exposure and, as much as possible, sticking with routines and avoiding surprise. If parents and educators get it right, he concludes, “I think they’ll be geniuses.”

IN SCIENCE, CONFIRMATION BIAS is always the unseen enemy. Having a dog in the fight means you may bend the rules to favor it, whether deliberately or simply because we’re wired to ignore inconvenient truths. In fact, the entire scientific method can be seen as a series of attempts to drive out bias: The double-blind controlled trial exists because both patients and doctors tend to see what they want to see—improvement.

At the same time, the best scientists are driven by passions that cannot be anything but deeply personal. The Markrams are open about the fact that their subjective experience with Kai influences their work.

But that doesn’t mean that they disregard the scientific process. The couple could easily deal with many of the intense world critiques by simply arguing that their theory only applies to some cases of autism. That would make it much more difficult to disprove. But that’s not the route they’ve chosen to take. In their 2010 paper, they list a series of possible findings that would invalidate the intense world, including discovering human cases where the relevant brain circuits are not hyper-reactive, or discovering that such excessive responsiveness doesn’t lead to deficiencies in memory, perception, or emotion. So far, however, the known data has been supportive.

But whether or not the intense world accounts for all or even most cases of autism, the theory already presents a major challenge to the idea that the condition is primarily a lack of empathy, or a social disorder. Intense world theory confronts the stigmatizing stereotypes that have framed autistic strengths as defects, or at least as less significant because of associated weaknesses.

And Henry Markram, by trying to take his son Kai’s perspective—and even by identifying so closely with it—has already done autistic people a great service, demonstrating the kind of compassion that people on the spectrum are supposed to lack. If the intense world does prove correct, we’ll all have to think about autism, and even about typical people’s reactions to the data overload endemic in modern life, very differently.

From left: Kamila, Henry, Kai, and Anat


This story was written by Maia Szalavitz, edited by Mark Horowitz, fact-checked by Kyla Jones, and copy-edited by Tim Heffernan, with photography by Darrin Vanselow and an audiobook narrated by Jack Stewart.



Decolonizing Anthropology (Savage Minds)

April 19, 2016

Decolonizing Anthropology is a new series on Savage Minds edited by Carole McGranahan and Uzma Z. Rizvi. Welcome.

Just about 25 years ago Faye Harrison poignantly asked if “an authentic anthropology can emerge from the critical intellectual traditions and counter-hegemonic struggles of Third World peoples? Can a genuine study of humankind arise from dialogues, debates, and reconciliation amongst various non-Western and Western intellectuals — both those with formal credentials and those with other socially meaningful and appreciated qualifications?” (1991:1). In launching this series, we acknowledge the key role that Black anthropologists have played in thinking through how and why to decolonize anthropology, from the 1987 Association of Black Anthropologists’ roundtable at the AAAs that preceded the 1991 volume on Decolonizing Anthropology edited by Faye Harrison, to the World Anthropologies Network, to Jafari Sinclaire Allen and Ryan Cecil Jobson’s essay out this very month in Current Anthropology on “The Decolonizing Generation: (Race and) Theory in Anthropology since the Eighties.”

These questions continue to haunt anthropology and all those striving to bring some resolution to these issues. It has become increasingly important to also recognize the ways in which those questions have changed, and how the separation between Western and non-Western is less a matter of locality and geography than an epistemic question related to the colonial histories of anthropology. Decolonization, then, has multiple facets to its approach: it is philosophical, methodological, and praxis-oriented, particularly within the fields of anthropology. Here at Savage Minds, we have decided to take these questions on again before a different public, and to work through a series of dialogues, debates, and possibly even reconciliation.

We feel it imperative to decolonize anthropology; not doing so reiterates hierarchies of control and oppressive systems of knowledge production. But what does that really mean, and what does it look like? What might it mean to decolonize anthropology? Various subfields of anthropology have been contending with this issue in different ways. Within the archaeological literature, for example, decolonization emerged as a political necessity, developed through engagement with the postcolonial critique. Inspiration from Linda Tuhiwai Smith's influential work on decolonizing methodologies (among others) led to the development of indigenous archaeology. Most archaeologists would argue that anthropological archaeology continues to exist within neocolonial, neoliberal, and late-capitalist frameworks, and thus that these critiques and methodologies need to be constantly revised through interdisciplinary projects that locate decolonization across academia (including decolonizing epistemologies, aesthetics, pedagogy, etc.).

Calls for decolonization have now entered mainstream politics in the academy, in an era when academics across disciplines are calling for historical, financial, and intellectual accountability not only for the work we do, but also for the academic institutions in which we study, teach, and learn. We contend, therefore, that decolonizing anthropology (at a minimum) has now grown into a project beyond its initial impetus of treating non-anthropologist intellectuals as just that: intellectuals rather than local interlocutors. In its development across the discipline, in both archaeology and cultural anthropology, for example, decolonizing anthropology is a project about rethinking epistemology, methodology, community, and political commitments.

Epistemology. Decolonizing anthropology means rethinking epistemology. Anthropologists have long acknowledged that our field developed with a colonial impulse, and that the construction of knowledge reiterates systems of control. It is important to continue working through epistemic concerns, to consider how our discipline might undiscipline itself and realign how it evaluates what research is considered important. Decolonizing epistemology destabilizes the canon. It is not enough simply to add certain voices to our anthro-core classes; a decolonizing movement focused on epistemology gives rigor to the multiplicity and plurality of voices. Deeply linked to the ways in which knowledge is produced and constructed are our pedagogy and the methodologies by which we practice.

Pedagogy. If we are to realign our discipline, it becomes imperative to reconsider how decolonization might shape our pedagogy. This is not a new concept in the academy: decolonizing pedagogy is a subfield within the field of education. As mediators/translators/facilitators of knowledge, it is our responsibility to consider how anthropological conversations about race and difference might be supported and developed in the classroom through decolonized pedagogical practices. A decolonized pedagogy should be listed among the best practices in our guides to teaching and learning. Pedagogy also includes what one teaches as well as how. What forms the anthropological "canon" of works that one must know? Part of decolonizing the discipline is reassessing whose scholarship we mark as important via inclusion on course syllabi. The rediscovery of Zora Neale Hurston's scholarship by anthropologists is the most obvious example; who else are we, or should we, be learning from and thinking with anew today?

Methodology. Decolonizing anthropology means rethinking methodology. Our history is full of taking information from communities without enough consideration of the impact of this form of anthropological research. This means not only filling out our IRB forms, but also thinking carefully about power. Institutionally, our bodies are disciplined to hold and claim certain statuses as anthropologists. How does tending to such manifestations of power redirect our relationships in the field, our research questions, the ways we teach, and the ways we work with communities?

Community. Decolonizing anthropology means rethinking community. Rethinking who the communities are within which we do our research. Rethinking the way we stretch and build our community of conversation to open beyond the academy, and learning how to extend our deep anthropological practice of listening with our ears and with our hands, and cultivating a spirit of reciprocity for a new era. And at the heart of today’s decolonial project, rethinking who our community of anthropologists is, and rethinking strategies of recruitment and retention for an anthropology that reflects and includes the communities whose stories, beliefs, and practices have long been those which comprised our discipline.

Political Commitment. Decolonizing anthropology means rethinking our political commitments. It also means acknowledging that we are not the first to have them. Anthropology has long been a discipline with a political edge to its scholarship, at least for some of its practitioners. However, as decades turn into centuries, what was once politically edgy comes to look embarrassingly otherwise: conventional, or racist, or both. We believe that a decolonized anthropology involves research that advances our understanding of the human world in a way that moves us forward.

All of this involves communication. As editors, our goals for this series are both personal and professional. Our first collaboration was an India Review special issue on Public Anthropology (2006), edited by Carole McGranahan, with Uzma Z. Rizvi as a contributor to the issue. Carole recently revisited her introduction to that volume in a keynote lecture at American University's annual Public Anthropology conference in 2014. In a talk on "Tibet, Ferguson, Gaza: On Political Crisis and Anthropological Responsibility," she reflected on political changes in the discipline over the last decade, including our need to address not only anthropology's colonial past, but also our imperial present. This is the sort of thinking we began together in 2006. Uzma's article, "Accounting for Multiple Desires: Decolonizing Methodologies, Archaeology and the Public Interest," was based on her PhD research (2000-2003) in Rajasthan, India. The project was designed as a community-based participatory action research project explicitly linked to decolonizing archaeology in India. Both of us have had a long-standing engagement with this literature and consider this contemporary moment to be significant within the praxis of our discipline, which is why we are thrilled to launch this series!

We have invited anthropologists writing and thinking about decolonizing the discipline to contribute essays to this series. Essays will be posted roughly every two weeks, and if any readers would like to submit an essay for consideration, please send us an email at decolonizinganthropology[at]gmail.com.

Our series schedule of contributors is as follows:

April 25–Faye Harrison, in conversation with Carole McGranahan, Kaifa Roland, and Bianca Williams

May 9–Melissa Rosario

May 23–Zodwa Radebe

June 6–Lisa Uperesa

June 20–Public Anthropology Institute (Gina Athena Ulysse, with Faye Harrison, Carole McGranahan, Melissa Rosario, Paul Stoller, and Maria Vesperi)

July 4–Krysta Ryzewski

August 1–Asmeret Mehari

August 8–Nokuthula Hlabangane

August 15–Zoe Todd

August 29–Didier Sylvain and Les Sabiston

September 12–Claudia Serrato

September 26–Gina Athena Ulysse

October 10–Paige West

November 7–Kristina Lyons

November 14–Marisol de la Cadena

Debate with Eduardo Viveiros de Castro on the film "O Abraço da Serpente" (Almanaque Virtual)

Nominated for a 2016 Oscar and distributed by Esfera Filmes, O Abraço da Serpente drew a full house to the Cinema Estação Botafogo for a debate on the film

by

February 28, 2016

The renowned anthropologist and deep connoisseur of Amerindian ethnology, Eduardo Viveiros de Castro, was invited to a debate about the new film distributed by Esfera Filmes, a 2016 Oscar nominee for best foreign-language film: Colombia's multi-award-winning entry "O Abraço da Serpente" (Embrace of the Serpent), directed by Ciro Guerra. Almanaque Virtual presents a written account of this remarkable debate between Viveiros de Castro and the audience that packed the Cinema Estação Botafogo in Rio de Janeiro on Saturday morning, February 27.


It is worth noting that the film's scientist characters are inspired by real figures of great importance to anthropology, in the two periods in which the film is set: the German Theodor Koch-Grünberg, who studied the indigenous peoples of the Colombian Amazon and the Upper Rio Negro between 1903 and 1905; and the American Richard Evans Schultes, a renowned ethnobotanist who lived in indigenous lands between 1941 and 1952.

Although largely faithful to reality, there are differences, such as those pointed out by Viveiros de Castro: "The character based on Theodor may seem, in the film, to have vanished at the end of his exploratory journey, but he lived another 20 years in Roraima; his studies there resulted in 'De Roraima a Orinoco', a five-volume collection, and he only died in 1924."


The second character, after the film's jump from 1903 to 1941, was the most famous ethnobotanist of the field, who conducted major studies on peyote and other plants with hallucinogenic effects. During this period, with the plantations of Malaysia and Ceylon cut off, Brazil was heavily exploited at the height of the rubber boom, when the target was the rubber trees of South America. Many Indians were decimated on account of the rubber wars, especially in the western Amazon, between Brazil, Peru, and Colombia, so much so that in the film's scenes alluding to this period, the messiah character speaks Portuguese.

In fact, the character alluding to Schultes, the ethnobotanist, was meant to express a dual mission: to search for the finest and best rubber trees (because the rubber that previously went to Europe was under Japanese control during the Second World War, destined for weapons manufacturing), and also to look for the plants referred to in the film as Yakruna, which in reality likely correspond to plants such as the Chacrona, one of the ingredients used to prepare Santo Daime (a drink consumed in religious rituals and transcendental experiences). Likewise, the fictional Yakruna (Chacrona) would not look like the beautiful leafy tree shown in the film, nor like the lovely white flower that blossoms from it. These are lyrical licenses taken by the film's director.


The language spoken in the film by the natives was probably from the Tucano family, represented by the character of the older Indian, Karamakate, who was possibly of Huitoto descent and would represent a pajé/shaman who had isolated himself from the group that "surrendered" to the whites, as shown toward the end of the film, when imprisoned or intoxicated Indians appear in a kind of concentration camp or military/religious encampment run by the explorers and exporters of native riches.

The film highlights Capuchin missions such as Santa Maria do Vaupés, on the banks of a river of the same name that rises in Colombia and joins the Rio Negro in Brazil. These were missions created by priests along this river, which was crucial in the formation of the region's original tribes; the film portrays quite faithfully the mission schools where Indians were educated under torture from the 1920s to the 1960s, until the government withdrew the support it gave to the Catholic missions, as marked by the notorious visit of President Juscelino Kubitschek, who went so far as to say that exploiting those lands would matter as much for Brazil as the construction of Brasília. There were also separate boarding schools for boys and girls, "orphans of the rubber wars," intended to turn them not only into "good Christians" but also into good "employees."


On another point, as Eduardo Viveiros de Castro elaborates, the old Karamakate dressed in classic native garb is another improbable fiction for the 1960 period. Living completely alone is part of the fiction, because Amerindian logic is completely different from Western logic, and it would therefore be very difficult for someone in isolation to keep up traditional customs, preserving artifacts such as his maloca (house) as they originally would have been.

The second explorer character, in the figure of Schultes, was one of the last expeditionary representatives. The first, Theodor, was the one who brought back the myth of Macunaíma, later popularized in literature through the work of Mário de Andrade and then adapted for Brazilian cinema (played by Grande Othelo).

Another important point was the narrative choice to dramatize the epistemological difference between knowledge through facts and knowledge through dreams, in addressing the differences between 'white' and indigenous cultures, respectively. Amerindian perception does not proceed from the human looking outward, but from the outside in, through plants, trees, and mountains in total harmony with nature, in opposition to the classificatory and taxonomic knowledge of the West.


Metaphorically, all the film's characters are quite isolated, the anthropologists as much as the Indians who isolate themselves from the group by nature. Hence the beautiful 40-year passage that occurs in the film's dual temporal setting (1903 as against 1941): the older Karamakate says he no longer knows anything, not even how to perform his most traditional rituals, yet 'at the same time he knew,' symbolizing that the knowledge was being lost but had to be earned, honored, in order to be passed on. The last Yakruna is a fiction of the film meant to show precisely this.

Viveiros de Castro notes that anthropologists generally complain that films about indigenous cultures "over-explain the subject." Few films in this vein are truly good. And there are the more playful/poetic/allusive ones like "O Abraço da Serpente," which, for the professor, does not dwell on conceptual explanations; these appear only rarely in a few dialogues, placed there to make things a little easier for the general audience, as in the scene alluding to the map.


Only at the end of the screening is the audience told that the characters were inspired by real figures. Schultes, for example, was a friend of Albert Hofmann, the chemist who synthesized LSD, knowledge obtained in part from Schultes's studies of the Chacrona, the film's 'Yakruna.' The contribution of this research to the pharmacological advances of history is undeniable: Schultes was the one who brought the active principles of peyote, Chacrona, and other plants to modern pharmacology. It is a beautiful message about needing indigenous knowledge in order to advance.

It is worth noting that, for Viveiros de Castro, director Ciro Guerra's choice of black-and-white photography was an excellent solution, with color effects, such as those produced by the use of hallucinogens, appearing only at the end.

Viveiros de Castro points out that scenes meant to take place in the same location were shot in quite different regions, as the plants and mountains in some shots reveal. Only the more attentive eye of someone who has studied the region's geography will notice such nuances, and this does not detract from the film's final result. He jokes that, in the compass scene, he was left wondering how the foreign explorer character would have explained the Earth's magnetic axis in the Indians' language, since this kind of knowledge is not even part of those tribes' conceptual universe.

Of course the film took liberties in telling its story, such as the Yakruna flower, which at most might displease botanists who object to the poetic license taken with the "true Yakruna" and the way the film depicts it. Even so, the film makes good use of a real contrast between 'white' and indigenous knowledge. Speaking about this more mystical side of apprehending ethnobotanical reality was not necessarily the focus of the scientists' writings; rather, it was adopted by director Ciro Guerra to build a drama around a way of knowing the world different from our own. A good cultural example along these lines that Eduardo recommends is the book "A queda do céu" (The Falling Sky), a Yanomami book by the shaman Davi Kopenawa and Bruce Albert.


Viveiros de Castro also says there are differences between the film's approach and the actual studies. Take the title "O Abraço da Serpente" and the real meaning of the anaconda in indigenous culture, which the script does not explore much. For these peoples the animal matters because the genesis of life comes from an anaconda, as if the tribes had been left from its belly along the course of the rivers, generally at the points where there are waterfalls, a very important region for the peoples of the Vaupés River, where the Indians took shape after emerging from the snake (which can also be understood as representing a canoe on the river). And, like the jaguar for them, serpents are multicolored animals. They would carry a double significance: first as the origin of man, recalling that the anaconda is an aquatic snake; second as a creature that is multicolored and patterned, an archetype of indigenous art and design, also associated with the rainbow, the snake as color, graphic pattern, and chromatism.

On the image of emptiness in the film's use of the expression "Chullachaqui," an empty copy of ourselves, he says it is also very well rendered in the photography. Yes, this symbol exists in other indigenous cultures, but Eduardo had never seen it used in this way, and it is interesting how the filmmakers deployed it. Usually the 'Chullachaqui' appears as a person after death, another perpetual self in the memory of those left behind. But in the film Karamakate calls the foreigner a double/Chullachaqui because he has two interests: not only knowledge, but using it for war. Karamakate calls himself a 'Chullachaqui' because he has lost his memory; he is a double of himself. What often happens in the reality depicted in the film is an excess of memory of the past, which brings the dead back. But whether this in-life form of the Chullachaqui exists among one of these peoples, or is more poetic license, is a question Eduardo himself leaves open.

At another moment, the use of the "last Yakruna plant" is a tragic vision, as if to say that indigenous knowledge would only survive if passed on to the 'white man' (which is why Karamakate teaches its proper, undistorted use). For the professor, read 'white man' here as the anthropologist. But the Tucano people, for example, did not go extinct; they are still numerous. This tragic vision relates more to the character of Manduca, the Indian seen as having "sold out to the whites." Depending on the region there are Indians who are more "Manducas" and others who are more "Karamakates" (depending on the degree to which the culture has been preserved).

Ironically, speaking of being 'sold to the whites,' Viveiros de Castro recounts that many indigenous ornaments were stolen by missionaries on the grounds that they were "things of the devil." The Indians kept what was sacred in boxes, and the missionaries took these away and put them on display in museums (But didn't they say these were things of the devil? Then why keep them for exhibition? The Indians want them back). One example of this recovery is Vincent Carelli's "Iauaretê, Cachoeira das Onças," a film that addresses this subject.

Asked whether today, amid land conflicts, a high-visibility, Oscar-nominated film like this one can generate more debate, Eduardo is optimistic, recalling that good films along these lines already exist and may now gain wider attention, such as "A bicicleta de Nhanderú," a Mbyá-Guarani film by Ariel Ortega and Patrícia Ferreira from 2011.

Returning to indigenous symbolism, the jaguar is also very important, even though it appears only briefly in the film. Among the Amerindian tribes of the Amazon region there is the figure of the jaguar-shaman, who is the one who negotiates with the "Guardians" of natural resources. There is also the important question of food prohibitions, of relations with the extra-human world and restrictions tied to certain periods of the year, which appears only superficially in the film. The anaconda is especially important in the northern Amazon. The 'Boa' mentioned in the script can be read as either the boa constrictor or the anaconda, which belong to the same family of snakes: both non-venomous constrictors that kill by strangulation, the first terrestrial and the second aquatic. And the jaguar, throughout the mythology, would be the great antagonist and competitor of humans, a model of strength and agility and at the same time dangerous, capable of taking possession of the human spirit.

To close, Viveiros de Castro ends his talk by discussing the notion of time: the indigenous character in the film uses words like 'thousands' and a 'million' years, but 'million' does not exist in that language; it is the film's poetic vision. He also cites several other poetic forms of time in the perception of Amerindian cultures, such as the time of the dawn, of genesis, before the different human groups emerged from the snake, before they separated. Humans were fish; some became human, others did not, which is why the peoples of the Rio Negro were above all fishers. This would be a pre-cosmological time, when animals and humans spoke with one another and all were the same being. There is also cyclical time, in which grandchildren receive their grandparents' names, for there is only a conception of two generations that are reconstituted each time: the third is the same as the first and the fourth the same as the second, two-dimensional, and so on.

Transcription by Filippo Pitanga

Editing by Samantha Brasil


50 years of calamities in South America (Pesquisa Fapesp)

Earthquakes and volcanoes kill more people, but droughts and floods affect far greater numbers

MARCOS PIVETTA | ISSUE 241 | MARCH 2016

A study of the impacts of 863 natural disasters recorded over the last five decades in South America indicates that relatively rare geological phenomena, such as earthquakes and volcanism, caused nearly twice as many deaths as the more frequent climatic and meteorological events, such as floods, landslides, storms, and droughts. Of the roughly 180,000 deaths resulting from these disasters, 60% were due to earthquakes and volcanic activity, a type of occurrence concentrated in the Andean countries, such as Peru, Chile, Ecuador, and Colombia. Earthquakes and volcanism accounted for, respectively, 11% and 3% of the events counted in the study.

Approximately 32% of the deaths stemmed from meteorological or climatic events, a category that covers four out of every five natural disasters recorded in the region between 1960 and 2009. Epidemics, a type of biological disaster for which regional data are scarce according to the survey, claimed 15,000 lives, 8% of the total. In Brazil, 10,225 people died over those five decades as a result of natural disasters, just over 5% of the total, most of them in floods and landslides during storms.

Drought in the Northeast...

The study was conducted by geographer Lucí Hidalgo Nunes, a professor at the Institute of Geosciences of the University of Campinas (IG-Unicamp), for her habilitation (livre-docência) thesis, and resulted in the book Urbanização e desastres naturais – Abrangência América do Sul (Oficina de Textos), released in the middle of last year. "Since the 1960s, South America's urban population has been larger than its rural population," says Lucí. "The main stage for natural calamities has been urban space, which keeps growing both in the area occupied by cities and in the number of inhabitants."

The picture reverses when the parameter analyzed is the number of people affected by each type of disaster rather than the number of deaths. Of the 138 million non-fatal victims of these events, 1% were affected by epidemics, 11% by earthquakes and volcanism, and 88% by climatic or meteorological phenomena. Droughts and floods were the occurrences that affected the most people: the great droughts reached 57 million people (41% of all those affected) and the floods 52.5 million (38%). Brazil accounted for about 85% of the non-fatal victims of drought, essentially residents of the Northeast, and for one third of those affected by floods, mainly inhabitants of the large cities of the South and Southeast.

...flooding in Caracas, Venezuela: these two types of disaster affect the largest numbers of people

Estimated at US$44 billion over the five decades, the material losses associated with the nearly 900 disasters counted were, in 80% of cases, the result of climatic or meteorological phenomena. "Brazil has almost 50% of the territory and more than half the population of South America, yet it was the stage for only 20% of the disasters, 5% of the deaths, and 30% of the economic losses associated with these events," says Lucí. "The number of people affected here, however, was high: 53% of everyone affected by disasters in South America. We still have vulnerabilities, but not as many as countries like Peru, Colombia, and Ecuador."

To produce the study, the geographer compiled, organized, and analyzed the records of natural disasters over the last five decades in the countries of South America, plus French Guiana (an overseas department of France), which are stored in Em-Dat, the International Disaster Database. This database gathers information on more than 21,000 natural disasters that have occurred worldwide from 1900 to the present. It is maintained by the Centre for Research on the Epidemiology of Disasters (CRED), based at the School of Public Health of the Catholic University of Louvain in Brussels, Belgium. "No database is perfect," Lucí cautions. "Em-Dat is weak, for example, in its records of biological disasters." Its advantage is that it brings together information from different sources (non-governmental agencies, United Nations bodies, insurance companies, research institutes, and the media) and archives it using a single methodology, an approach that makes comparative studies possible.

What characterizes a disaster
Events recorded in Em-Dat as natural disasters must meet at least one of four conditions: cause the death of at least 10 people; affect 100 or more individuals; prompt the declaration of a state of emergency; or be the reason for a request for international assistance. In the study of South America, Lucí organized the disasters into three broad categories, subdivided into 10 types of occurrence. Geophysical phenomena include earthquakes, volcanic eruptions, and dry mass movements (such as a rock rolling down a hillside on a rainless day). Meteorological or climatic events encompass storms, floods, landslides, temperature extremes (unusual heat or cold), droughts, and wildfires. Epidemics are the only type of biological disaster counted (see chart).

Climatologist José Marengo, head of the research division of the National Center for Monitoring and Early Warning of Natural Disasters (Cemaden) in Cachoeira Paulista, in upstate São Paulo, notes that, in addition to natural events, there are disasters considered technological, as well as hybrid cases. The collapse last November of a tailings dam owned by the mining company Samarco in Mariana (MG), which killed 19 people and released tons of toxic mud into the Rio Doce watershed, had no connection to natural events. It can be classified as a technological disaster, in which human action is tied to the causes of the occurrence. In 2011, the 9.0-magnitude earthquake on the Richter scale, followed by tsunamis, was the largest in Japan's history. It killed nearly 16,000 people, injured 6,000, and left 2,500 missing. It also destroyed some 138,000 buildings. One of the structures affected was the Fukushima nuclear plant, whose reactors leaked radioactivity. "In that case, a technological disaster was caused by a natural disaster," says Marengo.

Decade after decade, records of natural disasters have been increasing on the continent, following what appears to be a global trend. "The quality of information about natural disasters has improved greatly in recent decades. That helps swell the statistics," says Lucí. "But there also seems to be a real increase in the number of events." According to the study, much of the escalation in tragic events was due to the growing number of high-intensity meteorological and climatic phenomena striking South America: in the 1960s there were 51 events of this kind, and in the 2000s the number rose to 257. Over the five decades, the incidence of geophysical disasters, which cause many deaths, remained more or less stable, and cases of epidemics declined.

Urban risk
The number of deaths from extreme events appears to be falling after peaking at 75,000 in the 1970s. In the past decade there were just over 6,000 deaths in South America caused by natural disasters, according to Lucí's survey. Historically, fatalities have been concentrated in a few occurrences of enormous proportions, especially earthquakes and volcanic eruptions. The 20 deadliest events (eight in Peru and five in Colombia) accounted for 83% of all deaths linked to natural phenomena between 1960 and 2009. The worst disaster was an earthquake in Peru in May 1970, with 66,000 deaths, followed by a flood in Venezuela in December 1999 (30,000 deaths) and a volcanic eruption in Colombia in November 1985 (20,000 deaths). Brazil accounts for the 9th-deadliest event (the 1974 meningitis epidemic, with 1,500 deaths) and the 19th (a landslide, triggered by heavy rains, that killed 436 people in March 1967 in Caraguatatuba, on the São Paulo coast).

The number of people affected has also declined in recent years, but the figures remain high. In the 1980s, disasters produced around 50 million non-fatal victims in South America; in each of the past two decades the number fell to about 20 million.

Seven out of ten Latin Americans now live in cities, where haphazard land occupation and certain geoclimatic characteristics tend to increase the local population's vulnerability to natural disasters. Lucí compared 56 South American urban agglomerations with more than 750,000 inhabitants with respect to five factors that raise the risk of calamity: drought, earthquake, flood, landslide, and volcanism. Quito, the capital of Ecuador, was the only metropolis exposed to all five factors. Four Colombian cities (Bogotá, Cali, Cúcuta, and Medellín) and La Paz, in Bolivia, came next, with four vulnerabilities each. The Brazilian capitals presented at most two risk factors, drought and flood (see chart). "Disasters result from the combination of natural hazards and the vulnerabilities of the occupied areas," says researcher Victor Marchezini of Cemaden, a sociologist who studies the long-term impacts of these extreme phenomena. "They are a socio-environmental event."
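As a rough illustration of the kind of comparison described above, the sketch below tallies how many of the five risk factors each city faces. It is not the study's data: only Quito's full exposure and the Brazilian capitals' two factors are stated in the article, so the hazard sets for the other cities are assumptions for the example.

```python
# Illustrative sketch only: counting hazard exposure per city.
# Exposure sets marked "assumed" are invented for the example.

exposure = {
    "Quito": {"drought", "earthquake", "flood", "landslide", "volcanism"},  # per the article
    "Bogotá": {"earthquake", "flood", "landslide", "volcanism"},            # assumed four
    "São Paulo": {"drought", "flood"},                                       # per the article
}

# Rank cities from most to least exposed.
for city, hazards in sorted(exposure.items(), key=lambda kv: -len(kv[1])):
    print(f"{city}: {len(hazards)} risk factors ({', '.join(sorted(hazards))})")
```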

It is difficult to measure the cost of a disaster. But drawing on data from the 2013 edition of the Atlas brasileiro de desastres naturais, which uses a methodology different from the one employed by the Unicamp geographer to count calamities in South America, the group led by Carlos Eduardo Young at the Institute of Economics of the Federal University of Rio de Janeiro (UFRJ) produced a study late last year. Based on World Bank estimates of losses caused by disasters in some Brazilian states, Young calculated that flash floods, floods, and mass movements between 2002 and 2012 caused economic losses of at least R$180 billion for the country. In general, the poorest states, such as those in the Northeast, suffered the greatest economic losses relative to the size of their GDP. "Vulnerability to disasters may be inversely proportional to a state's degree of economic development," says the economist. "Climate change could sharpen the issue of regional inequality in Brazil."

Words for snow revisited: Languages support efficient communication about the environment (Carnegie Mellon University)

13-APR-2016

CARNEGIE MELLON UNIVERSITY

 


The claim that Eskimo languages have many words for different types of snow is well known among the public, but it has been greatly exaggerated and is therefore often dismissed by scholars of language.

However, a new study published in PLOS ONE supports the general idea behind the original claim. Carnegie Mellon University and University of California, Berkeley researchers found that languages that use the same word for snow and ice tend to be spoken in warmer climates, reflecting lower communicative need to talk about snow and ice.

“We wanted to broaden the investigation past Eskimo languages and look at how different languages carve up the world into words and meanings,” said Charles Kemp, associate professor of psychology in CMU’s Dietrich College of Humanities and Social Sciences.

For the study, Kemp and UC Berkeley's Terry Regier and Alexandra Carstensen analyzed the connection between local climates, patterns of language use and word(s) for snow and ice across nearly 300 languages. They drew on multiple sources of data, including library reference works, Twitter and large digital collections of linguistic and meteorological data.

The results revealed a connection between temperature and snow and ice terminology, suggesting that local environmental needs leave an imprint on languages. For example, English originated in a relatively cool climate and has distinct words for snow and ice. In contrast, the Hawaiian language is spoken in a warmer climate and uses the same word for snow and for ice. These cases support the claim that languages are adapted to the local communicative needs of their speakers — the same idea that lies behind the overstated claim about Eskimo words for snow. The study finds support for this idea across language families and geographic areas.
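The analysis behind this claim can be pictured, in very simplified form, as comparing the local temperatures of languages that merge snow and ice into one word with those of languages that keep them distinct. The sketch below is illustrative only: the toy data and the generic language labels are invented, and this is neither the authors' code nor their actual statistical method.

```python
# Illustrative sketch only: do languages that use one word for snow and ice
# ("colexify" the two senses) tend to come from warmer places? Toy data.

toy_data = [
    # (language, mean annual temperature in °C, colexifies snow and ice?)
    ("Hawaiian", 23.0, True),    # per the article: one word, warm climate
    ("English", 9.0, False),     # per the article: distinct words, cool climate
    ("Language A", 21.0, True),  # invented
    ("Language B", 4.0, False),  # invented
    ("Language C", 14.0, True),  # invented
    ("Language D", 6.0, False),  # invented
]


def mean_temp(colexifies: bool) -> float:
    """Average temperature over languages with the given snow/ice pattern."""
    temps = [t for _, t, c in toy_data if c == colexifies]
    return sum(temps) / len(temps)


print(f"Same word for snow/ice: {mean_temp(True):.1f} °C on average")
print(f"Distinct words:         {mean_temp(False):.1f} °C on average")
```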

“These findings don’t resolve the debate about Eskimo words for snow, but we think our question reflects the spirit of the initial snow claims — that languages reflect the needs of their speakers,” said Carstensen, a psychology graduate student at UC Berkeley.

The researchers suggest that in the past, excessive focus on the specific example of Eskimo words for snow may have obscured the more general principle behind it.

Carstensen added, “Here, we deliberately asked a somewhat different question about a broader set of languages.”

The study also connects with previous work that explores how the sounds and structures of language are shaped in part by a need for efficiency in communication.

“We think our study reveals the same basic principle at work, modulated by local communicative need,” said Regier, professor of linguistics and cognitive science at UC Berkeley.

###

Read the full study at http://dx.plos.org/10.1371/journal.pone.0151138.