Monthly archive: January 2015

Doomsday Clock Set at 3 Minutes to Midnight (Live Science)

by Megan Gannon, News Editor   |   January 22, 2015 01:25pm ET

Water crisis in São Paulo: Experts outline scenarios for when the water runs out and lessons to be drawn from the state-wide collapse (Brasil Post)

Published: 21/01/2015 11:29 BRST | Updated: 21/01/2015 11:53 BRST


A campaign promise of Governor Geraldo Alckmin (PSDB), the lack of water in São Paulo has been a reality for months in several parts of the state. Last week he admitted that there is indeed rationing (and, faced with the backlash, tried to walk it back), something the population – above all in the poorest neighbourhoods – already knew. What is also already known is that, yes, the water really will run out. Even if it does not reach zero, levels will be so low from March onwards that everyone’s life will be affected.

The experts heard by Brasil Post welcomed the fact that the São Paulo government, belatedly, acknowledged the rationing. They also approved of fining those who consume too much water – although the measure, late as it is, should be a permanent policy rather than a way of ‘putting out fires’, as it is now. Even so, the scenario that will take shape with the arrival of the dry season, between the end of March and the beginning of April and stretching until October, will demand new habits from managers and the population alike.

“When the water runs out, activities that are not considered essential will be interrupted, with cuts to commerce and industry and the closure of places that use a lot of water, such as shopping malls, schools and universities,” said Professor Antonio Carlos Zuffo, a water-resources specialist at Unicamp. It sounds exaggerated, but it is not. According to Wednesday’s (21st) edition of the newspaper O Estado de S. Paulo, the six reservoir systems that supply 20 million people in Greater São Paulo have been running a deficit of 2.5 billion litres per day, right in the period when they should be filling up to cover the dry months.

Back in 2002, Saneas, the magazine of the Association of Engineers of the São Paulo State Basic Sanitation Company (AESabesp), published a text pointing to “an undeniable situation of water stress” that could “have a tragic ending, with forecasts of chronic scarcity within 15 years”. The Agência Nacional de Águas (ANA) noted, in the 2004 permit for use of the Cantareira System, that dependence on that system needed to be reduced. In the middle of the crisis, during the 2014 renewal attempt, there was a push to increase, not decrease, use of the Cantareira – something impracticable, and in defiance of the forecasts. No, Saint Peter is not to blame.

“Today the situation is much worse than last year. In January 2014 we had a positive 27.2% in the Cantareira; today we are at negative 23.5%. In other words, we consumed 50% of the volume over that period. If the average rate of consumption holds, the water runs out at the end of March. Remember that January is the month with the highest rainfall in São Paulo, followed by December. Last month it rained 25% less than average. This month it has rained only 22% or 23% of the average. The equation is simple: there will not be enough water for everyone,” Zuffo added.

Information and transparency

For environmentalist Malu Ribeiro, of the NGO SOS Mata Atlântica, the authorities’ delay in admitting the obvious has done more harm than good over the past 13 months. “Society needs a clear sense of how serious this crisis is. When the authorities project a certain confidence, as the Alckmin government did, the tendency is that people are not warned as they should be and remain in a comfortable position. Many people do not believe the scale of this crisis; it has worsened a great deal and caution is now required,” she said.

The changes at the Secretariat of Water Resources and at the head of the São Paulo State Basic Sanitation Company (Sabesp), with the arrival of Benedito Braga and Jerson Kelman respectively, were also welcome, since they place two specialists in the field in strategic positions. That, however, is not enough. The need to discuss water management in strategic terms – something much theorised about and little practised in Brazil – is seen as fundamental in times of crisis.

“There is still a lot of settlement in watershed areas, for example. So we see that behaviour has not changed, even though the crisis is not new. Look at Itu, where I live, where the crisis was far worse and, now that it has rained a little, people think they no longer need to save, that everything is back to normal. Fighting waste must be permanent and we need prevention. It has to hurt in the wallet, which is why the fine should be permanent,” Malu said.

“The lack of information bred insecurity, without telling the population what its role in the crisis is. The UN had already indicated that the decade from 2010 to 2020 would be the decade of water, and not by chance, yet in Brazil there is a timidity in this respect. We need to change this culture of abundance that exists in the Southeast and develop a strategic plan, with more power for the river-basin committees. The waste of water in agriculture is absurd, and it is not discussed. It is time to wake up,” the environmentalist added.

‘Expensive water’ is here to stay

According to the experts, the water crisis also exposes a scenario that was already expected, since the Earth goes through alternating cycles of drought and rain roughly every 30 years. The current one, which began in 2010 and runs until 2040, will be punctuated by dry spells in heavily populated regions, a picture that will only reverse 25 years from now. So habits must change, first and foremost. Even amid excessive heat, some people have still not realised this.

“Many people see themselves as removed from the problem and, in the heat, end up rushing out to buy paddling pools and using water for leisure. The approaching Carnival also helps pull the ordinary citizen’s attention away, as happened during the elections. That is no longer possible. Managers bear responsibility, but citizens also need to pay attention to their own role, or we will have new ‘dead cities’, as in the Vale do Paraíba or the Vale do Jequitinhonha, where natural resources were exhausted,” Malu said.

And no one should take heart from Sabesp’s promise that there is still a third tranche of 41 billion litres of the Cantareira’s dead volume, whose use the São Paulo government is expected to request from ANA in the coming days. “We know that 45% of the Cantareira that is not being drawn is dead volume. Not all of the remaining third tranche can actually be pumped out. It would give us about another 10%, enough for only a few more weeks,” Zuffo commented.
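
Zuffo’s figures in the quotes above allow a rough sanity check on that end-of-March horizon. The sketch below is a back-of-the-envelope extrapolation added here for illustration only, not something from the original article; the assumptions of a constant draw-down and no net inflow are simplifications, not Zuffo’s own model.

```python
# Rough extrapolation of the Cantareira draw-down using only the figures
# quoted above: +27.2% in January 2014, -23.5% in January 2015, and roughly
# 10 percentage points still recoverable from the third dead-volume tranche.
# Assumes a constant monthly draw-down and no net inflow from rain -- a
# deliberate simplification for illustration.

level_jan_2014 = 27.2     # % of useful volume (positive reserve)
level_jan_2015 = -23.5    # % (negative = already drawing on the dead volume)
remaining_margin = 10.0   # % still usable from the third tranche, per Zuffo

monthly_drop = (level_jan_2014 - level_jan_2015) / 12.0   # ~4.2 points/month
months_left = remaining_margin / monthly_drop             # ~2.4 months

print(f"Average draw-down: {monthly_drop:.1f} percentage points per month")
print(f"At that pace the last tranche lasts about {months_left:.1f} months")
```

At that pace the remaining margin is gone in roughly two and a half months from mid-January, which is consistent with the end-of-March scenario described above.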

Suggested throughout the crisis, water reuse and desalination are expensive measures that depend on other factors before they can be implemented – and, with possible electricity rationing, may never leave the drawing board. In other words, they are not a short-term solution. Drawing more water from reservoirs such as the Billings (with its notorious pollution) also depends on construction work – another obstacle for anyone hoping that water shortages lasting days at a time do not become a months-long reality. Without rain, there is only one path to follow.

“There is a natural cyclical variability that has nothing to do with global warming, but we do not have the engineering to solve the problem in the short term. What we need is the intelligence to adapt and cut water consumption per person from 250 litres to 150 litres, or even less. There are European countries where usage does not exceed 60 litres per person. We need to use less and treat water so that it can be reused. Everything depends on technology and new habits,” Zuffo concluded.

READ MORE

– Brazil wastes 37% of its treated water, federal government report finds

– Instead of following federal law, Alckmin government vows to fight in court to surcharge consumption

– Experts suggest fewer construction works and more reforestation policies for water-resource management

– Unicamp professor says the volume drawn from the Cantareira System should have been reduced years ago

“The Fuse is Blown”. Glaciologist’s Jaw Dropping Account of a Shattering Moment (Climate Denial Crock of the Week)

January 22, 2015

Peter Sinclair

If you’ve missed the other segments of our interview with Glaciologist Eric Rignot – do not, repeat, do not, miss this one.

Rignot was a co-author of the “holy shit moment” paper from last spring, showing that large areas of the West Antarctic Ice Sheet are now in “irreversible decline”.
That news made for one of my most harrowing videos of the last year, which you can, and should, view if you have not – below the fold.

I’m keeping these clips from our interviews minimally edited – I want the raw video to speak for itself to current readers, and to historians, who will undoubtedly understand all too well why we were peeling our jaws off the floor after this one.

The Question That Could Unite Quantum Theory With General Relativity: Is Spacetime Countable? (The Physics Arxiv Blog)

Current thinking about quantum gravity assumes that spacetime exists in countable lumps, like grains of sand. That can’t be right, can it?

The Physics arXiv Blog

One of the big problems with quantum gravity is that it generates infinities that have no physical meaning. These come about because quantum mechanics implies that accurate measurements of the universe on the tiniest scales require high energies. But when the scale becomes very small, the energy density associated with a measurement is so great that it should lead to the formation of a black hole, which would paradoxically ruin the measurement that created it.

These kinds of infinities are something of an annoyance. Their paradoxical nature makes them hard to deal with mathematically and difficult to reconcile with our knowledge of the universe, which as far as we can tell, avoids this kind of paradoxical behaviour.

So physicists have invented a way to deal with infinities called renormalisation. In essence, theorists assume that space-time is not infinitely divisible. Instead, there is a minimum scale beyond which nothing can be smaller, the so-called Planck scale. This limit ensures that energy densities never become high enough to create black holes.
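
The scale involved can be pinned down with a standard back-of-the-envelope argument, added here for reference rather than taken from the arXiv Blog piece. Probing a region of size $\Delta x$ costs an energy of roughly $E \sim \hbar c / \Delta x$, and a black hole forms once the corresponding Schwarzschild radius $2GE/c^4$ grows to about $\Delta x$. Setting the two equal gives

$$\Delta x \;\sim\; \sqrt{\frac{\hbar G}{c^{3}}} \;=\; \ell_{P} \;\approx\; 1.6\times10^{-35}\ \text{m},$$

the Planck length, below which the usual notion of a measurable distance breaks down.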

This is also equivalent to saying that space-time is discrete, or as a mathematician might put it, countable. In other words, it is possible to allocate a number to each discrete volume of space-time, making it countable, like grains of sand on a beach or atoms in the universe. That means space-time is entirely unlike uncountable things, such as straight lines, which are infinitely divisible, or the degrees of freedom of the fields that constitute the basic building blocks of physics, which have been mathematically proven to be uncountable.

This discreteness is certainly useful but it also raises an important question: is it right? Can the universe really be fundamentally discrete, like a computer model? Today, Sean Gryb from Radboud University in the Netherlands argues that an alternative approach is emerging in the form of a new formulation of gravity called shape dynamics. This new approach implies that spacetime is smooth and uncountable, an idea that could have far-reaching consequences for the way we understand the universe.

At the heart of this new theory is the concept of scale invariance. This is the idea that an object or law has the same properties regardless of the scale at which it is viewed.

The current laws of physics generally do not have this property. Quantum mechanics, for example, operates only at the smallest scale, while gravity operates at the largest. So it is easy to see why scale invariance is a property that theorists drool over — a scale invariant description of the universe must encompass both quantum theory and gravity.

Shape dynamics does just this, says Gryb. It does this by ignoring many ordinary features of physical objects, such as their position within the universe. Instead, it focuses on objects’ relationships to each other, such as the angles between them and the shape that this makes (hence the term shape dynamics).

This approach immediately leads to a scale invariant picture of reality. Angles are scale invariant because they are the same regardless of the scale at which they are viewed. So the new thinking is to describe the universe as a series of instantaneous snapshots of the relationships between objects.
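
A one-line check of that claim, a trivial aside rather than anything from Gryb’s paper: the angle between two vectors is built from a ratio in which any overall rescaling cancels,

$$\cos\theta \;=\; \frac{\vec{x}\cdot\vec{y}}{\lVert\vec{x}\rVert\,\lVert\vec{y}\rVert} \;\to\; \frac{(\lambda\vec{x})\cdot(\lambda\vec{y})}{\lVert\lambda\vec{x}\rVert\,\lVert\lambda\vec{y}\rVert} \;=\; \frac{\lambda^{2}\,\vec{x}\cdot\vec{y}}{\lambda^{2}\,\lVert\vec{x}\rVert\,\lVert\vec{y}\rVert} \;=\; \cos\theta,$$

so a description built only from angles and shapes never notices a uniform change of scale.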

The result is a scale invariance that is purely spatial. But this, of course, is very different to the more significant notion of spacetime scale invariance.

So a key part of Gryb’s work is in using the mathematical ideas of symmetry to show that spatial scale invariance can be transformed into spacetime scale invariance.

Specifically, Gryb shows exactly how this works in a closed, expanding universe in which the laws of physics are the same for all inertial observers and for whom the speed of light is finite and constant.

If those last two conditions sound familiar, it’s because they are the postulates Einstein used to derive special relativity. And Gryb’s formulation is equivalent to this. “Observers in Einstein’s special theory of relativity can be reinterpreted as observers in a scale-invariant space,” he says.

That raises some interesting possibilities for a broader theory of gravity, just as special relativity led to a broader theory of gravity in the form of general relativity.

Gryb describes how it is possible to create models of curved space-time by gluing together local patches of flat space-times. “Could it be possible to do something similar in Shape Dynamics; i.e., glue together local patches of conformally flat spaces that could then be related to General Relativity?” he asks.

Nobody has succeeded in doing this in a model that includes the three dimensions of space and one of time, but these are early days for shape dynamics, and Gryb and others are working on the problem.

He is clearly excited by the future possibilities, saying that it suggests a new way to think about quantum gravity in scale invariant terms. “This would provide a new mechanism for being able to deal with the uncountably infinite number of degrees of freedom in the gravitational field without introducing discreteness at the Planck scale,” he says.

That’s an exciting new approach. And it is one expounded by a fresh new voice who is able to explain his ideas in a highly readable fashion to a broad audience. There is no way of knowing how this line of thinking will evolve but we’ll look forward to more instalments from Gryb.

Ref: arxiv.org/abs/1501.02671 : Is Spacetime Countable?

Why it’s good to laugh at climate change (The Guardian)

Climate gags are notable by their absence, but an RSA event on Tuesday night hopes to show that climate change comedy can raise laughs and awareness


Marcus Brigstocke will perform an RSA live stream event tonight: Seven Serious Jokes About Climate Change. Photograph: Andrew Aitchison/Corbis

Did you hear the one about the climate policy analyst? Or the polar bear who walked into a bar?

Climate change is not generally considered a source of amusement: in terms of comedic material, the forecast is an ongoing cultural drought. But perhaps campaigners have missed a trick in overlooking the powerful role that satire and subversion can play in social change. Could humour cut through the malaise that has smothered the public discourse, activating our cultural antennae in a way that graphs, infographics and images of melting ice could never do?

This is the challenge that a panel of British comedians, including Marcus Brigstocke – a seasoned climate humourist – will take up at an event on Tuesday evening hosted by the RSA and the Climate Outreach and Information Network in London (the event is fully booked but it will be streamed live online). Maybe laughing about something as serious as climate change is just another form of denial. But perhaps its relative absence from the comedy realm is another warning sign: despite decades of awareness raising, the cultural footprint of climate change is faint, fragile and all-too-easily ignored.

The first example of a climate-policy parody was probably the ‘Cheat Neutral’ project: a slick spoof of the logic of carbon offsetting whereby people could pay someone else to be faithful, giving them the opportunity to cheat on their husband or wife. And there have been other good video mockeries – including one warning that wind farms will blow the Earth off its orbit – which have captured the comedy potential of bizarre debates about energy policy.

This year, Greenpeace teamed up with the surreal comedian Reggie Watts to promote the idea of a 100% renewably powered internet. There have been sporadic examples of climate change ‘stand-up’. And the ever-reliable Simpsons has been occasionally willing to engage.

Reggie Watts yodels for a wind-powered internet.

But these are the exceptions that prove the rule: for the most part, climate gags are notable by their absence.

An ongoing challenge is the polarised nature of the climate debate, with climate scepticism closely pegged to political ideology. According to Nick Comer-Calder, of the Climate Media Net, getting people laughing is a good first step to getting them talking – even across political divides. One analysis found that major US satirists, such as Jon Stewart and Stephen Colbert, have given more coverage to climate change than many of the news channels – although admittedly, this is a pretty low bar to clear.

But while online ridicule directed towards climate ‘deniers’ (generally portrayed as either too stupid to understand the science, or as conspiracy theorists) may appeal to the usual crowd, it’s hard to see how this kind of approach will breach the political divide. After all, the feeling of being laughed at by a sneering, left-leaning elite is not appealing. One notorious attempt by the 10:10 campaign and director Richard Curtis at ‘humorously’ marginalising opposition towards environmentalism backfired completely. It turns out that most people don’t find graphic depictions of children’s heads exploding all that hilarious after all…

What’s required is for climate change to seep into the fabric of satirical and humourous TV programming, in the same way that other ‘current affairs’ often provide the backdrop and context for creative output. Jokes ‘about’ climate change can in fact be ‘about’ any of the dozens of subjects – family disputes over energy bills, travel and tourism, or changing consumer habits – that are directly impacted by climate change.

It’s an interesting irony that while the ‘pro-climate’ discourse can often feel po-faced and pious, climate sceptics have wasted no time in parodying the climate community. The Heretic, a play by Richard Bean, built its dramatic tension around the conflict between a sceptical climate scientist and her cynical departmental head who is suppressing her data in order to keep his grants flowing. The characters are overdrawn and instantly recognisable. And, as a result, it works: it is good drama, entertaining, and laugh-out-loud funny.

While climate change itself is never going to be a barrel of laughs, we seem to be suffering from a collective lack of imagination in teasing out the tragi-comic narratives that climate change surely provides.

Thinking harder about how to plug climate change into our cultural circuits – not as ‘edutainment’ but simply as a target of satire in its own right – will be crucial in overcoming the social silence around the issue. The science communicators don’t seem to be making much progress with the public: maybe it’s time to let the comedians have their turn.

Is a climate disaster inevitable? (Book Forum)

From De Ethica, Michel Bourban (Lausanne): Climate Change, Human Rights and the Problem of Motivation; Robert Heeger (Utrecht): Climate Change and Responsibility to Future Generations: Reflections on the Normative Questions; Casey Rentmeester (Finlandia): Do No Harm: A Cross-Disciplinary, Cross-Cultural Climate Ethics; and Norbert Campagna (Luxembourg): Climate Migration and the State’s Duty to Protect. Harvard’s David Keith knows how to dial down the Earth’s thermostat — is it time to try? Renzo Taddei (UNIFESP): Alter Geoengineering. Tobias Boes and Kate Marshall on writing the Anthropocene. People don’t work as hard on hot days — or on a warming planet. James West on 2014 was the year we finally started to do something about climate change. How much is climate change going to cost us? David Roberts investigates. Is a climate disaster inevitable? Adam Frank on what astrobiology can tell us about the fate of the planet. If we’re all headed for extinction anyway—AND WE ARE—won’t it be a lot more enjoyable to run out the clock with everyone looking a little more pleasant? Welcome to the latest exciting opportunity in the sights of investors: the collapse of planet Earth. You can download Tropic of Chaos: Climate Change and the New Geography of Violence by Christian Parenti (2011). You can download Minimal Ethics for the Anthropocene by Joanna Zylinska (2014).

[Emphasis added]

The Paradoxes That Threaten To Tear Modern Cosmology Apart (The Physics Arxiv Blog)

Some simple observations about the universe seem to contradict basic physics. Solving these paradoxes could change the way we think about the cosmos

The Physics arXiv Blog on Jan 20

Revolutions in science often come from the study of seemingly unresolvable paradoxes. An intense focus on these paradoxes, and their eventual resolution, is a process that has led to many important breakthroughs.

So an interesting exercise is to list the paradoxes associated with current ideas in science. It’s just possible that these paradoxes will lead to the next generation of ideas about the universe.

Today, Yurij Baryshev at St Petersburg State University in Russia does just this with modern cosmology. The result is a list of paradoxes associated with well-established ideas and observations about the structure and origin of the universe.

Perhaps the most dramatic, and potentially most important, of these paradoxes comes from the idea that the universe is expanding, one of the great successes of modern cosmology. It is based on a number of different observations.

The first is that other galaxies are all moving away from us. The evidence for this is that light from these galaxies is red-shifted. And the greater the distance, the bigger this red-shift.

Astrophysicists interpret this as evidence that more distant galaxies are travelling away from us more quickly. Indeed, the most recent evidence is that the expansion is accelerating.

What’s curious about this expansion is that space, and the vacuum associated with it, must somehow be created in this process. And yet how this can occur is not at all clear. “The creation of space is a new cosmological phenomenon, which has not been tested yet in physical laboratory,” says Baryshev.

What’s more, there is an energy associated with any given volume of the universe. If that volume increases, the inescapable conclusion is that this energy must increase as well. And yet physicists generally think that energy creation is forbidden.

Baryshev quotes the British cosmologist, Ted Harrison, on this topic: “The conclusion, whether we like it or not, is obvious: energy in the universe is not conserved,” says Harrison.
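
One way to make Harrison’s point concrete, using only textbook cosmology rather than anything specific to Baryshev’s paper: for any component whose energy density stays constant as space expands (vacuum energy is the standard example), the energy contained in a comoving region grows with the region’s volume,

$$E(t) \;=\; \rho_{\text{vac}}\,V(t) \;=\; \rho_{\text{vac}}\,a(t)^{3}\,V_{0},$$

so as the scale factor $a(t)$ increases, $E(t)$ increases with it, and general relativity supplies no global conservation law to forbid this.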

This is a problem that cosmologists are well aware of. And yet ask them about it and they shuffle their feet and stare at the ground. Clearly, any theorist who can solve this paradox will have a bright future in cosmology.

The nature of the energy associated with the vacuum is another puzzle. This is variously called the zero point energy or the energy of the Planck vacuum and quantum physicists have spent some time attempting to calculate it.

These calculations suggest that the energy density of the vacuum is huge, of the order of 10^94 g/cm^3. This energy, being equivalent to mass, ought to have a gravitational effect on the universe.

Cosmologists have looked for this gravitational effect and calculated its value from their observations (they call it the cosmological constant). These calculations suggest that the energy density of the vacuum is about 10^-29 g/cm^3.

Those numbers are difficult to reconcile. Indeed, they differ by roughly 120 orders of magnitude. How and why this discrepancy arises is not known and is the cause of much bemused embarrassment among cosmologists.
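
Spelling out the arithmetic with the two densities quoted above:

$$\frac{10^{94}\ \text{g cm}^{-3}}{10^{-29}\ \text{g cm}^{-3}} \;=\; 10^{123},$$

a mismatch of roughly 120 orders of magnitude, which is the figure usually quoted for the cosmological constant problem.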

Then there is the cosmological red-shift itself, which is another mystery. Physicists often talk about the red-shift as a kind of Doppler effect, like the change in frequency of a police siren as it passes by.

The Doppler effect arises from the relative movement of different objects. But the cosmological red-shift is different because galaxies are stationary in space. Instead, it is space itself that cosmologists think is expanding.

The mathematics that describes these effects is correspondingly different as well, not least because any relative velocity must always be less than the speed of light in conventional physics. And yet the velocity of expanding space can take any value.

Interestingly, the nature of the cosmological red-shift leads to the possibility of observational tests in the next few years. One interesting idea is that the red-shifts of distant objects must increase as they get further away. For a distant quasar, this change may be as much as one centimetre per second per year, something that may be observable with the next generation of extremely large telescopes.
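
For reference, the drift in question is usually written with the standard Sandage-Loeb expression, a textbook formula rather than anything derived in Baryshev’s paper:

$$\dot{z} \;=\; (1+z)\,H_{0} \;-\; H(z), \qquad \Delta v \;=\; \frac{c\,\dot{z}\,\Delta t}{1+z},$$

where $H(z)$ is the Hubble rate at the source’s redshift. For high-redshift quasars the apparent velocity shift $\Delta v$ comes out at the level of centimetres per second over observing baselines of years, which is what the next generation of extremely large telescopes hopes to detect.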

One final paradox is also worth mentioning. This comes from one of the fundamental assumptions behind Einstein’s theory of general relativity—that if you look at the universe on a large enough scale, it must be the same in all directions.

It seems clear that this assumption of homogeneity does not hold on the local scale. Our galaxy is part of a cluster known as the Local Group which is itself part of a bigger supercluster.

This suggests a kind of fractal structure to the universe. In other words, the universe is made up of clusters regardless of the scale at which you look at it.

The problem with this is that it contradicts one of the basic ideas of modern cosmology—the Hubble law. This is the observation that the cosmological red-shift of an object is linearly proportional to its distance from Earth.
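
In its simplest low-redshift form, quoted here for reference (the numerical value of $H_0$ is the commonly cited one, not a figure from Baryshev’s paper), the law reads

$$c\,z \;\simeq\; H_{0}\,d, \qquad H_{0} \approx 70\ \text{km s}^{-1}\,\text{Mpc}^{-1},$$

so a plot of red-shift against distance should be a straight line through the origin; that linearity is exactly what is at stake in the next few paragraphs.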

It is so profoundly embedded in modern cosmology that most currently accepted theories of universal expansion depend on its linear nature. That’s all okay if the universe is homogeneous (and therefore linear) on the largest scales.

But the evidence is paradoxical. Astrophysicists have measured the linear nature of the Hubble law at distances of a few hundred megaparsecs. And yet the clusters visible on those scales indicate that the universe is not homogeneous on those scales.

And so the argument that the Hubble law’s linearity is a result of the homogeneity of the universe (or vice versa) does not stand up to scrutiny. Once again this is an embarrassing failure for modern cosmology.

It is sometimes tempting to think that astrophysicists have cosmology more or less sewn up, that the Big Bang model, and all that it implies, accounts for everything we see in the cosmos.

Not even close. Cosmologists may have successfully papered over the cracks in their theories in a way that keeps scientists happy for the time being. This sense of success is surely an illusion.

And that is how it should be. If scientists really think they are coming close to a final and complete description of reality, then a simple list of paradoxes can do a remarkable job of putting feet firmly back on the ground.

Ref: arxiv.org/abs/1501.01919 : Paradoxes Of Cosmological Physics In The Beginning Of The 21-St Century

What to expect from science in 2015 (Zero Hora)

We bet on five things likely to show up this year

19/01/2015 | 06h01

Photo: SpaceX/Youtube

In 2014, science managed to land on a comet, discovered it had been wrong about the genetic evolution of birds, and revealed the largest fossils in history. Miguel Nicolelis presented his exoskeleton at the World Cup, the Brazilian satellite CBERS-4, built in partnership with China, made it to space successfully, and a Brazilian brought home mathematics’ top medal.

And in 2015, what will we see? We bet on five things that may appear this year.

Reusable rockets

If we want to colonise Mars, a one-way ticket is no use. These rockets, capable of going and coming back, are the promise to transform the future of space travel. We will see whether SpaceX, which is already working on it, can pull it off.

Robots at home

In February, Japan’s Softbank starts selling a humanoid robot called Pepper. It uses artificial intelligence to recognise its owner’s mood and speaks four languages. Although it is more of a helper than a doer, it will soon learn new functions.

Invisible universe

The Large Hadron Collider will start running again in March, with twice as much power to smash particles. One possibility is that it will help discover new superparticles that may make up dark matter. It would be the first new state of matter discovered in a century.

A cure for Ebola

After the 2014 crisis, Ebola vaccines may begin to work and save many lives in Africa. The same goes for AIDS. HIV is cornered; we hope science finally beats it this year.

Climate talks

2014 was one of the hottest years on record and, the way things are going, 2015 will follow the same path. In December, in Paris, the world will discuss an agreement to try to turn the level of gas emissions around, with measures to be implemented from 2020 onwards. May our leaders be sensible.

What will post-democracy look like? (The Sociological Imagination)

 ON JANUARY 19, 2015

As anyone who reads my blog regularly might have noticed, I’m a fan of Colin Crouch’s notion of post-democracy. I’ve interviewed him about it a couple of times: once in 2010 and again in 2013. Whereas he’d initially offered the notion to illuminate a potential trajectory, in the sense that we risk becoming post-democratic, more recently we have come to see a social order that might be said to have become post-democratic. He intends the term to function analogously to post-industrial: it is not that democracy is gone but that it has been hollowed out:

The term was indeed a direct analogy with ‘post-industrial’. A post-industrial society is not a non-industrial one. It continues to make and to use the products of industry, but the energy and innovative drive of the system have gone elsewhere. The same applies in a more complex way to post-modern, which is not the same as anti-modern or of course pre-modern. It implies a culture that uses the achievements of modernism but departs from them in its search for new possibilities. A post-democratic society therefore is one that continues to have and to use all the institutions of democracy, but in which they increasingly become a formal shell. The energy and innovative drive pass away from the democratic arena and into small circles of a politico-economic elite. I did not say that we were now living in a post-democratic society, but that we were moving towards such a condition.

http://blogs.lse.ac.uk/politicsandpolicy/five-minutes-with-colin-crouch/

Crouch is far from the only theorist to have made such a claim. But I think there’s a precision to his argument which distinguishes it from the manner in which someone like, say, Bauman talks about depoliticisation. My current, slightly morbid, interest in representations of civilisational collapse has left me wondering what entrenched post-democracy would look like. Asking this question is not a matter of pointing to an absence of democracy, for which endless examples are possible, but rather of sketching in more detail what a social order which was once democratic but is now post-democratic would look like. While everyday life might look something like that which can be seen in Singapore, ‘the city of rules’ as this Guardian article puts it, I think there’s more to be said than this. However we can see in Singapore a vivid account of how micro-regulation can be deployed to facilitate a city in which ‘nothing goes wrong, but nothing really happens’, as one ex-pat memorably phrases it in that article. Is it so hard to imagine efficiency and orderliness being used to secure consent, at least amongst some, for a similar level of social control in western Europe or America?

Perhaps we’d also see the exceptional justice that intruded into UK life after the 2011 riots, with courts being kept open 24/7 in order to better facilitate the restoration of social order. There’s something akin to this in mega sporting events: opaque centralised planning overwhelms democratic consultation, ‘world cup courts’ dish out ad hoc justice, the social structure contorts itself for the pleasure of an international oligopoly upon whom proceedings depend, specialised security arrangements are intensively deployed in the interests of the event’s success and we often see a form of social cleansing (destruction of whole neighbourhoods) presented as a technocratic exercise in event management. We also see pre-arrests and predictive policing deployed to these ends and only a fool would not expect to see more of this as the technological apparatus and the political pressures encouraging them grow over time.

These security arrangements point to another aspect of a post-democratic social order: the economic vibrancy of the security sector. There is a technological dimension to this, with a long term growth fuelled by the ‘war on terror’ coupled with an increasing move towards ‘disruptive policing’ that offers technical solutions at a time of fiscal retrenchment, but we shouldn’t forget the more mundane side of the security industry and its interests in privatisation of policing. This is how Securitas, one of the world’s largest security companies, describe the prospects of the security industry. Note the title of the page: taking advantage of changes.

The global security services market employs several million people and is projected to reach USD 110 billion by 2016. Security services are in demand all over the world, in all industries and in both the public and private sectors. Demand for our services is closely linked to global economic development and social and demographic trends. As the global economy grows and develops, so do we.

Historically, the security market has grown 1–2 percent faster than GDP in mature markets. In recent years, due to current market dynamics and the gradual incorporation of technology into security solutions, security markets in Europe and North America have grown at the same pace as GDP. This trend is likely to continue over the next three to five years.

Market growth is crucial to Securitas’ future profitability and growth, but capitalizing on trends and changes in demand is also important. Developing new security solutions with a higher technology content and improved cost efficiency will allow the private security industry to expand the market by assuming responsibility for work presently performed by the police or other authorities. This development will also be a challenge for operations with insourced security services and increase interest in better outsourced solutions.

http://www.securitas.com/en/About-Securitas/Taking-advantage-of-changes/

Consider this against a background of terrorism, as the spectacular narrative of the ‘war on terror’ comes to be replaced by a prospect of a state of alert without end. We’ve not seen the end of the ‘war on terror’, we’ve seen a spectacular narrative become a taken for granted part of everyday life. It doesn’t need to be narrativised any more because it’s here to stay. Against this backdrop, we’re likely to see an authoritarian slide in political culture, supplementing the institutional arrangements already in place, in which ‘responsibility’ becomes the key virtue in the exercise of freedoms – as I heard someone say on the radio yesterday, “it’s irresponsible to say democracy is the only thing that matters when we face a threat like this” (or words to that effect).

Crucially, I don’t think this process is inexorable and it’s certainly not the unfolding of an historical logic. It’s enacted by people at every level – including those who reinforce the slide at the micro level of everyday social interaction. The intractability of the problem comes from the fact that the process itself involves a hollowing out of contestation at the highest level, such that the corporate agents pursuing this changing social order also benefit from potential sources of resistance being increasingly absent, or at least passive, at the macro level. This is how Wolfgang Streeck describes this institutional project, as inflected through management of the financial crisis:

The utopian ideal of present day crisis management is to complete, with political means, the already far-advanced depoliticization of the economy; anchored in recognised nation-states under the control of internal governmental and financial diplomacy insulated from democratic participation, with a population that would have learned, over years of hegemonic re-education, to regard the distributional outcomes of free markets as fair, or at least as without alternative.

Buying Time, pg 46

Another Weird Story: Intentional, Post-Intentional, and Unintentional Philosophy (The Cracked Egg)

JANUARY 18, 2015
KAT CRAIG

I was a “2e” kid: gifted with ADHD but cursed with the power to ace standardized tests. I did so well on tests they enrolled me in a Hopkins study, but I couldn’t remember to brush my hair. As if that wasn’t enough, there were a lot of other unusual things going on, far too many to get into here. My brain constantly defied people’s expectations. It was never the same brain from day to day. I am, apparently, a real neuropsychiatric mystery, in both good and bad ways. I’m a walking, breathing challenge to people’s assumptions and perceptions. Just a few examples: the assumption that intelligence is a unitary phenomenon, and the perception that people who think like you are smarter than those who think differently. Even my reasons for defying expectations were misinterpreted. I hated the way people idolized individuality, because being different brought me only pain. People mistook me for trying to be different. Being different is a tragedy!

And it got weirder: I inherited the same sociocognitive tools as everyone else, so I made the same assumptions. Consequently, I defied even my own expectations. So I learned to mistrust my own perceptions, always looking over my shoulder, predicting my own behavior as if I were an outside observer. I literally had to re-engineer myself in order to function in society, and that was impossible to do without getting into some major philosophical questions. I freely admit that this process has taken me my entire life and only recently have I had any success. I am just now learning to function in society–I’m a cracked egg. Cracked once from outside, and once from inside. And just now growing up, a decade late.

So it’s no surprise that I’m so stuck on the question of what people’s brains are actually doing when they theorize.

I stumbled onto R. Scott Bakker’s theories after reading his philosophical thriller, Neuropath. Then I found his blog, and I was blown away that someone besides me was obsessed with the role of ingroup/outgroup dynamics in intellectual circles. As someone with no ingroup (at least not yet), it’s very refreshing. But what really blew my mind was that he had a theory of cognitive science that could explain many of my frustrating experiences: the Blind Brain Theory, or BBT.

The purpose of this post is not to explain BBT, so you’ll have to click the link if you want that. I’ll go more into depth on the specifics of BBT later, but for a ridiculously short summary: it’s a form of eliminativism. Eliminativism is the philosophical view that neuroscience reveals our traditional conceptions of the human being, like free will, mind, and meaning, to be radically mistaken. But BBT is unique among eliminativisms in its emphasis of neglect: the way in which blindness, or lack of information, actually *enables* our brains to solve problems, especially the problem of what we are. And from my perspective, that makes perfect sense.

BBT is a profoundly counterintuitive theory that cautions us against intuition itself. And ironically, it substantiates my skeptical intuitions.  In short, it shows I’m not the only one who has no clue what she’s doing. If BBT is correct, non-neurotypical individuals aren’t really “impaired.” They simply fit differently with other people. Fewer intersecting lines, that’s all. Bakker has developed his theory further since he published this paper, building on his notion of post-intentional theory (see here for a more general introduction). BBT has stirred up quite a lot of drama.

While we all argue over BBT, absorbed in defending our positions, I feel like an outsider, even among people who understand ingroups. Why? Because most of the people in the debate seem to be discussing something hypothetical, something academic. For me, as I’ve explained, the question of intentionality is a question of everyday life. So I can’t shirk my habit of wondering about biology: what’s going on in the brains of intentionalists? What’s going on in the brains of post-intentionalists? And what’s going on inside my own brain? Bakker would say this is precisely the sort of question a post-intentionalist would ask.

But what happens if the post-intentionalist has never done intentional philosophy? Allow me to explain, with a fictionalized example from my own experience. I use the term “intentional” in both an everyday and philosophical sense, interchangeably:

Intentional, Post-Intentional, and Unintentional Philosophy

Imagine you’re an ordinary person. You just want to get on with your life, but you have a terminal illness. It’s an extremely rare neuropsychiatric syndrome: in order to recover, you must solve an ancient philosophical question. You can’t just come up with any old answer. You actually have to prove you solved it and convince everyone alive; at the very least, you have to convince yourself that you could convince anyone whose counterargument could possibly sway you. You’re skeptical to the marrow, and very good at Googling.

Remember, this is a terminal illness, so you have limited time to solve the problem.

In college, philosophy professors said you were a brilliant student. Plus, you have a great imagination from always being forced to do bizarre things. So naturally, you think you can solve it.

But it takes more time than you thought it would. Years more time. Enough time that you turn into a mad hermit. Your life collapses around you and you’re left with no friends, family, or work. But your genes are really damn virulent, and they simply don’t contain the stop codons for self-termination, so you persist.

And finally, after many failed attempts, you cough up something that sticks. An intellectual hairball.

But then the unimaginable happens: you come across a horrifying argument. The argument goes that when it comes to philosophy, intention matters. If your “philosophy” is just a means to survive, it is not philosophy at all; only that which is meant as philosophy can be called philosophical. So therefore, your solution is not valid. It is not even wrong.

So, it’s back to the drawing board for you. You have to find a new solution that makes your intention irrelevant. A solution that satisfies both the intentional philosophers, who do philosophy because they want to, and the unintentional philosophers who do it because they are forced to.

And then you run across something called post-intentional philosophy. It seems like a solution, but…

But post-intentional philosophy, as you see, requires a history: namely, a history of pre-post-intentional philosophy. Or, to oversimplify, intentional philosophy! The kind people do on purpose, not with a gun to their head.

You know that problems cannot be solved from the same level of consciousness that created them, so you try to escape what intentional and post-intentional philosophy share: theory. You think you can tackle your problem by finding a way out of theory altogether. A way that allows for the existence of all sorts of brains generating all sorts of things, intentional, post-intentional, and unintentional. A nonphilosophy, not a Laruellian non-philosophy. That way must exist, otherwise your philosophy will leave your very existence a mystery!

What do you do?

Are Theory and Practice Separate? Separable? Or something completely different?

Philosophy is generally a debate, but as an unintentional thinker I can’t help but remain neutral on everything except responsiveness to reality (more on that coming later). In this section I am attempting neither to support nor to attack it, but to explore it.

Bakker’s heuristic brand of eliminativism appears to bank on the ability to distinguish between the general and the specific, the practical and the theoretical. Correct me if I am wrong.

As the case of the “unintentional philosopher” suggests, philosophers themselves are counterexamples to the robustness of this distinction, just like people with impaired intentional cognition offer counterexamples that question folk psychology. If BBT is empirically testable, the practice-vs-theory distinction must remain empirically testable. We should be able to study everyday cognition (“Square One”) independently of theoretical cognition (“Square Two”) and characterize the neurobiological relationship of the two as either completely modular, somewhat modular, or somewhere in between. We should also be able to predict whether someone is an intentionalist or a post-intentionalist by observing their brains.

From a sociobiological perspective, one possibility is that Bakker is literally trying to hack philosophers’ brains: to separate the neural circuitry that connects philosophical cognition with daily functionality.

If that were the case, their disagreement would come as no surprise.

But my real point here, going back to my struggles with my unusual neurobiology, is that I am personally, neurologically, as close to “non-intentional” as people get. And that presents a problem for my ability to understand any of these philosophical distinctions regarding intentionality, post-intentionality, etc. But just as a person with Asperger’s syndrome is forced to intellectually explore the social, my relative deficit of intentionality has made it unavoidable – necessary, even – for me to explore intentionality. My point about theory and practice is to ask whether this state of affairs is “just my problem,” or whether it says something about the entire project of theory.

If nothing else, it certainly questions the assumption that the doctor is never the patient, that the post-intentional theorist is always, necessarily some sort of detached intellectual observer with no deviation from the intentional norm in his own neurobiology.

Come back later for a completely different view…

Did you know that most of the cells in your body are not human? (UOL)

Tatiana Pronin

From UOL, in São Paulo

20/01/2015 06h00

Microbes infest our bodies: but without them, no one is quite so human

It is enough to astonish anyone: 90% of the cells in our body are not human. In other words, you are far more microbes than you are yourself. These “invaders”, although “invisible”, are fundamental to our equilibrium. But any slip in this ecosystem can cause disease, much of it serious. So do not let your guard down: the danger lives inside you, and also outside, on the surface of your skin.

A specialist in the subject, researcher Luis Caetano Antunes, of the Sergio Arouca National School of Public Health at the Fundação Oswaldo Cruz, explains that human beings are colonised by more than 35,000 different species of bacteria, according to some estimates. “Bearing in mind that this number does not take viruses, protozoa and so on into account,” he clarifies.

Considering just one individual, the estimate is more than a thousand different species. “If you count strains (individuals belonging to the same species but with peculiar characteristics), that number rises to more than 7,000,” he says. If you could put them all on a scale, the needle would read approximately 1 kg.

This microbiota (the microscopic flora and fauna of a region) is formed as soon as we arrive in the world. Antunes notes, in fact, that babies born by vaginal delivery have different microbes from those born by caesarean section, since contact with the mother’s birth canal works as a “first bath” of micro-organisms.

The gut is a hostel

Although it stabilises after a person turns one year old, the population of micro-organisms is always evolving, thanks to contact with the external environment. Variety and quantity are therefore greater in more exposed places, such as the mouth, skin, eyes, stomach, gut, and the respiratory, genital and urinary tracts.

The most heavily colonised part of our body is by far the gut, with 70% of all bacteria, according to the researcher. “One reason is that the gut holds a large quantity of nutrients for bacteria. On top of that, there are also secretions, dead human cells and so on,” says Luis Caetano Antunes.

The specialist also draws attention to the size of the organ, which is full of villi (folds, basically). “The human intestine, when stretched out, has an area equivalent to a tennis court, or about 200 square metres,” he notes.

Doctors, scientists and nutritionists have been calling attention to the importance of the gut microbiota. It is no accident that products containing lactobacilli have become more common on supermarket shelves.

Antunes describes three main functions of this army of microbes. The first is nutrition: “Gut micro-organisms help break down nutrients that humans, on their own, could not degrade,” he says. They also produce substances, such as vitamins, that we do not produce, and act on our cells so that they can extract more energy from the diet.

The second is training the immune system, teaching it to identify what is and is not a threat to our organism. “One example of this function comes from the observation that rates of immune-related diseases (allergic diseases, above all) are much higher nowadays, and this has been linked to the indiscriminate use of antibiotics, the rise in caesarean deliveries and excessive cleanliness,” the researcher comments.

The third (and no less important) mission of the microbiota is to defend us against harmful agents. “Without our body’s natural bacteria we become much more vulnerable to attacks by dangerous bacteria,” says Luis Caetano Antunes, recalling that a number of infections are more common in people with a recent history of antibiotic use. “They kill the harmless bacteria, opening space for other bacteria to invade our organism and cause disease.”

A mouthful

One of the first scientists to observe communities of bacteria in our body was the Dutchman Antonie van Leeuwenhoek, who in the 17th century analysed a scraping from the surface of his teeth and discovered a large number of tiny living beings.

Another Dutch organisation, TNO, recently reported, after a study, that our mouth harbours about 700 different varieties of bacteria. The researchers found that a single French kiss can transfer 80 million bacteria from one mouth to another. The data were published in the journal Microbiome.

Some people may be disgusted, but the truth is that kissing can be a way of strengthening the immune system, following the logic described by the Fiocruz researcher.

The skin I live in

If the gut microbes are a strategic army inside the body, those living on our skin are the front line. “It is the armour that protects us against external agents,” says physician Jayme de Oliveira Filho, vice-president of the Sociedade Brasileira de Dermatologia.

Just as in the jungle a lack of lions can lead to an excess of zebras, any imbalance in the skin microbiota can lead to a variety of problems. Its integrity can be affected by long, hot showers and even by overuse of alcohol gel and antibacterial soaps. “If you use a product that promises to kill 99% of bacteria, plenty will still be left, but you may kill the ones that are useful to the skin,” the doctor says.

Getting a lot of sun without sunscreen is another way to assault the skin. That is why many people have outbreaks of cold sores, a viral condition, after coming back from the beach. Or develop patches on their arms and back (pityriasis versicolor), caused by a type of fungus. The doctor warns that some families are more predisposed to certain types of micro-organism. If the skin’s integrity is compromised, you can develop a problem that had never appeared before. And, believe it, you may even catch the flu more easily.

Expert group releases climate forecast for the next quarter (MCTI)

At the first 2015 meeting of the Seasonal Climate Forecasting Working Group of the Ministry of Science, Technology and Innovation, researchers warn of below-average rainfall in the North and Northeast and above-average rainfall in the South of the country

Below-average rainfall in the semi-arid region of the Northeast and in the North of Brazil, with the possibility of wildfires in Roraima, and continued above-average precipitation in the South. These are the climate trends for the next three months (February, March and April). They were presented this Friday (16th) at the first 2015 meeting of the Seasonal Climate Forecasting Working Group (GTPCS) of the Ministry of Science, Technology and Innovation (MCTI).

Paulo Nobre, a researcher at the National Institute for Space Research (Inpe/MCTI), attributed the results of the group’s assessment to the continuation of the El Niño phenomenon. “We have a seasonal condition in these three regions where it is scientifically and technologically possible today to make these forecasts,” said the specialist, who led the activities of the GTPCS’s first meeting.

The working group, set up by the MCTI in November 2013, brings together the country’s leading figures in climate forecasting. Every month the specialists meet to draw up prognoses for the following quarter. The aim is to give decision-makers input on the climate scenario that lies ahead.

The MCTI’s Secretary for Research and Development Policies and Programmes, Carlos Nobre, warned that the climate forecast for the next quarter calls for attention. “Brazil is living through a moment of different climate extremes in different parts of the country, with impacts on the economy and society,” noted the secretary, who also coordinates the GTPCS. “The information generated by the working group immediately feeds the ministries and the presidency of the Republic so that the necessary measures can be taken.”

Opening the meeting, which was held in Brasília for the first time, the Minister of Science, Technology and Innovation, Aldo Rebelo, stressed the importance of having short-term climate forecasting. “The work of the GTPCS researchers already helped last year to reduce the damage from the drought in the Northeast and the floods in Rondônia,” he said by way of example.

The group includes researchers from Inpe’s Centro de Previsão de Tempo e Estudos Climáticos (CPTEC); the Centro de Ciência do Sistema Terrestre (CCST); the Centro Nacional de Monitoramento e Alertas de Desastres Naturais (Cemaden/MCTI); and the Instituto Nacional de Pesquisas da Amazônia (Inpa/MCTI). At each meeting one of the members will lead the activities. This Friday, meteorologist Paulo Nobre, a researcher at Inpe, coordinated the work.

For other regions of the country, such as the Southeast, there is no climate predictability. “The Northeast, for example, is the region with the greatest seasonal predictability because it depends on the ocean and varies quite slowly. In the Southeast, what causes rain are cold fronts, which can be predicted a week, at most two, in advance,” explains Paulo Nobre, a researcher at Inpe. At the limit of scientific knowledge, what can be said is that rainfall will remain below average over this period.

The full report issued by the GTPCS can be accessed here.

(MCTI)

http://www.mcti.gov.br/visualizar/-/asset_publisher/jIPU0I5RgRmq/content/grupo-de-especialistas-divulga-previsao-do-clima-para-o-proximo-trimestre?redirect=/&

Casos de esquizofrenia poderiam ser evitados se fosse possível prevenir infecção por parasita, diz estudo (SBMT)

Proliferação do parasita, que também está relacionado a outros transtornos mentais, é mais comum em países tropicais

Cerca de 30% da população mundial está infectada com um dos parasitas que mais intrigam a ciência, o Toxoplasma gondii. Apesar de inofensivo para a maioria das pessoas saudáveis, pesquisas científicas comprovaram que o protozoário é capaz de alterar o comportamento de seres humanos e animais, além de apresentar possível ligação com a esquizofrenia. Recentemente, um estudo produzido nos Estados Unidos foi além, sugerindo que cerca de um quinto dos casos de esquizofrenia entre os norte-americanos pode envolver o parasita. Nos países mais pobres, esse índice tende a ser ainda maior.

O estudo, publicado na revista Preventive Veterinary Medicine, foi conduzido pelo médico veterinário e professor Gary Smith, na Seção de Epidemiologia e Saúde Pública da Escola de Medicina Veterinária da Universidade da Pensilvânia. Smith elaborou um cálculo que mede o quão importante é o fator de risco à infecção, que aumenta com a idade.

“Há cada vez mais evidências, por meio de estudos, de que pessoas infectadas por Toxoplasma têm um risco aumentado para esquizofrenia”, explica o pesquisador. A partir desse pressuposto, o desafio foi descobrir qual proporção de casos do transtorno mental poderia ser evitada se fosse possível prevenir a infecção humana pelo parasita.

Pelos cálculos feitos em um programa de computador, esse índice seria de 21,4% para países como os Estados Unidos e os da Europa Ocidental, em que a incidência de infecção pelo T. gondii não varia com a idade. “O resultado, no entanto, seria diferente para muitos países da América do Sul, porque a incidência de infecção é claramente maior nos grupos etários mais jovens, especialmente entre os mais pobres”, disse.
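
Para dar uma ideia concreta do tipo de conta envolvido, segue um esboço mínimo em Python da fração atribuível populacional (fórmula de Levin), medida usual nesse tipo de estimativa. Os valores de prevalência e de risco relativo abaixo são hipotéticos, escolhidos apenas para ilustrar a ordem de grandeza do resultado citado, e não correspondem necessariamente aos parâmetros exatos usados por Smith.

# Esboço ilustrativo: fração atribuível populacional (PAF) de Levin.
# Os valores de p e rr são suposições apenas para fins de ilustração.
def paf(p, rr):
    """Fração de casos que seria evitada se a exposição fosse eliminada."""
    return p * (rr - 1) / (1 + p * (rr - 1))

p_infeccao = 0.22        # prevalência hipotética de infecção por T. gondii
rr_esquizofrenia = 2.2   # risco relativo hipotético associado à infecção

print(f"PAF ≈ {paf(p_infeccao, rr_esquizofrenia):.1%}")  # ≈ 21%, ordem de grandeza do valor citado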

Só no Brasil – país que tem o maior índice mundial de infectados (66,7%) –, cerca de 126 milhões de pessoas são hospedeiras do parasita. A proliferação deste, aliás, é mais comum nos países de clima tropical, principalmente nas nações mais pobres, onde há grandes concentrações urbanas sem saneamento básico.
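
Uma verificação rápida, em Python, mostra que os dois números do parágrafo são consistentes entre si e com a população brasileira da época (em torno de 190 milhões de habitantes):

# Consistência: 126 milhões de infectados a uma taxa de 66,7%
# implicam uma população total de aproximadamente 189 milhões.
infectados = 126e6
fracao_infectada = 0.667
print(f"População implícita ≈ {infectados / fracao_infectada / 1e6:.0f} milhões")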

O mal é transmitido tanto pela ingestão de carne crua e terra contaminada quanto por meio do contato direto com secreções e fezes de gato. Também pode ser repassado ao feto durante a gravidez através da placenta – sendo recomendado, inclusive, que mulheres grávidas evitem contato com gatos durante o período de gestação. Apesar de ser uma infecção comum tanto em pessoas quanto em animais, o Toxoplasma afeta especialmente os gatos – únicos seres em que o parasita consegue se reproduzir.

Suicídio

Pesquisas feitas em diversos países têm demonstrado como o T. gondii pode estar relacionado a problemas neurológicos, como a depressão, principalmente em pessoas do sexo feminino. Segundo reportagem da revista Scientific American, um desses estudos, desenvolvido no Instituto de Pesquisas Médicas Stanley, em Maryland (EUA), concluiu que mulheres infectadas com quantidades altas de Toxoplasma apresentavam maior tendência a ter filhos esquizofrênicos.

Outro trabalho, produzido por cientistas dinamarqueses, obteve um resultado ainda mais alarmante. Segundo a pesquisa, as mulheres que tinham infecções pelo parasita apresentaram tendência 54% maior de tentar o suicídio. Em geral, as tentativas eram violentas, utilizando armas brancas e de fogo. Entre aquelas sem histórico de doenças mentais, o índice também foi alto: elas tinham 56% mais chances de cometer atentado contra a própria vida.

Os efeitos do protozoário sobre o organismo também são evidentes em ratos. De acordo com pesquisas, o parasita pode alterar o comportamento desses animais, fazendo-os, por exemplo, perder o medo do cheiro de gatos – alguns chegam até mesmo a sentir atração sexual pelo odor. Além disso, pesquisadores descobriram que ratos infectados conseguem recuperar o comportamento normal tanto com remédios antiparasitários quanto com antipsicóticos.

Já se descobriu que a infecção aumenta os níveis do neurotransmissor conhecido como dopamina, que, em excesso, é um dos fatores associados à esquizofrenia. Isso porque o Toxoplasma possui um gene que codifica uma enzima fundamental para a produção de dopamina, sendo este o seu método de influência sobre o cérebro de seres humanos e animais. Os cientistas, agora, tentam entender de forma clara como o parasita se comporta no cérebro.

(Newsletter da SBMT)

http://sbmt.org.br/portal/casos-de-esquizofrenia-poderiam-ser-evitados-se-fosse-possivel-prevenir-infeccao-por-parasita-diz-estudo/?locale=pt-BR

CNPq cria Rede para otimizar produção de animais em laboratórios (JC)

Rebiotério prevê estimular produção e assegurar qualidade nos biotérios

Ao mesmo tempo em que corre para desenvolver métodos alternativos a fim de reduzir o número de animais em testes de laboratórios –  pela chamada Rede Nacional de Métodos Alternativos (RENAMA) – o governo decidiu criar uma Rede para adequar a produção em biotérios de todos os animais para propósitos científicos e didáticos, como ratos, camundongos e coelhos.

A intenção é atender de forma adequada e organizada à demanda nacional. O entendimento é de que o uso de animais ainda é imprescindível nos testes in vivo e que hoje existe um desequilíbrio entre a oferta e a procura no País, em razão do aumento considerável da produção científica nacional.

Na prática, o Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq), principal agência financiadora de pesquisa experimental do País, criou a chamada Rede Nacional de Biotérios de Produção de Animais para Fins Científicos, Didáticos e Tecnológicos (Rebiotério), informou, com exclusividade ao Jornal da Ciência, Marcelo Morales, diretor da área de Ciências Agrárias, Biológicas e da Saúde do CNPq, que comandará a rede.

A Rebiotério, segundo Morales, vai mapear, monitorar, otimizar e dar suporte à produção de animais utilizados em experimentos científicos e em sala de aula. Todos os biotérios distribuídos pelo País serão cadastrados na rede. Para Morales, essa é uma tentativa de atender aos anseios da comunidade científica pela pesquisa de qualidade envolvendo animais.

Sem querer estimar o número de animais produzidos hoje em laboratórios para fins científicos, Morales destaca a atual necessidade da produção qualificada de animais em biotérios de produção para atender à demanda científica. Hoje, segundo disse, pesquisadores aguardam na fila um período de dois a cinco meses para receber animais com qualidade (principalmente os desprovidos de patógenos, Specific Pathogen Free – SPF) que possam ser utilizados em experimentos científicos. Atualmente, a produção com qualidade é vinculada apenas a alguns biotérios, que produzem para atender às próprias necessidades, e poucos são aqueles que produzem para outras instituições. Além disso, a importação desses animais se torna inviável, diante de barreiras sanitárias e do alto custo.

No caso de roedores, responsáveis por cerca de 70% do total de animais utilizados em pesquisas científicas, Morales afirmou que a necessidade estimada de produção é de 5 milhões desses animais por ano.
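
Se a mesma proporção de 70% valer também para a produção, a conta implícita no parágrafo é simples; um esboço meramente ilustrativo, em Python:

# Estimativa ilustrativa do total de animais, assumindo que roedores
# representem ~70% também da produção necessária (suposição, não dado do texto).
roedores_por_ano = 5e6
fracao_roedores = 0.70
total_estimado = roedores_por_ano / fracao_roedores
print(f"Total estimado ≈ {total_estimado / 1e6:.1f} milhões de animais/ano")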

Normas e legislações 

Além de propor políticas de fomento para a produção de animais em biotérios qualificados, a Rebiotério prevê, ainda, acompanhar a implementação efetiva de normas e legislações específicas adotadas para o uso de animais em experimentos científicos, conjuntamente com o Conselho Nacional de Controle de Experimentação Animal (Concea). Deverá também estimular a qualidade de produção nos biotérios e atender aos padrões internacionais de boas práticas de bem-estar animal.

Outra função é assegurar o controle sanitário e genético, averiguando o nível de patógenos, por exemplo, e reforçar os padrões éticos adotados para os animais produzidos em biotérios.

Capacitação profissional

Para garantir a qualidade de produção dos biotérios, a Rebiotério terá o papel, dentre outros, de estimular a capacitação e a qualificação de profissionais da área no Brasil e no exterior (bioteristas, veterinários, pesquisadores etc.), de modo a garantir que a produção de animais seja compatível com os padrões internacionais.

“Nossa intenção é fortalecer a produção de animais de experimentação, com ética e qualidade, fazendo com que o País se torne referência nessa área no mundo”, disse Morales, também professor associado da Universidade Federal do Rio de Janeiro (UFRJ), ex-coordenador do Conselho Nacional de Controle de Experimentação Animal (Concea) e ex-presidente da Sociedade Brasileira de Biofísica (SBBF).

Para fazer frente a tais desafios, o CNPq aprovou a viabilidade de parcerias internacionais que possam assegurar a produção sustentável e de qualidade nos biotérios. A intenção é ampliar o interesse de empresas internacionais com expertise nessa área, que hoje já se organizam e negociam sua instalação no Brasil.

Segundo Morales, a parceria com empresas estrangeiras pode ser por intermédio de transferência de tecnologia relacionada às práticas modernas de bioterismo; e pelo apoio à formação de pesquisadores e técnicos brasileiros dessa área no exterior.

Sem querer entrar no mérito do orçamento do CNPq, Morales informou que a qualificação desses profissionais pode ocorrer também pelas bolsas do Programa Ciência sem Fronteiras.

Composição da Rebiotério

Além do CNPq, a Rebiotério será composta pela comunidade científica, pela Secretaria de Políticas e Programas de Pesquisa e Desenvolvimento do Ministério da Ciência, Tecnologia e Inovação (Seped/MCTI) e pela Secretaria de Ciência, Tecnologia e Insumos Estratégicos (SCTIE), do Ministério da Saúde. Terá ainda participação do Conselho Nacional de Controle de Experimentação Animal (Concea), órgão vinculado ao MCTI, e de membros da Finep (Financiadora de Estudos e Projetos).

Da comunidade científica, haverá representantes da Sociedade Brasileira de Ciência em Animais de Laboratórios (SBCAL), da Sociedade Brasileira para o Progresso da Ciência (SBPC), da Academia Brasileira de Ciências (ABC) e do Conselho Nacional das Fundações Estaduais de Amparo à Pesquisa (Confap).

“Nossa intenção é que a rede tenha uma abrangência nacional”, observa Morales.

(Viviane Monteiro/ Jornal da Ciência)

Oxfam: Em 2016, 1% mais ricos terão mais dinheiro que o resto do mundo (Carta Capital)

19/1/2015 – 09h33

por Redação da Carta Capital

A redução da pobreza é um dos eixos da agenda de desenvolvimento pós-2015. Crianças na favela de Kallayanpur, uma das favelas urbanas em Daca, Bangladesh. Foto: ONU/Kibae Park 

ONG britânica divulga dados sobre a desigualdade social no mundo para tentar guiar as discussões do Fórum Econômico Mundial

Um estudo divulgado nesta segunda-feira 19 pela ONG britânica Oxfam afirma que, em 2016, as 37 milhões de pessoas que compõem o 1% mais rico da população mundial terão mais dinheiro do que os outros 99% juntos. O relatório tem o objetivo de influenciar as discussões a serem travadas no Fórum Econômico Mundial (FEM), que reúne os ricos e poderosos no resort suíço de Davos entre 21 e 24 de janeiro.

O estudo da Oxfam é baseado no relatório sobre a riqueza mundial que o banco Credit Suisse divulga anualmente desde 2010. Na versão mais recente, divulgada em outubro de 2014, o Credit Suisse mostrou que o 1% mais rico (com bens de 800 mil dólares no mínimo) detinha 48,2% da riqueza mundial, enquanto os outros 99% ficavam com 51,8%. No grupo dos 99%, também há uma significativa desigualdade: quase toda a riqueza está nas mãos dos 20% mais ricos, enquanto as demais pessoas dividem 5,5% do patrimônio.

No estudo divulgado nesta segunda, a Oxfam extrapolou os dados para o futuro e indica que em 2016 o 1% mais rico terá mais de 50% dos bens e patrimônios existentes no mundo. “Nós realmente queremos viver em um mundo no qual o 1% tem mais do que nós todos juntos?”, questionou Winnie Byanyima, diretora-executiva da Oxfam e co-presidente do Fórum Econômico Mundial. Em artigo publicado no site do FEM, Byanyima afirma que o fórum tem em 2015 o duplo desafio de conciliar a desigualdade social e as mudanças climáticas. “Tanto nos países ricos quanto nos pobres, essa desigualdade alimenta o conflito, corroendo as democracias e prejudicando o próprio crescimento”, afirma Byanyima.
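
A título de ilustração, um esboço mínimo em Python do tipo de extrapolação linear que sustenta projeções como essa. O valor de 48,2% em 2014 vem do texto; o ponto de partida de 2010 usado abaixo é apenas uma suposição ilustrativa, e não o dado exato empregado pela Oxfam.

# Extrapolação linear ilustrativa da participação do 1% mais rico na riqueza mundial.
share_2010 = 0.44    # valor hipotético, apenas para ilustrar o método
share_2014 = 0.482   # valor citado no texto (Credit Suisse, outubro de 2014)

taxa_anual = (share_2014 - share_2010) / 4
share_2016 = share_2014 + 2 * taxa_anual
print(f"Projeção para 2016 ≈ {share_2016:.1%}")  # ultrapassa a marca de 50%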

A diretora da Oxfam lembra que há algum tempo os que se preocupavam com a desigualdade eram acusados de ter “inveja”, mas que apenas em 2014 algumas personalidades como o papa Francisco, o presidente dos Estados Unidos, Barack Obama, e a diretora do Fundo Monetário Internacional (FMI), Christine Lagarde, manifestaram preocupação com a desigualdade social. “O crescente consenso: se não controlada, a desigualdade econômica vai fazer regredir a luta contra a pobreza e ameaçará a estabilidade global”, afirma.

A Oxfam mostra que a riqueza do 1% é derivada de atividades em poucos setores, sendo os de finanças e seguros os principais, e os de serviços médicos e de indústria farmacêutica dois setores com grande crescimento em 2013 e 2014. A Oxfam lembra que as companhias mais ricas do mundo usam seu dinheiro, entre outras coisas, para influenciar os governos por meio de lobbies, favorecendo seus setores. No caso particular dos Estados Unidos, que concentram, junto com a Europa, a maior parte dos integrantes do 1% mais rico, o lobby é particularmente prolífico, afirma a Oxfam, para mexer no orçamento e nos impostos do país, destinando a poucos os recursos que “deveriam ser direcionados em benefício de toda a população”.

Para a Oxfam, a desigualdade social não deve ser tratada como algo inevitável. A ONG lista uma série de medidas para colocar a diferença entre ricos e pobres sob controle, como fazer os governos trabalharem para seus cidadãos e terem a redução da desigualdade como objetivo; a promoção dos direitos e a igualdade econômica das mulheres; o pagamento de salários mínimos e a contenção dos salários de executivos; e o objetivo de o mundo todo ter serviços gratuitos de saúde e educação.

* Publicado originalmente no site Carta Capital.

Crise hídrica: Alarme! (Rede Nossa São Paulo)

19/1/2015 – 03h10

por Oded Grajew*

Diante da crise da água em São Paulo, o coordenador geral da Rede Nossa São Paulo e do Programa Cidades Sustentáveis faz um apelo às autoridades e aos cidadãos para que assumam as devidas responsabilidades. Confira:

A cidade de São Paulo está diante de uma catástrofe social, econômica e ambiental sem precedentes. O nível do sistema Cantareira está em cerca de 6% e segue baixando por volta de 0,1 ponto percentual ao dia. Isso significa que, em aproximadamente 60 dias, o sistema pode secar COMPLETAMENTE!
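
A aritmética por trás do alerta é direta; um esboço mínimo em Python, usando apenas os números citados no parágrafo acima:

# Dias até o sistema zerar, mantidos o nível e o ritmo de queda citados.
nivel_atual = 6.0    # % do volume do sistema Cantareira
queda_diaria = 0.1   # pontos percentuais por dia
print(f"Dias até zerar ≈ {nivel_atual / queda_diaria:.0f}")  # ≈ 60 dias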

O presidente da Sabesp declarou que o sistema pode ZERAR em março ou, na melhor das hipóteses, em junho deste ano. E NÃO HÁ UM PLANO B em curto prazo. Isto significa que seis milhões de pessoas ficarão praticamente SEM UMA GOTA DE ÁGUA ou com enorme escassez. Não é que haverá apenas racionamento ou restrição. Poderá haver ZERO de água, NEM UMA GOTA.

Você já se deu conta do que isto significa em termos sociais, econômicos (milhares de estabelecimentos inviabilizados e enorme desemprego) e ambientais? Você já se deu conta de que no primeiro momento a catástrofe atingirá os mais vulneráveis (pobres, crianças e idosos) e depois todos nós?

O que nos espanta é a passividade da sociedade e das autoridades diante da iminência desta monumental catástrofe. Todas as medidas tomadas pelas autoridades e o comportamento da sociedade são absolutamente insuficientes para enfrentar este verdadeiro cataclismo.

Parece que estamos todos anestesiados e impotentes para agir, para reagir, para pressionar, para alertar, para nos mobilizar em torno de propostas e, principalmente, de ações e planos de emergência de curto prazo e de políticas e comportamentos que levem a uma drástica transformação da nossa relação com o meio ambiente e com os recursos hídricos.

Há uma unanimidade de que esta é uma crise de LONGUÍSSIMA DURAÇÃO por termos deixado, permitido, que se chegasse a esta dramática situação. Agora, o que mais parece é que estamos acomodados e tranquilos num Titanic sem nos dar conta do iceberg que está se aproximando.

Nosso intuito, nosso apelo, nosso objetivo com este alarme é conclamar as autoridades, os formadores de opinião, as lideranças e os cidadãos a se conscientizarem urgentemente da gravíssima situação que vive a cidade, da dimensão da catástrofe que se aproxima a passos largos.

Precisamos parar de nos enganar. É fundamental que haja uma grande mobilização de todos para que se tomem ações e medidas à altura da dramática situação que vivemos. Deixar de lado rivalidades e interesses políticos, eleitorais, desavenças ideológicas. Não faltam conhecimentos, não faltam ideias, não faltam propostas (o Conselho da Cidade de São Paulo aprovou um grande conjunto delas). Mas faltam mobilização e liderança para enfrentar este imenso desafio.

Todos precisamos assumir nossa responsabilidade à altura do nosso poder, de nossa competência e de nossa consciência. O tempo está se esgotando a cada dia.

* Oded Grajew é empresário, coordenador da secretaria executiva da Rede Nossa São Paulo, presidente emérito do Instituto Ethos e idealizador do Fórum Social Mundial.

** Publicado originalmente no site Rede Nossa São Paulo.

(Rede Nossa São Paulo)

Can Humanity’s ‘Great Acceleration’ Be Managed and, If So, How? (Dot Earth, New York Times)

By Andrew C. Revkin | January 15, 2015 5:00 pm

Updated below | Through three-plus decades of reporting, I’ve been seeking ways to better mesh humanity’s infinite aspirations with life on a finite planet. (Do this Google search — “infinite aspirations” “finite planet” Revkin – to get the idea. Also read the 2002 special issue of Science Times titled “Managing Planet Earth.”)

So I was naturally drawn to a research effort that surfaced in 2009 defining a “safe operating space for humanity” by estimating a set of nine “planetary boundaries” for vital-sign-style parameters like levels of greenhouse gases, flows of nitrogen and phosphorus and loss of biodiversity.

A diagram from a 2009 analysis of “planetary boundaries” showed humans were already hitting limits (red denotes danger zones). Credit: Stockholm Resilience Center

The same was true for a related “Great Acceleration” dashboard showing humanity’s growth spurt (the graphs below), created by the International Geosphere-Biosphere Program.

A graphic illustrating how human social and economic trends, resource appetites and environmental impacts have surged since 1950. Credit: International Geosphere-Biosphere Program

Who would want to drive a car without gauges tracking engine heat, speed and fuel levels? I use that artwork in all my talks.

Now, both the dashboard of human impacts and planetary boundaries have been updated. For more detail on the dashboard, explore the website of the geosphere-biosphere organization.

In a prepared statement, a co-author of the acceleration analysis, Lisa Deutsch, a senior lecturer at the Stockholm Resilience Center, saw little that was encouraging:

Of all the socio-economic trends only construction of new large dams seems to show any sign of the bending of the curves – or a slowing of the Great Acceleration. Only one Earth System trend indicates a curve that may be the result of intentional human intervention – the success story of ozone depletion. The leveling off of marine fisheries capture since the 1980s is unfortunately not due to marine stewardship, but to overfishing.

And all that acceleration (mostly since 1950, as I wrote yesterday) has pushed us out of four safe zones, according to the 18 authors of the updated assessment of environmental boundaries, published online today by the journal Science here: “Planetary Boundaries: Guiding human development on a changing planet.”

The paper is behind a paywall, but the Stockholm Resilience Center, which has led this work, has summarized the results, including the authors’ conclusion that we’re in the danger zone on four of the nine boundaries: climate change, loss of biosphere integrity, land-system change and alteration of biogeochemical cycles (for the nutrients phosphorus and nitrogen).

Their work has been a valuable prod to the community of scientists and policy analysts aiming to smooth the human journey, resulting in strings of additional studies. Some followup work has supported the concept, and even broadened it, as with a 2011 proposal by Kate Raworth of the aid group Oxfam to add social-justice boundaries, as well: “A Safe and Just Space for Humanity – Can We Live Within the Doughnut?”

In 2011, Kate Raworth at the aid group Oxfam proposed a framework for safe and just human advancement illustrated as a doughnut-shaped zone. Credit: Oxfam

But others have convincingly challenged many of the boundaries and also questioned their usefulness, given how both impacts of, and decisions about, human activities like fertilizing fields or tapping aquifers are inherently local — not planetary in scale. (You’ll hear from some critics below.)

In 2012, the boundaries work helped produce a compelling alternative framework for navigating the Anthropocene — “Planetary Opportunities: A Social Contract for Global Change Science to Contribute to a Sustainable Future.”

I hope the public (and policy makers) will realize this is not a right-wrong, win-lose science debate. A complex planet dominated by a complicated young species will never be managed neatly. All of us, including environmental scientists, will continue to learn and adjust.

I was encouraged, for instance, to see the new iteration of the boundaries analysis take a much more refined view of danger zones, including more of an emphasis on the deep level of uncertainty in many areas:

A diagram from a paper defining “planetary boundaries” for human activities shows areas of greatest risk in red. Credit: Science

The authors, led by Will Steffen of Australian National University and Johan Rockström of the Stockholm Resilience Center, have tried to refine how they approach risks related to disrupting ecosystems – not simply pointing to lost biological diversity but instead devising a measure of general “biosphere integrity.”

That measure, and the growing human influence on the climate through the buildup of long-lived greenhouse gases, are the main sources of concern, they wrote:

Two core boundaries – climate change and biosphere integrity – have been identified, each of which has the potential on its own to drive the Earth System into a new state should they be substantially and persistently transgressed.

But the bottom line has a very retro feel, adding up to the kind of ominous but generalized warnings that many environmental scientists and other scholars began giving with the “Limits to Growth” analysis in 1972. Here’s a cornerstone passage from the paper, reprising a longstanding view that the environmental conditions of the Holocene – the equable span since the end of the last ice age – are ideal:

The precautionary principle suggests that human societies would be unwise to drive the Earth System substantially away from a Holocene-like condition. A continuing trajectory away from the Holocene could lead, with an uncomfortably high probability, to a very different state of the Earth System, one that is likely to be much less hospitable to the development of human societies.

I sent the Science paper to a batch of environmental researchers who have been constructive critics of the Boundaries work. Four of them wrote a group response, posted below, which includes this total rejection of the idea that the Holocene is somehow special:

[M]ost species evolved before the Holocene and the contemporary ecosystems that sustain humanity are agroecosystems, urban ecosystems and other human-altered ecosystems….

Here’s their full response:

The Limits of Planetary Boundaries
Erle Ellis, Barry Brook, Linus Blomqvist, Ruth DeFries

Steffen et al (2015) revise the “planetary boundaries framework” initially proposed in 2009 as the “safe limits” for human alteration of Earth processes (Rockstrom et al 2009). Limiting human harm to environments is a major challenge and we applaud all efforts to increase the public utility of global-change science. Yet the planetary boundaries (PB) framework – in its original form and as revised by Steffen et al – obscures rather than clarifies the environmental and sustainability challenges faced by humanity this century.

Steffen et al concede that “not all Earth system processes included in the PB have singular thresholds at the global/continental/ocean basin level.” Such processes include biosphere integrity (see Brook et al 2013), biogeochemical flows, freshwater use, and land-system change. “Nevertheless,” they continue, “it is important that boundaries be established for these processes.” Why? Where a global threshold is unknown or lacking, there is no scientifically robust way of specifying such a boundary – determining a limit along a continuum of environmental change becomes a matter of guesswork or speculation (see e.g. Bass 2009; Nordhaus et al 2012). For instance, the land-system boundary for temperate forest is set at 50% of forest cover remaining. There is no robust justification for why this boundary should not be 40%, or 70%, or some other level.

While the stated objective of the PB framework is to “guide human societies” away from a state of the Earth system that is “less hospitable to the development of human societies”, it offers little scientific evidence to support the connection between the global state of specific Earth system processes and human well-being. Instead, the Holocene environment (the most recent 10,000 years) is assumed to be ideal. Yet most species evolved before the Holocene and the contemporary ecosystems that sustain humanity are agroecosystems, urban ecosystems and other human-altered ecosystems that in themselves represent some of the most important global and local environmental changes that characterize the Anthropocene. Contrary to the authors’ claim that the Holocene is the “only state of the planet that we know for certain can support contemporary human societies,” the human-altered ecosystems of the Anthropocene represent the only state of the planet that we know for certain can support contemporary civilization.

Human alteration of environments produces multiple effects, some advantageous to societies, such as enhanced food production, and some detrimental, like environmental pollution with toxic chemicals, excess nutrients and carbon emissions from fossil fuels, and the loss of wildlife and their habitats. The key to better environmental outcomes is not in ending human alteration of environments but in anticipating and mitigating their negative consequences. These decisions and trade-offs should be guided by robust evidence, with global-change science investigating the connections and tradeoffs between the state of the environment and human well-being in the context of the local setting, rather than by framing and reframing environmental challenges in terms of untestable assumptions about the virtues of past environments.

Even without specifying exact global boundaries, global metrics can be highly misleading for policy. For example, with nitrogen, where the majority of human emissions come from synthetic fertilizers, the real-world challenge is to apply just the right amount of nitrogen to optimize crop yields while minimizing nitrogen losses that harm aquatic ecosystems. Reducing fertilizer application in Africa might seem beneficial globally, yet the result in this region would be even poorer crop yields without any notable reduction in nitrogen pollution; Africa’s fertilizer use is already suboptimal for crop yields. What can look like a good or a bad thing globally can prove exactly the opposite when viewed regionally and locally. What use is a global indicator for a local issue? As in real estate, location is everything.

Finally, and most importantly, the planetary boundaries are burdened not only with major uncertainties and weak scientific theory – they are also politically problematic. Real world environmental challenges like nitrogen pollution, freshwater consumption and land-use change are ultimately a matter of politics, in the sense that there are losers and winners, and solutions have to be negotiated among many stakeholders. The idea of a scientific expert group determining top-down global limits on these activities and processes ignores these inevitable trade-offs and seems to preclude democratic resolution of these questions. It has been argued that (Steffen et al 2011):

Ultimately, there will need to be an institution (or institutions) operating, with authority, above the level of individual countries to ensure that the planetary boundaries are respected. In effect, such an institution, acting on behalf of humanity as a whole, would be the ultimate arbiter of the myriad trade-offs that need to be managed as nations and groups of people jockey for economic and social advantage. It would, in essence, become the global referee on the planetary playing field.

Here the planetary boundaries framework reaches its logical conclusion with a political scenario that is as unlikely as it is unpalatable. There is no ultimate global authority to rule over humanity or the environment. Science has a tremendously important role to play in guiding environmental management, not as a decider, but as a resource for deliberative, evidence-based decision making by the public, policy makers, and interest groups on the challenges, trade-offs and possible courses of action in negotiating the environmental challenges of societal development (DeFries et al 2012). Proposing that science itself can define the global environmental limits of human development is simultaneously unrealistic, hubristic, and a strategy doomed to fail.

I’ve posted the response online as a standalone document for easier downloading; there you can view the authors’ references, as well.

Update, 9:40 p.m.| Will Steffen, the lead author of the updated Planetary Boundaries analysis, sent this reply to Ellis and co-authors tonight:

Response to Ellis et al. on planetary boundaries

Of course we welcome constructive debate on and criticism of the planetary boundaries (PB) update paper. However, the comments of Ellis et al. appear to be more of a knee-jerk reaction to the original 2009 paper than a careful analysis of the present paper. In fact, one wonders if they have even read the paper, including the Supplementary Online Material (SOM) where much methodological detail is provided.

One criticism seems to be based on a rather bizarre conflation of a state of the Earth System with (i) the time when individual biological species evolved, and (ii) the nature and distribution of human-altered terrestrial ecosystems. This makes no sense from an Earth System science perspective. The state of the Earth System (a single system at the planetary level) also involves the oceans, the atmosphere, the cryosphere and very important processes like the surface energy balance and the flows and transformation of elements. It is the state of this single complex system, which provides the planetary life support system for humanity, that the PB framework is concerned with, not with fragmentary bits of it in isolation.

In particular, the PB framework is based on the fact – and I emphasise the word “fact” – that the relatively stable Holocene state of the Earth System (the past approximately 11,700 years) is the only state of the System that has allowed the development of agriculture, urban settlements and complex human societies. Some argue that humanity can now survive, and even thrive, in a rapidly destabilizing planetary environment, but that is a belief system based on supreme technological optimism, and is not a reasoned scientifically informed judgment. Also, Ellis et al. seem to conflate human alteration of terrestrial environments with human alteration of the fundamental state of the Earth System as a whole. These are two vastly different things.

The criticisms show further misunderstanding of the nature of complex systems like the Earth System and how they operate. For example, Ellis et al. claim that a process is not important unless it has a threshold. Even a cursory understanding of the carbon cycle, for example, shows that this is nonsense. Neither the terrestrial nor the marine carbon sinks have known large-scale thresholds, yet they are exceedingly important for the functioning of the climate system, which does indeed have known large-scale thresholds such as the melting of the Greenland ice sheet. Sure, it is more challenging to define boundaries for processes that are very important for the resilience of the Earth System but don’t have large-scale thresholds, but it is not impossible. The zone of uncertainty tends to be larger for these boundaries, but as scientific understanding improves, this zone will narrow.

An important misrepresentation of our paper is the assertion that we are somehow suggesting that fertilizer application in Africa be reduced. Nothing could be further from the truth. In fact, if Ellis et al had taken the time to read the SOM, the excellent paper by Carpenter and Bennett (2011) on the P boundary, the equally excellent paper by de Vries et al. (2013) on the N boundary, and the paper by Steffen and Stafford Smith (2013) on the distribution and equity issues for many of the PBs, including N and P, they wouldn’t have made such a misrepresentation.

Finally, the Steffen et al. (2011) paper seems to have triggered yet another misrepresentation. The paragraph of the paper quoted by Ellis et al. is based on contributions from two of the authors who are experts in institutions and governance issues, and does not come from the natural science community. Nowhere in the paragraph quoted, nor in the Steffen et al. (2011) paper as a whole, is there the proposal for “a scientific expert group determining top-down global limits…”. The paragraph reprinted by Ellis et al. doesn’t mention scientists at all. That is a complete misrepresentation of our work.

We reiterate that we very much welcome careful and constructive critiques of the PB update paper, preferably in the peer-reviewed literature. In fact, such critiques of the 2009 PB paper were very helpful in developing the 2015 paper. Knee-jerk reactions in the blogosphere make for interesting reading, but they are far less useful in advancing the science.

Update, Jan. 16, 2:09 p.m. | Johan Rockström and Katherine Richardson, authors of the boundaries analysis, sent these additional reactions to the Ellis et al. critique:

We are honored that Erle Ellis, Barry Brook, Linus Blomqvist and Ruth DeFries (Ellis et al.) show such strong interest in our Planetary Boundaries research. The 2015 science update draws upon the over 60 scientific articles that have been published specifically scrutinizing different aspects of the Planetary Boundaries framework (amongst them the contributions by all these four researchers), and the most recent advancements in Earth System science. This new paper scientifically addresses and clarifies all of the natural science related aspects of Ellis et al.’s critique. It can also be noted that Ellis et al.’s critique simply echoes the standpoints regarding Planetary Boundaries research that the same group (Blomqvist et al., 2012) brought forward in 2012. Now, as then, their criticisms seem largely to be based on misunderstandings and their own viewpoints:

(1) We have never argued that there are planetary scale tipping points for all Planetary Boundary processes. Furthermore, there does not need to be a tipping point for these processes and systems in order for them to function as key regulators of the stability of the Earth system. A good example here is the carbon sink in the biosphere (approximately 4.5 Gt/year) which has doubled over the past 50 years in response to human emissions of CO2 and, thus, provides a good example of Earth resilience at play;

(2) Establishing the Planetary Boundaries, i.e. identifying Earth System scale boundaries for environmental processes that regulate the stability of the planet, does not (of course) contradict or replace the need for local action, transparency and democratic processes. Our society has long accepted the need for local – and to some extent regional- environmental management. Scientific evidence has now accumulated that indicates a further need for management of some environmental challenges at the global level. Many years of multi-lateral climate negotiation indicate a recognized need for global management of the CO2 emissions that occur locally. Our Planetary Boundaries research identifies that there are also other processes critical to the functioning of the Earth System that are so impacted by human activities that they, too, demand management at the global level. Ours is a positive – not a doomsday – message. It will come as no surprise to any reader that there are environmental challenges associated with all of the 9 Earth System functions we examine. Through our research, we offer a framework that can be useful in developing management at a global level.

It is important to emphasize that Ellis et al. associate socio-political attributes to our work that do not exist. The Science paper published today (16th January 2015), is a natural science update and advancement of the planetary boundaries framework. It makes no attempt to enter the (very important) social science realm of equity, institutions or global governance. The implications attributed to the PB framework must, then, reflect Ellis et al.’s own normative values. Furthermore, Ellis et al. argue that the “key to better environmental outcomes is not ending human alteration” but “anticipating and mitigating the negative consequences” of human environmental perturbation. While Planetary Boundaries research does not dictate how societies should use the insights it provides, “anticipating negative consequences” is at the absolute core of our approach!

Regarding Earth system tipping points. As Will Steffen points out in his earlier response, it would have been scientifically more correct for Ellis et al. to refer not only to their own assessment of uncertainties regarding a potential biosphere tipping point but also to the response to their article by Terry Hughes et al. (2014). These researchers presented the current state of empirical evidence concerning changes in interactions and feedbacks and how they can (in several cases do!) trigger tipping points at ecosystem and biome scale, and that such non-linear dynamics at local to regional scale can add up to impacts at the Earth system scale.

A different worldview. The Ellis et al. critique appears not to be a scientific criticism per se but rather is based on their own interpretation of differences in worldview. They do not substantively put in question the stability of the Earth system as a basis for human development– see Will Steffen’s response. Thus, it appears that we and Ellis et al. are in agreement here. Of course species and ecosystems have evolved prior to the Holocene but only in the stable environment of the Holocene have humans been able to exploit the Earth system at scale (e.g., by inventing agriculture as a response to a stable hydro-climate in the Holocene).

Ellis et al. argue that the only constructive avenue is to “investigate the connections and trade-offs between the state of the environment and human well-being in the context of the local setting…”. This is clearly not aligned with current scientific evidence. In the Anthropocene, there is robust evidence showing that we need to address global environmental change at the global level, as well as at the regional, national and local contexts, and in particular understanding cross-scale interactions between them.

On global governance. It seems hardly surprising, given the Ellis et al.’s misunderstanding of the Planetary Boundaries framework that their interpretation of the implications of operationalizing the framework rests also on misunderstandings. They claim the Planetary Boundaries framework translates to an “ultimate global authority to rule over humanity”. No one would argue that the current multi-lateral climate negotiations are an attempt to establish “ultimate global authority over humanity” and this is certainly never been suggested by the Planetary Boundaries research. In essence, the Planetary Boundary analysis simply identifies Earth System processes that – in the same manner as climate – regulate the stability of the Earth System, and if impacted too far by human activities potentially can disrupt the functioning of the Earth System. The Planetary Boundaries is, then, nothing more than a natural sciences contribution to an important societal discussion and which presents evidence which can support the definition of Planetary Boundaries to safeguard a stable and resilient Earth system. How this then translates to governance is another issue entirely and important social science contributions have addressed these (Galaz et al 2012). As our research shows, there is natural science evidence that global management of some environmental challenges is necessary. From the social science literature (Biermann et al., 2012) as well as from real world policy making, we see that such global scale regulation is possible to construct in a democratic manner and does establish a safe operating space, e.g. the Montreal protocol, a global agreement to address one of the identified planetary boundaries and which, to our knowledge, is never referred to as a “global authority ruling over humanity”. As noted above, the UNFCCC process is also fundamentally concerned with establishing the global “rules of the game” by which society can continue to develop within a climate planetary boundary. The Aichi targets (within the UN Convention on Biological Diversity) of setting aside marine and terrestrial areas for conservation are also good examples of the political translation of a science based concern over global loss of biodiversity. The coming SDG (Sustainable Development Goals) framework includes a proposed set of four goals (oceans, climate, biodiversity and freshwater), which is a de-facto example of applying planetary boundary thinking to create a global framework for safeguarding a stable environment on the planet for societies and communities across the world. We find it interesting – and encouraging – that societies and the world community are already developing management tools within several “planetary boundary domains”. In all cases, this is happening in good democratic order and building upon bottom-up processes and informed by science. This ought to be reassuring for Ellis et al. who portray implementation of Planetary Boundary thinking as a dark force of planetary rule.

*   *   *

[Reaction]

The Limits of Planetary Boundaries 2.0 (Brave New Climate)

Back in 2013, I led some research that critiqued the ‘Planetary Boundaries‘ concept (my refereed paper, Does the terrestrial biosphere have planetary tipping points?, appeared in Trends in Ecology & Evolution). I also blogged about this here: Worrying about global tipping points distracts from real planetary threats.

Today a new paper appeared in the journal Science, called “Planetary boundaries: Guiding human development on a changing planet”, which attempts to refine and clarify the concept. It states that four of nine planetary boundaries have been crossed, re-imagines the biodiversity boundary as one of ‘biosphere integrity’, and introduces the concept of ‘novel entities’. A popular summary in the Washington Post can be read here. On the invitation of New York Times “Dot Earth” reporter Andy Revkin, my colleagues and I have written a short response, which I reproduce below. The full Dot Earth article can be read here.

The Limits of Planetary Boundaries
Erle Ellis, Barry Brook, Linus Blomqvist, Ruth DeFries

Steffen et al (2015) revise the “planetary boundaries framework” initially proposed in 2009 as the “safe limits” for human alteration of Earth processes (Rockstrom et al 2009). Limiting human harm to environments is a major challenge and we applaud all efforts to increase the public utility of global-change science. Yet the planetary boundaries (PB) framework – in its original form and as revised by Steffen et al – obscures rather than clarifies the environmental and sustainability challenges faced by humanity this century.

Steffen et al concede that “not all Earth system processes included in the PB have singular thresholds at the global/continental/ocean basin level.” Such processes include biosphere integrity (see Brook et al 2013), biogeochemical flows, freshwater use, and land-system change. “Nevertheless,” they continue, “it is important that boundaries be established for these processes.” Why? Where a global threshold is unknown or lacking, there is no scientifically robust way of specifying such a boundary – determining a limit along a continuum of environmental change becomes a matter of guesswork or speculation (see e.g. Bass 2009; Nordhaus et al 2012). For instance, the land-system boundary for temperate forest is set at 50% of forest cover remaining. There is no robust justification for why this boundary should not be 40%, or 70%, or some other level.

While the stated objective of the PB framework is to “guide human societies” away from a state of the Earth system that is “less hospitable to the development of human societies”, it offers little scientific evidence to support the connection between the global state of specific Earth system processes and human well-being. Instead, the Holocene environment (the most recent 10,000 years) is assumed to be ideal. Yet most species evolved before the Holocene and the contemporary ecosystems that sustain humanity are agroecosystems, urban ecosystems and other human-altered ecosystems that in themselves represent some of the most important global and local environmental changes that characterize the Anthropocene. Contrary to the authors’ claim that the Holocene is the “only state of the planet that we know for certain can support contemporary human societies,” the human-altered ecosystems of the Anthropocene represent the only state of the planet that we know for certain can support contemporary civilization.

Human alteration of environments produces multiple effects, some advantageous to societies, such as enhanced food production, and some detrimental, like environmental pollution with toxic chemicals, excess nutrients and carbon emissions from fossil fuels, and the loss of wildlife and their habitats. The key to better environmental outcomes is not in ending human alteration of environments but in anticipating and mitigating their negative consequences. These decisions and trade-offs should be guided by robust evidence, with global-change science investigating the connections and tradeoffs between the state of the environment and human well-being in the context of the local setting, rather than by framing and reframing environmental challenges in terms of untestable assumptions about the virtues of past environments.

Even without specifying exact global boundaries, global metrics can be highly misleading for policy. For example, with nitrogen, where the majority of human emissions come from synthetic fertilizers, the real-world challenge is to apply just the right amount of nitrogen to optimize crop yields while minimizing nitrogen losses that harm aquatic ecosystems. Reducing fertilizer application in Africa might seem beneficial globally, yet the result in this region would be even poorer crop yields without any notable reduction in nitrogen pollution; Africa’s fertilizer use is already suboptimal for crop yields. What can look like a good or a bad thing globally can prove exactly the opposite when viewed regionally and locally. What use is a global indicator for a local issue? As in real estate, location is everything.

Finally, and most importantly, the planetary boundaries are burdened not only with major uncertainties and weak scientific theory – they are also politically problematic. Real world environmental challenges like nitrogen pollution, freshwater consumption and land-use change are ultimately a matter of politics, in the sense that there are losers and winners, and solutions have to be negotiated among many stakeholders. The idea of a scientific expert group determining top-down global limits on these activities and processes ignores these inevitable trade-offs and seems to preclude democratic resolution of these questions. It has been argued that (Steffen et al 2011):

Ultimately, there will need to be an institution (or institutions) operating, with authority, above the level of individual countries to ensure that the planetary boundaries are respected. In effect, such an institution, acting on behalf of humanity as a whole, would be the ultimate arbiter of the myriad trade-offs that need to be managed as nations and groups of people jockey for economic and social advantage. It would, in essence, become the global referee on the planetary playing field.

Here the planetary boundaries framework reaches its logical conclusion with a political scenario that is as unlikely as it is unpalatable. There is no ultimate global authority to rule over humanity or the environment. Science has a tremendously important role to play in guiding environmental management, not as a decider, but as a resource for deliberative, evidence-based decision making by the public, policy makers, and interest groups on the challenges, trade-offs and possible courses of action in negotiating the environmental challenges of societal development (DeFries et al 2012). Proposing that science itself can define the global environmental limits of human development is simultaneously unrealistic, hubristic, and a strategy doomed to fail.

Siberian Arctic permafrost decay and methane escape (Climatestate)

Added by Chris Machens on January 18, 2015

Widespread gas release from the seabed offshore the West Yamal Peninsula suggests that permafrost there has degraded more significantly than previously thought. Gas is released over an area of at least 7,500 km² in water depths greater than 20 m. (1)

Tromsø, Norway: Centre for Arctic Gas Hydrate (CAGE): It was previously proposed that the permafrost in the Kara Sea, and other Arctic areas, extends to water depths up to 100 meters, creating a seal that gas cannot bypass. Portnov and colleagues have found that the West Yamal shelf is leaking, profoundly, at depths much shallower than that.

A significant amount of gas is leaking at depths between 20 and 50 meters. This suggests that the continuous permafrost seal is much smaller than proposed. Close to the shore the permafrost seal may be a few hundred meters thick, but it tapers off towards 20 meters of water depth. And it is fragile.

Evolution of permafrost

Portnov used mathematical models to map the evolution of the permafrost, and thus calculated its degradation since the end of the last ice age. The evolution of permafrost gives indication to what may happen to it in the future.

Basically, the permafrost is thawing from two sides. The interior of the Earth is warming the permafrost from the bottom up – the geothermal heat flux, an ongoing process. Thus, if the bottom ocean temperature is −0.5°C, permafrost of the maximal possible thickness would likely take about 9,000 years to thaw. But if the water temperature increases, the process would go much faster, because the thawing would also happen from the top down.
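
A rough back-of-envelope sketch, in Python, of why bottom-up thaw alone takes millennia. All parameter values below are generic, order-of-magnitude assumptions for illustration only; they are not figures from Portnov's model.

# Time for geothermal heat alone to melt the ice in a permafrost column.
# Assumed values: thickness, ice content and heat flux are illustrative only.
SECONDS_PER_YEAR = 3.156e7
LATENT_HEAT_ICE = 3.34e5   # J/kg, latent heat of fusion of ice
ICE_DENSITY = 917.0        # kg/m^3

thickness_m = 200.0        # assumed permafrost thickness
ice_fraction = 0.3         # assumed volumetric ice content
geothermal_flux = 0.06     # W/m^2, typical geothermal heat flux

energy_per_m2 = thickness_m * ice_fraction * ICE_DENSITY * LATENT_HEAT_ICE
thaw_years = energy_per_m2 / geothermal_flux / SECONDS_PER_YEAR
print(f"Bottom-up thaw time ≈ {thaw_years:,.0f} years")  # on the order of 10,000 years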

“If the temperature of the oceans increases by two degrees as suggested by some reports, it will accelerate the thawing to the extreme. A warming climate could lead to an explosive gas release from the shallow areas.”(2)

Impact study

Another study, based on a coupled climate–carbon cycle model (GCM), assessed a 1,000-fold methane increase (from <1 to 1,000 ppmv) released in a single pulse from methane hydrates (based on carbon estimates for the PETM of ~2,000 GtC), and concluded it would raise atmospheric temperatures by more than 6°C within 80 years. Further, carbon stored in the land biosphere would decrease by more than 25%, suggesting a critical situation for ecosystems and farming, especially in the tropics. (3)
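
As a rough consistency check on the scale of that scenario, one can convert 1,000 ppmv of methane into gigatonnes of carbon using only standard atmospheric constants; this back-of-envelope Python sketch is not the study's own accounting, but it lands close to the ~2,000 GtC figure.

# Convert a methane mixing ratio (ppmv) into gigatonnes of carbon.
ATM_MASS_KG = 5.15e18        # total mass of the atmosphere
MEAN_MOLAR_MASS = 0.029      # kg/mol, mean molar mass of air
CARBON_MOLAR_MASS = 0.012    # kg/mol

moles_air = ATM_MASS_KG / MEAN_MOLAR_MASS                   # ~1.8e20 mol
gtc_per_ppmv = moles_air * 1e-6 * CARBON_MOLAR_MASS / 1e12  # kg -> Gt
print(f"1 ppmv CH4 ≈ {gtc_per_ppmv:.2f} GtC; 1,000 ppmv ≈ {1000 * gtc_per_ppmv:,.0f} GtC")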

In reality, though, it is more reasonable to assume that larger methane spikes would be in the one- to two-digit gigatonne ballpark, which is still a considerable amount. The PETM, 55 million years ago, is marked by several such larger spikes. Even without larger spikes, the current deglaciation in the northern hemisphere will add considerably to the atmospheric carbon budget. Hence, it is vital to reduce emissions now, to slow or even reverse these processes before things get out of control.

Related

An Arctic methane worst-case scenario http://www.realclimate.org/index.php/archives/2012/01/an-arctic-methane-worst-case-scenario/
An online model of methane in the atmosphere http://www.realclimate.org/index.php/archives/2012/01/an-online-model-of-methane-in-the-atmosphere/
Methane gas release from ocean might have led to AirAsia flight crash, expert speculates http://timesofindia.indiatimes.com/india/Methane-gas-release-from-ocean-might-have-led-to-AirAsia-flight-crash-expert-speculates/articleshow/45913234.cms

Teaser image via http://photography.nationalgeographic.com/photography/photo-of-the-day/methane-bubbles-thiessen/

Cientistas tentam responder: cadê as chuvas do Cantareira? (Folha de S.Paulo)

RAFAEL GARCIA

DE SÃO PAULO

18/01/2015 01h45

As tempestades que têm desabado sobre a cidade de São Paulo desde o fim de dezembro derrubaram árvores e postes, mas não serviram para abastecer as represas do Cantareira, prolongando a crise da água. Cientistas, porém, afirmam que isso é compreensível e era até esperado.

O problema que leva a essa situação paradoxal passa por uma espécie de pane que acontece pelo segundo verão consecutivo no sistema que os meteorologistas chamam de ZCAS (Zona de Convergência do Atlântico Sul). Trata-se de uma banda de nuvens que se estende desde o oeste da Amazônia até Mato Grosso, Minas Gerais e São Paulo e segue até o alto-mar.

“O sistema, que favoreceria as chuvas na região central do Brasil como um todo, não está atuando como deveria”, diz Anna Bárbara de Melo, do CPTEC (Centro de Previsão de Tempo e Estudos Climáticos), ligado ao Instituto Nacional de Pesquisas Espaciais.

Em dezembro, a ZCAS entrou em ação, mas no lugar “errado”. “O sistema ocorreu, só que favorecendo a região sul da Bahia e o Tocantins”, diz a pesquisadora. “Todo o estado de Minas, em dezembro, teve menos precipitação que o normal, com exceção de algumas áreas no norte.”

Segundo o climatologista Tércio Ambrizzi, da USP, o fenômeno pode estar relacionado à mudança climática.

“O fato de a atmosfera estar mais aquecida tem gerado uma variabilidade climática maior, enfatizando os eventos extremos”, diz o climatologista. “Em 2010 e 2011, nós estávamos enfrentando as inundações e mortes ocorridas nos deslizamentos do Rio de Janeiro”, conta Ambrizzi.

“Naquele ano o Cantareira estava com mais de 100% da capacidade, vertendo água e prejudicando algumas cidades. Três anos depois, passamos para um extremo seco com chuvas abaixo da média.”

CAPITAL

Mas, se falta chuva na Cantareira, por que tanta água na capital?

Isso se explica por um outro fenômeno, tipicamente relacionado às chuvas de verão: as ilhas de calor.

Em grandes concentrações urbanas, sem vegetação, o pouco de umidade que existe sobre essas áreas tende a subir em função do calor, até atingir temperaturas mais baixas e se condensar. Isso cria nuvens com uma extensão horizontal relativamente pequena, mas uma extensão vertical grande, com bastante água. A chuva então cai numa região específica, com muita violência, explica Ambrizzi. Em geral, tais tempestades ocorrem no início da noite.

These heavy downpours, concentrated within limited hours, have not even brought the historical average volume of water to the capital itself.

In the first half of January, the Mirante de Santana weather station, in São Paulo’s northern zone, recorded 71 mm of accumulated rainfall, against a historical average of 130 mm. In the Cantareira, farther north, the situation is worse, with only 60 mm of rain so far, less than half of what was expected. The reservoir level fell from 7.2% to 6.2%, at a time of year when it usually rises.
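As a minimal sketch of the arithmetic behind these figures (using only the numbers quoted in this article; the labels are mine):

```python
# Minimal sketch of the rainfall and reservoir arithmetic quoted above.

observed_mm, average_mm = 71.0, 130.0   # Mirante de Santana, first half of January
print(f"Capital: {observed_mm / average_mm:.0%} of the historical average")

# Cantareira: 60 mm observed, described as "less than half" of what was expected,
# which implies an expected total above 120 mm for the same period.
print("Cantareira: 60 mm observed, expected > 120 mm")

start_level, end_level = 7.2, 6.2       # reservoir level, percent of capacity
print(f"Reservoir level: {start_level}% -> {end_level}% "
      f"({end_level - start_level:+.1f} percentage points)")
```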

Some of the summer rains triggered by São Paulo’s urban sprawl could even have helped raise the level of some reservoirs in the Cantareira system, but here a third problem arises. According to hydrologists, the soil around most of the reservoirs was already so dry, baked by the sun, that much of the water was simply absorbed by the ground, without raising reservoir levels at all.

This “sponge effect,” says Ambrizzi, may have canceled out any benefit the summer rains brought to the Cantareira reservoirs closest to the capital.

R.I.P. Ulrich Beck (PopAnth)

Sociology loses one of its most important voices

by John McCreery on January 16, 2015


Ulrich Beck. Photo by International Students’ Committee via Wikimedia Commons.

The death of Ulrich Beck on January 1, 2015 stilled one of sociology’s most important voices.

Beck has long been one of my favourite sociologists. That is because the world he describes in his book Risk Society reminds me very much of the world of Chinese popular religion that I studied in Taiwan.

There are two basic similarities. First, in the risk society as Beck describes it, public pomp and ceremony and ostentatious displays of wealth recede. Wealth is increasingly privatized, concealed in gated communities, its excesses hidden from public view. Second, social inequality not only increases but increasingly takes the form of differential exposure to many forms of invisible risks.

In the world that Beck describes, signs of wealth continue to exist. Coronations and royal births, celebrity weddings, CEO yachts, the massive homes of the rich and famous and their McMansion imitators are all visible evidence that wealth still counts.

But, says Beck, inequality’s deeper manifestations are now in differences in institutions that shelter the rich and expose the poor to risks that include not only economic fluctuations but also extreme weather and climate change, chemical and biological pollution, mutating and drug-resistant diseases. The hidden plots of terrorists and of those who combat them might also be added to this list.

When I visualize what Beck is talking about when he says that wealth is becoming invisible, I imagine an airport. In the main concourse there is little visible difference between those checking in at the First or Business Class counters and those checking in for the cattle car seats in Economy. All will pass the same array of Duty Free shops on their way to their planes.

But while the masses wait at the gates, the elite relax in comfortable, concealed spaces, plied with food, drink and WiFi, in lounges whose entrances are deliberately understated. This is not, however, the height of luxury.

Keiko Yamaki, a former airline stewardess turned applied anthropologist, observes in her study of airline service culture that the real elite, the super rich, no longer fly with commercial airlines. They prefer their private jets. Even those in First Class are more likely to be from the mere 1% than from the 0.01%, who are now never seen checking in or boarding with the rest of us.

What, then, of invisible risks? The transactions that dominate the global economy are rarely, if ever, to be seen, negotiated in private and executed via encrypted digital networks. Financial institutions and the 1% who own them are protected from economic risk. The 99%, and especially those who live in the world’s poorest nations and slums, are not.

The invisible threats of nuclear, chemical and biological waste are concentrated where the poor live. Drug-resistant diseases spread like wildfire through modern transportation systems, but the wealthy are protected by advanced technology and excellent health care. The poor are not.

At the end of the day, however, all must face misfortune and death, and here is where the similarity to Chinese popular religion comes in.

My business is failing. My daughter is acting crazy. My son was nearly killed in a motorcycle accident. He’s been married for three years and his wife still hasn’t had a baby. I feel sick all the time. I sometimes feel faint or pass out.

Why? The world of Chinese popular religion has answers. Impersonal factors, the alignment of your birth date with the current configuration of the stars, Yin and Yang and the Five Elements, may mean that this is a bad time for you.

Worse still, you may have offended one of the gods, ghosts or ancestors who inhabit the invisible Yin world that exists alongside the Yang world in which we live. The possibilities are endless. You need to find experts, mediums, magicians or priests, who can identify the source of your problem and prescribe remedies for it. You know that most who claim to be experts are charlatans but hope nonetheless to find the real thing.

Note how similar this is to the world that Beck describes, where the things that we fear most are said to be caused by invisible powers, the market, the virus, pollution or climate change, for example. Most of us don’t understand these things. We turn to experts for advice; but so many claim to be experts and say so many different things.

How do we find those who “really know”? The rich may have access to experts with bigger reputations in finance, law, medicine, science or personal protection. But what does this really mean?

As I see it, all forms of consulting are magic. People with problems attribute them to invisible causes. They turn for help to those who claim special powers to diagnose and prescribe, and random chance alone will lead to identification of some who claim such powers as having “It,” that special something that produces desired results. Negative evidence will disappear in a context where most who claim special powers are known to be frauds.

The primary question for those looking for “It” is how to find the golden needle in a huge and constantly growing haystack. People turn to their social networks for recommendations by trusted others, whose trust may, however, be grounded in nothing more than having found someone whose recommendations are, by sheer random chance, located in the tail of the normal curve where “success” is concentrated.

I read Beck’s Risk Society long before I read Nassim Taleb’s Fooled by Randomness and The Black Swan. Taleb’s accounts of how traders who place lucky bets in the bond market are seen as geniuses with mystical insights into market mechanisms (at least until their funds collapse) seem to me to strongly support my theory of how all consulting works.
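As a minimal illustration of that argument (my own sketch, not anything from Beck or Taleb; the pool size and the number of calls are arbitrary assumptions), a pool of purely random forecasters will reliably produce a handful of apparent geniuses:

```python
# Minimal sketch (my own illustration, not from Beck or Taleb): purely random
# forecasters, yet a few of them still compile streaks that look like expertise.
import random

random.seed(42)

N_EXPERTS = 10_000   # assumed size of the "haystack" of self-proclaimed experts
N_CALLS = 10         # assumed number of consecutive yes/no market calls each makes

def lucky_streaks(n_experts: int, n_calls: int) -> int:
    """Count forecasters whose coin-flip calls happen to be right every single time."""
    perfect = 0
    for _ in range(n_experts):
        calls = [random.random() < 0.5 for _ in range(n_calls)]  # random "predictions"
        if all(calls):
            perfect += 1
    return perfect

winners = lucky_streaks(N_EXPERTS, N_CALLS)
print(f"{winners} of {N_EXPERTS} random forecasters were right {N_CALLS} times in a row")
# The expected count is N_EXPERTS / 2**N_CALLS, i.e. about 10: a small cohort of
# apparent geniuses whose only edge was chance.
```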

I read the words of “experts” who clamour for my attention and think of Taleb’s parable, the one in which a turkey has a perfectly consistent set of longitudinal data, stretching over nearly a year, demonstrating the existence of a perfectly predictable world in which the sun will rise every morning and the farmer will feed the turkey. Then comes the day before Thanksgiving, and the farmer turns up with an axe.

Be warned: reading books like those by Beck and Taleb may reinforce skepticism of claims to scientific and other expertise. But think about it. Which world would you rather live in: One where careful scientists slowly develop hypotheses and look systematically for evidence to test them? Or a world in which our natural human tendency to magical thinking has no brake at all?

For his leading me to these thoughts, I do, indeed, mourn the death of Ulrich Beck.

Katerina Kolozova on The Real in Contemporary Philosophy (Synthetic Zero)

Jan 15, 2015

The Real in Contemporary Philosophy

Katerina Kolozova

What Baudrillard called the perfect crime has become the malaise of the global(ized) intellectual at the beginning of the 21st century. The “perfect crime” in question is the murder of the real, carried out in such a way as to create the conviction that it never existed and that the traces of its erased existence were a mere symptom of its implacable originary absence. The era of postmodernism has been one of oversaturation with signification as a reality in its own right and also as the only possible reality. In 1995, with the publication of The Perfect Crime, Baudrillard declared the full realization of the danger he had warned against as early as 1976 in his book The Symbolic Exchange and Death. The latter book centered on the plea to affirm reality in its form of negativity, i.e., as death and the trauma of interrupted life. And he did not write of some static idea of the “Negative,” of “the constitutive lack” or “absence” as conceived by postmodernism and epistemological poststructuralism. The fact that, within the poststructuralist theoretical tradition, the real has been treated as the “inaccessible” and “the unthinkable” has caused a “freezing” of the category (of the real) as immutable, univocal and bracketed out of discursiveness as an unspoken axiom.

The romantic fascination with the possibility of self-invention, the dream of being the demiurge of oneself and one’s own reality, has been nesting in most postmodern readings of the idea of the utter linguistic constructedness of the self and its jouissance. The theoretical trend of what I would call the “cyber-optimism” of the ’90s was informed by the old European myth of transcending physical limitations by way of liberating desires from the body. Through prosthetic mediation, one would “emancipate” desire and re-create oneself as the product and the reality of pure signification. This is a theoretical trend mostly inspired by the work of Donna Haraway; in my view, however, it is one that has failed to see the terrifying void gaping behind that utter intentionality of the human mind which Haraway’s Simians, Cyborgs, and Women: The Reinvention of Nature (1991) and Primate Visions (1989) expose. She speaks of the Cyborg we all are, a creature of no origin, “the bastard of patriarchal militarism,” as the revolutionary subject that should aim to destroy the narratives of hierarchy which humanism and its anthropocentric vision of nature produce. Haraway radically problematizes the dualistic hierarchy which subdues and exploits nature. The Cyborg, that “militant bastard” of humanism, faces the horror of auto-seclusion in its narcissistic and auto-referential universe of dreams and desires, informed by the universe of its philosophical fathers.

The realization that humanity is fundamentally discursively constructed, including its entire history of ideas, its universe and horizon of thinkability, creates the following aporia: the limits of construction reveal a certain “out-there” against which one is constructed. The “out-there” has habitually been relegated by the postmodernists to the realm of nonsense, which deserves no theoretical consideration insofar as it could only assume the status of the unthinkable real. Nonetheless, Baudrillard appealed to think it as affirmed negativity, and the Lacanians attempted to think it as trauma or “constitutive lack.” In Bodies that Matter (1993), Butler assigned the status of the real to some of the laws of the phantasmatic construction of the body and gender. These efforts at invoking the real within a theory marked as predominantly poststructuralist seem to have failed to offer a satisfactory response to the ever increasing theoretical and existential need to reclaim the real. Hence the emergence, in the second half of the first decade of the 21st century, of strands of philosophical thought such as “speculative realism,” “object oriented ontology,” Badiousian-Žižekian realist tendencies in political theory and, finally, François Laruelle’s non-standard philosophy or non-philosophy. There has been a notable tendency in the last couple of years to subsume all these lines of thinking under the single label of “speculative realism.” The notion of “speculative realism” has taken on a life of its own despite the fact that virtually all of the prominent representatives of the heterogeneous theoretical trends it purports to refer to do not endorse the label or even reject it outright (except for some representatives of object oriented ontology).

All the trends to which the label of “speculative realism” is assigned, in spite of their fundamental differences, have something in common: they identify limitations to thought or discursivity precisely in the alleged “limitlessness” of thought proclaimed by most postmodernists. The main epistemic problem of postmodern philosophy identified by the “new realists” is what Quentin Meillassoux, in his book After Finitude (2008), called “correlationism.” At the heart of postmodern philosophy lies “correlationism,” a philosophical axiom based on the premise that thought can only “think itself,” that the real is inaccessible to knowledge and human subjectivity.

Laruelle’s non-philosophy radicalizes the problem by insisting that all that thought can operate with is indeed thinking itself, and that the hallucinatory world of representation is indeed the only means and topos for mediating the real, viz. for signifying it. Nonetheless, according to him, and radically differently from any postmodernist stance, the real can be thought and ought to be thought. Laruelle argues that one should produce thought in accordance with the syntax of the real: a thought affected by the real, one which accounts for the effects of the real. The real is not a meaning; it is not a truth of anything and does not possess an epistemic structure, since it is not mirrored by, and does not mirror, any accurate knowledge of its workings. Therefore, a thought established in accordance with the effects of the real is unilateral. In non-philosophy, this stance is called dualysis: the radically different status of the immanent (the real) and of the transcendental (thought) is affirmed, and by virtue of such affirmation the thinking subject attempts to describe some effects of sheer exteriority, i.e., the real. The interpretation of these effects makes use of “philosophical material,” but it succumbs not to philosophy but to the real as its authority in the last instance.

Such a fundamentally heretical stance with respect to the history of philosophical ideas, or to the idea of philosophy itself, creates the possibility of being radically innovative as far as political possibilities are concerned, both in terms of theory and of action. In The Cut of the Real, I attempt to explore the potential for radicalizing some core concepts of the legacy of feminist poststructuralist philosophy. By resorting to some of the methodological procedures proffered by non-philosophy, but also by unraveling a radically realist heuristics in the thought of Judith Butler, Luce Irigaray and Drucilla Cornell, I attempt to create the grounds for a language of politics “affected by immanence” (Laruelle).

SOURCE: http://www.cupblog.org/?p=9763


Katerina Kolozova, PhD, is the director of the Institute in Social Sciences and Humanities-Skopje and a professor of philosophy, sociological theory and gender studies at the University American College-Skopje. She is also a visiting professor at several universities in the former Yugoslavia and in Bulgaria (the State University of Skopje, the University of Sarajevo, the University of Belgrade and the University of Sofia, as well as the Faculty of Media and Communications of Belgrade). In 2009, Kolozova was a visiting scholar at the Department of Rhetoric (Program of Critical Theory) at the University of California-Berkeley. Kolozova is the author of Cut of the Real: Subjectivity in Poststructuralist Philosophy (2014), The Lived Revolution: Solidarity with the Body in Pain As the New Political Universal (2010), The Real and “I”: On the Limit and the Self (2006), The Crisis of the Subject with Judith Butler and Zarko Trajanoski (2002), and The Death and the Greeks: On Tragic Concepts of Death from Antiquity to Modernity (2000).