Monthly archive: February 2015

An evolutionary approach reveals new clues toward understanding the roots of schizophrenia (AAAS)

24-FEB-2015

MOLECULAR BIOLOGY AND EVOLUTION (OXFORD UNIVERSITY PRESS)

Is mental illness simply the evolutionary toll humans have to pay in return for our unique and superior cognitive abilities when compared to all other species? If so, why have often debilitating illnesses like schizophrenia persisted throughout human evolutionary history, when their effects can be quite negative on an individual's chances of survival or reproductive success?

In a new study appearing in Molecular Biology and Evolution, Mount Sinai researcher Joel Dudley and colleagues suggest that the very changes specific to human evolution may have come at a cost, contributing to the genetic architecture underlying schizophrenia traits in modern humans.

“We were intrigued by the fact that unlike many other mental traits, schizophrenia traits have not been observed in species other than humans, and schizophrenia has interesting and complex relationships with human intelligence,” said Dr. Joel Dudley, who led the study along with Dr. Panos Roussos. “The rapid increase in genomic data sequenced from large schizophrenia patient cohorts enabled us to investigate the molecular evolutionary history of schizophrenia in sophisticated new ways.”

The team examined the link between these regions and human-specific evolution in genomic segments called human accelerated regions, or HARs. HARs are short signposts in the genome that are conserved among non-human species but have experienced faster mutation rates in humans. These regions, which are thought to regulate the level of gene expression rather than alter the genes themselves, may therefore be an underexplored area of mental illness research.

The team's research is the first study to sift through the human genome and identify a shared pattern between the locations of HARs and recently identified schizophrenia gene loci. To perform their work, they utilized the recently completed study by the Psychiatric Genomics Consortium (PGC), the largest of its kind, which included 36,989 schizophrenia cases and 113,075 controls. It is the largest genome-wide association study ever performed on any psychiatric disease.

They found that the schizophrenia loci were most strongly associated with genomic regions near the HARs that are conserved in non-human primates, and that these HAR-associated schizophrenia loci are under stronger evolutionary selective pressure than other schizophrenia loci. Furthermore, these regions controlled genes expressed only in the prefrontal cortex of the brain, indicating that HARs may play an important role in regulating genes linked to schizophrenia. Specifically, they found the greatest correlation between HAR-associated schizophrenia loci and genes controlling the expression of the neurotransmitter GABA, brain development, synaptic formation, and adhesion and signaling molecules.

Their new evolutionary approach provides new insights into schizophrenia, and genomic targets to prioritize future studies and drug development targets. In addition, there are important new avenues to explore the roles of HARs in other mental diseases such as autism or bipolar disorder.

On Surveys (Medium)

Erika Hall

Feb 23, 2015

Surveys are the most dangerous research tool — misunderstood and misused. They frequently straddle the qualitative and quantitative, and at their worst represent the worst of both.

In tort law the attractive nuisance doctrine refers to a hazardous object likely to attract those who are unable to appreciate the risk posed by the object. In the world of design research, surveys can be just such a nuisance.

Easy Feels True

It is too easy to run a survey. That is why surveys are so dangerous. They are so easy to create and so easy to distribute, and the results are so easy to tally. And our poor human brains are such that information that is easier for us to process and comprehend feels more true. This is our cognitive bias. This ease makes survey results feel true and valid, no matter how false and misleading. And that ease is hard to argue with.

A lot of important decisions are made based on surveys. When faced with a choice, or a group of disparate opinions, running a survey can feel like the most efficient way to find a direction or to settle arguments (and to shirk responsibility for the outcome). Which feature should we build next? We can’t decide ourselves, so let’s run a survey. What should we call our product? We can’t decide ourselves, so let’s run a survey.

Easy Feels Right

The problem posed by this ease is that other ways of finding an answer that seem more difficult get shut out. Talking to real people and analyzing the results? That sounds time consuming and messy and hard. Coming up with a set of questions and blasting it out to thousands of people gets you quantifiable responses with no human contact. Easy!

In my opinion it’s much much harder to write a good survey than to conduct good qualitative user research. Given a decently representative research participant, you could sit down, shut up, turn on the recorder, and get good data just by letting them talk. (The screening process that gets you that participant is a topic for another day.) But if you write bad survey questions, you get bad data at scale with no chance of recovery. This is why I completely sidestepped surveys in writing Just Enough Research.

What makes a survey bad? If the data you get back isn't actually useful input to the decision you need to make, or if it doesn't reflect reality, that is a bad survey. This could happen if respondents didn't give true answers, or if the questions are impossible to answer truthfully, or if the questions don't map to the information you need, or if you ask leading or confusing questions.

Often asking a question directly is the worst way to get a true and useful answer to that question. Because humans.

Bad Surveys Don’t Smell

A bad survey won’t tell you it’s bad. It’s actually really hard to find out that a bad survey is bad — or to tell whether you have written a good or bad set of questions. Bad code will have bugs. A bad interface design will fail a usability test. It’s possible to tell whether you are having a bad user interview right away. Feedback from a bad survey can only come in the form of a second source of information contradicting your analysis of the survey results.

Most seductively, surveys yield responses that are easy to count and counting things feels so certain and objective and truthful.

Even if you are counting lies.

And once a statistic gets out — such as “75% of users surveyed said that they love videos that autoplay on page load” — that simple “fact” will burrow into the brains of decision-makers and set up shop.

From time to time, people write to me with their questions about research. Usually these questions are more about politics than methodologies. A while back this showed up in my inbox:

“Direct interaction with users is prohibited by my organization, but I have been allowed to conduct a simple survey by email to identify usability issues.”

Tears, tears of sympathy and frustration streamed down my face. This is so emblematic, so typical, so counterproductive. The rest of the question was of course, “What do I do?”

User research and usability are about observed human behavior. The way to identify usability issues is to usability test. I mean, if you need to maintain a sterile barrier between your staff and your customers, at least use usertesting.com. The allowable solution is like using surveys as a way to pass notes through a wall, between the designers and the actual users. This doesn’t increase empathy.

Too many organizations treat direct user research like a breach of protocol. I understand that there are very sensitive situations, often involving health data or financial data. But you can do user research and never interact with actual customers. If you actually care about getting real data rather than covering some corporate ass, you can recruit people who are a behavioral match for the target and never reveal your identity.

A survey is a survey. A survey shouldn’t be a fallback for when you can’t do the right type of research.

Sometimes we treat data gathering like a child in a fairy tale who has been sent out to gather mushrooms for dinner. It’s getting late and the mushrooms are far away on the other side of the river. And you don’t want to get your feet wet. But look, there are all these rocks right here. The rocks look kind of like mushrooms. So maybe no one will notice. And then you’re all sitting around the table pretending you’re eating mushroom soup and crunching on rocks.

A lot of people in a lot of conference rooms are pretending that the easiest way to gather data is the most useful. And choking down the results.

Customer Satisfaction Is A Lie

A popular topic for surveys is “satisfaction.” Customer satisfaction has become the most widely used metric in companies’ efforts to measure and manage customer loyalty.

A customer satisfaction score is an abstraction, and an inaccurate one. According to the MIT Sloan Management Review, changes in customers’ satisfaction levels explain less than 1% of the variation in changes in their share of spending in a given category. Now, 1% is statistically significant, but not huge.

And Bloomberg Businessweek wrote that “Customer-service scores have no relevance to stock market returns…the most-hated companies perform better than their beloved peers.” So much of the evidence indicates this is just not a meaningful business metric, merely a very satisfying one to measure.

And now, a new company has made a business out of helping businesses with websites quantify a fuzzy, possibly meaningless metric.

“My boss is a convert to Foresee. She was apparently very skeptical of it at first, but she’s a very analytical person and was converted by its promise of being able to quantify unquantifiable data — like ‘satisfaction’.”

This is another cry for help I received not too long ago.

The boss in question is “a very analytical person.” This means that she is a person with a bias towards quantitative data. The designer who wrote to me was concerned about the potential of pop-up surveys to wreck the very customer experience they were trying to measure.

There's a whole industry based on customer satisfaction. And when there is an industry that makes money from the existence of a metric, that makes me skeptical of the metric. Because as a customer, I find this a fairly unsatisfying use of space.

Here is a Foresee customer satisfaction survey (NOT for my correspondent’s employer). These are the questions that sounded good to ask, and that seem to map to best practices.

But this is complete hogwash.

Rate the options available for navigating? What does that mean? What actual business success metric does that map to? Rate the number of clicks, on a ten-point scale? I couldn't do that. I suspect many people choose the number of clicks they remember rather than a rating.

And accuracy of information? How is a site user not currently operating in god mode supposed to rate how accurate the information is? What does a “7” for information accuracy even mean? None of this speaks to what the website is actually for or how actual humans think or make decisions.

And, most importantly, the sleight of hand here is that these customer satisfaction questions are qualitative questions presented in a quantitative style. This is some customer research alchemy right here. So, you are counting on the uncountable while the folks selling these surveys are counting their money. Enjoy your phlogiston.

I am not advising anyone to run a jerk company with terrible service. I want everyone making products to make great products, and to know which things to measure in order to do that.

I want everyone to see customer loyalty for what it is — habit. And to be more successful creating loyalty, you need to measure the things that build habit.

Approach with Caution

When you are choosing research methods, and are considering surveys, there is one key question you need to answer for yourself:

Will the people I’m surveying be willing and able to provide a truthful answer to my question?

And as I say again and again, and will never tire of repeating, never ask people what they like or don't like. Liking is a reported mental state, and it doesn't necessarily correspond to any behavior.

Avoid asking people to remember anything further back than a few days. I mean, we’ve all been listening to Serial, right? People are lazy forgetful creatures of habit. If you ask about something that happened too far back in time, you are going to get a low quality answer.

And especially, never ask people to make a prediction of future behavior. They will make that prediction based on wishful thinking or social desirability. And this is the most popular survey question of all, I think:

How likely are you to purchase the thing I am selling in the next 6 months?

No one can answer that. At best you could get 1) Possibly, 2) Not at all.

So, yeah, surveys are great because you can quantify the results.

But you have to ask, what are you quantifying? Is it an actual quantity of something, e.g. how many, how often — or is it a stealth quality like appeal, ease, or appropriateness, trying to pass itself off as something measurable?

In order to make any sort of decisions, and to gather information to inform decisions, the first thing you have to do is define success. You cannot derive that definition from a bunch of numbers.

To write a good survey, you need to be very clear on what you want to know and why a survey is the right way to get that information. And then you have to write very clear questions.

If you are using a survey to ask for qualitative information, be clear about that and know that you'll be getting thin information with no context. You won't be able to probe into the all-important “why” behind a response.

If you are treating a survey like a quantitative input, you can only ask questions that the respondents can be relied on to count. You must be honest about the type of data you are able to collect, or don’t bother.

And stay away from those weird 10-point scales. They do not reflect reality.

How to put together a good survey is a topic worthy of a book, or a graduate degree. Right here, I just want to get you to swear you aren’t going to be casual about them if you are going to be basing important decisions on them.

“At its core, all business is about making bets on human behavior.”

— Ben Wiseman, Wall Street Journal

The whole reason to bother going to the trouble of gathering information to inform decisions is that ultimately you want those decisions to lead to some sort of measurable success.

Making bets based on insights from observed human behavior can be far more effective than basing bets on bad surveys. So go forth, be better, and be careful about your data gathering. The most measurable data might not be the most valuable.

Football in Europe suffers a wave of violence (Folha de S.Paulo)

Fan brawl in Rome

Vincenzo Tersigni/Efe

RAFAEL REIS
FROM SÃO PAULO

26/02/2015 02h00

For anyone who thinks fan violence is exclusively Brazilian and that Europe was free of this scourge, the last ten days have been quite revealing.

The continent, marked by tragic episodes especially in the 1980s, managed to curb violence in its stadiums with strict laws and rigorous enforcement of them.

But recent events have set the alarm bells ringing again.

The most emblematic episode is that of Greece. The country's government suspended the local championship indefinitely because of violence by Panathinaikos fans at Sunday's (22) match against Olympiakos.

Meanwhile, in France, supporters of England's Chelsea prevented a black man from boarding the Paris metro amid racist chants. In England, West Ham fans directed anti-Semitic chants at Tottenham, a club with Jewish roots.

Italy did not escape either. Rome saw Feyenoord (NED) fans vandalize one of its most important squares before a Europa League match.

“There are many things happening at the same time in Europe, and football is a reflection of all of it,” Piara Powar, executive director of the NGO network Fare (Football Against Racism in Europe), told Folha.

“Greece has the economic crisis. We also see a rise in intolerance with the tension between Russia and Ukraine and the growth of xenophobia against immigrants.”

Fare works in partnership with Fifa on initiatives to prevent all forms of prejudice in football.

The ultras, responsible for much of the unrest on the Old Continent, are usually linked to far-right ideologies such as neo-Nazism. In other words, in Europe, prejudice and violence in football are parts of a single problem.

“Matches have less violence than in the past. But this outdated behavior by some Europeans persists and is a threat to the family atmosphere in stadiums,” said Powar.

Editoria de arte/Folhapress

‘HEAVY FEAR’

“We go into the derby already knowing the atmosphere will be one of war. But this time it was too much. When we stepped onto the pitch for the warm-up, we were greeted with a shower of lighters and flares. They invaded the pitch, and then the fighting began. We had to run for it. It was truly frightening.”

The account comes from Brazilian right-back Leandro Salino, 29, who was on the pitch for his Olympiakos's 2-1 loss to Panathinaikos.

The episode led to the third suspension of the local championship over episodes of violence this season alone – the other stoppages were due to the death of a fan and an attack on one of the heads of refereeing.

Even the league meeting called to discuss measures to contain the violence ended in a fight. A Panathinaikos official accused the Olympiakos president's bodyguard of punching him.

DESIRE FOR REVENGE

In the Netherlands, police in Rotterdam, home of Feyenoord, fear that Roma supporters will use the return leg of the Europa League knockout tie, this Thursday (26), to take revenge on the Dutch fans.

In recent days, ultras of the Italian club have been talking on social media about paying the rival fans back.

Noemi Jaffe: The semantics of the drought (Folha de S.Paulo)

26/02/2015  02h00

Emmanuel Levinas said that “consciousness is the urgency of a destination directed toward the other, and not an eternal return to oneself.” I think that, although it may not seem so, the sentence is intimately related to the “water crisis” in São Paulo.

We have been forced to hear and to speak of a “water crisis,” of the “worst drought in 84 years” and similar expressions, which blame nature, and not of catastrophe, collapse, responsibility or words of equal gravity.

Under the São Paulo state administration, the ordinary citizen lives under a euphemistic regime of language, elegant in appearance but in truth rhetorically totalitarian, with which we are obliged to live and which, moreover, we are forced to imitate.

“Water crisis,” “contingency plan,” “emergency works,” “dead volume,” “reservoirs,” as these terms have been used, are nothing more than cowardly evasions of language and politics to avoid confronting the real.

There is no water; there was great incompetence; there will be great difficulties. What is needed is an emergency plan of guidance and the creation of networks of containment and solidarity. It is necessary to build and distribute cisterns and water tanks to the poor, teach conservation measures, mobilize the subprefectures for localized actions and, above all, publicly and clearly lay out restrictive measures for large industry and agriculture, which can be far more wasteful than the ordinary citizen.

But none of this is said or done. And why? My impression is that most politicians do not work under the regime of responsibility – the condition of “destination toward the other” – but rather in the form of the “eternal return to oneself.”

São Paulo is living a situation of absurdity in which, on top of the enormous daily difficulties – transportation, health, security, education, floods and now, having water – we must also listen to the president of Sabesp say that Saint Peter “has been missing his aim.”

My impulse is to turn to the vocative: “Hey, President Dilma, federal deputies, Governor Alckmin, Mayor Haddad, city councilors! Listen! We elected you to fight for us, not for your mandates! We are the one, the other, to whom you owe responsibility!”

Or is it unrelated to the “water crisis” that a federal deputy receives around R$ 100,000.00 per month in “office allowances”? Why do deputies have the right to a benefit that guarantees them, among other things, health insurance and a car, when those who earn far, far less do not?

I challenge the deputies, one by one, to publicly give up their health insurance and to take public transportation to work – to enter the real.

How long will the population, above all the poorest, who have few means of easing what they already suffer, be patronized and oppressed under the euphemistic mantle of the “worst drought in 84 years”?

We want the real, a responsible language, one that makes explicit the gaze toward the other and gives support and freedom so that difficulties can be overcome with autonomy.

Euphemism lets politicians off the hook and alienates the population from the solid slab of the real. It represents a state akin to ineffective bureaucracy. How can anyone be responsible if, for every action, there are infinite mediations?

The result is that the mediations end up feeding themselves far more than the final and original purpose of governing: to be for the other – in this case, us, powerless before what they impose on us and what, for months now, they have forced us to witness.

NOEMI JAFFE, 52, holds a doctorate in Brazilian literature from USP and is the author of “O que os Cegos Estão Sonhando?” (editora 34)

After all, how much water is in the Cantareira? (Conta d'Água)

Leaving the dead volume: 10%, 8% or -19%?

25 Feb 2015

by José Bueno and Luiz de Campos Jr for Rios e Ruas

Today the news broadcasts reported yet another rise in the level of the Cantareira System. Finally — and only at the end of February — we have left Technical Reserve 2 (Dead Volume 2) and reached the “zero mark” of Technical Reserve 1. But what does that mean in practice? Can we already breathe a little easier about the security of our water supply? Unfortunately and definitively, no!

As the chart shows, we are far below the levels recorded on this same date in each of the last four years. Another 182 billion liters are still needed just to fill Technical Reserve 1 and reach the “zero mark” of the so-called Useful Volume. February was quite rainy in the Cantareira (no one can blame Saint Peter this month), with almost 70 mm above the historical average so far. But from now on the monthly averages tend to fall continuously — and drastically — until the end of winter. Praying will certainly not be enough.

SABESP announced in today's bulletin that we have reached 10.7% of the system's total volume. That figure does not help — in fact it confuses — the understanding of what is actually happening in the reservoirs. We believe the confusion stems from two characteristics of the index:

1. The calculation is based on the sum of the Useful Volume (before May 2014) + Technical Reserve 1 (after May 2014) + Technical Reserve 2 (after October 2014);
2. Even using that total, the calculation is simply wrong.

In the first case, by choosing to use the total available volume of water to calculate the percentage, SABESP seems to want us to forget that we are in fact drawing on the Technical Reserves, which may only be used under extraordinary circumstances. It would be better if the index separated out and made explicit what is Technical Reserve and what is Useful Volume. For example, as we did in the chart, by showing the percentages as negative for as long as we remain below the Technical Reserve marks.

The second case is even more serious, and more basic: a mathematical error. Since SABESP is including the total sum (Useful Volume + Technical Reserves) in the index, it would have to use that combined volume to calculate the percentages, but it is using only the Useful Volume as the denominator [!]. The result: an index nominally higher than the correct one. In today's bulletin, for example, the correct figure is 8.3%, not 10.7% [!!]. If the procedure is kept and we someday reach the system's maximum level — let's hope! — the index representing a “full” system will be an absurd 129.2% [!].
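The two readings of the index can be checked with a few lines of arithmetic. The sketch below is a minimal illustration, not SABESP's actual calculation; the useful-volume figure is inferred from the article's own numbers (a “full” system reads 129.2%, so the 287.5 billion liters of reserves must equal 29.2% of the useful volume).

```python
# Minimal sketch of the two ways of computing the Cantareira index.
# All figures in billions of liters; USEFUL is inferred, not an official number.

RESERVES = 182.5 + 105.0    # Technical Reserves 1 + 2
USEFUL = RESERVES / 0.292   # ~984.6, since the reserves equal 29.2% of the useful volume

def sabesp_index(stored):
    """Published index: stored water (reserves included) over the useful volume only."""
    return 100 * stored / USEFUL

def corrected_index(stored):
    """Corrected index: same numerator, but the denominator includes the reserves."""
    return 100 * stored / (USEFUL + RESERVES)

stored = 0.107 * USEFUL                           # the volume behind today's "10.7%"
print(f"{sabesp_index(stored):.1f}%")             # 10.7 -- the published figure
print(f"{corrected_index(stored):.1f}%")          # 8.3  -- the corrected figure
print(f"{sabesp_index(USEFUL + RESERVES):.1f}%")  # 129.2 -- a "full" system
```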

Given this unusual situation, and with the aim of contributing to a better understanding of reality and to greater data transparency, Rios e Ruas has decided to update and publish daily a chart comparing the Cantareira System's volume on the same date of each year from 2011 to 2015, using both ways of representing it.


Data source: SABESP
To learn more about the calculation error:
Guia Rápido

The fight for water in SP (Conta d'Água)

25 Feb 2015

Who's who among the different movements and collectives organizing in the face of the government's and Sabesp's ineffectiveness in the water crisis.

By Ivan Longo of Revista Fórum

Water rationing in the state of São Paulo is already an established fact and news to no one. Whatever the region, it is not hard to find homes or businesses that go one or more days without water every week. Those that don't only hold out thanks to tanker trucks. Even though the situation is plain to everyone, Governor Geraldo Alckmin and Sabesp keep denying that there is any rotation scheme, withholding information and postponing measures to actually contain a crisis for which they themselves are responsible.

Faced with the government's inertia, the population has been organizing to find ways to stave off the worst, or to pressure officials into changing the logic of how water is managed in the state. From the end of last year into the beginning of this one, a series of demonstrations, activities and public lectures related to the water crisis have been taking place independently of any government action.

This Thursday (26), for example, the Movimento dos Trabalhadores Sem-Teto (MTST) has called a large demonstration — the March for Water — to demand transparency from the government in managing the crisis and the universal right to water.

Other collectives, organizations and movements focused on the water crisis have been springing up, some of them active for quite a while already. With a common goal — guaranteeing access to water for all — each of these groups proposes different methods, paths and solutions.

Here is who's who in this new configuration of struggles born on the dry soil of the state of São Paulo.

Coletivo de Luta pela Água

The Coletivo de Luta pela Água published its manifesto in January of this year, as the supply crisis in the state of São Paulo deepened. It is a collective of social movements, unions, municipal officials and NGOs that seeks to organize civil society in the fight for the right to water. As a solution to the crisis, the group proposes that the government immediately present an Emergency Plan that clearly spells out the next steps, built on broad dialogue with society and with representatives of the municipalities.

Aliança pela Água

The Aliança pela Água brings together a range of organizations from different fields, but mainly those tied to environmental issues. The idea is to build, together with society — given the state government's inertia toward the supply crisis — solutions for water security through a variety of initiatives.

To that end, the collective has carried out a series of mapping exercises, public lectures, demonstrations and consultations with specialists to chart a course, which has already led to the publication of a Minimum Agenda with 10 urgent actions and 10 medium- and long-term actions. The proposals include creating a crisis-management committee, open disclosure of information to the population, differentiated treatment of large consumers (industry and agribusiness) by the regulatory agencies, incentives for new technologies, implementation of water-reuse policies, recovery and protection of the watersheds, and the drafting of a new model for water management, among others.

Assembleia Estadual da Água

The Assembleia Estadual da Água grew out of organizations such as the Juntos! collective, linked to PSOL, which has been mobilizing against the supply crisis since last year. At the end of the year, the group made contact with the Itu Vai Parar movement, which was fighting the calamity in Itu, one of the first cities to feel the effects of the crisis most intensely. Out of that dialogue, several other organizations decided to come together and, in December, officially hold the Assembleia Estadual da Água in Itu, with the participation of more than 70 collectives, organizations and movements. The Assembleia has been running a series of activities to mobilize the population around the issue, including in partnership with other movements, such as the Aliança pela Água.

MTST

The Movimento dos Trabalhadores Sem-Teto (MTST) has also decided to take up the water cause. The movement, which counts thousands of militants and the support of dozens of organizations, will hold the March for Water demonstration on the 26th. They demand transparency from the state government about the situation, the urgent drafting of an emergency plan and an end to the surcharge on consumption.

Lute pela Água

The Lute pela Água collective holds neighborhood meetings to organize the population in the fight for the right to water and has staged three protests against the supply crisis since last year. Formed by members of the Território Livre collective and the Frente Independente Popular (FIP), the movement advocates the nationalization of Sabesp and popular management of the company.

Conta D'água

Conta D'água is a communications collective that brings together several independent media outlets, as well as movements and organizations, to counter the narrative of the traditional media, which insists on shielding the state government and Sabesp from blame for the supply crisis. With articles, reports, bulletins, interviews and events, Conta D'água has, since last year, taken part in the main mobilizations around the issue and covered it from the perspective, and with the demands, of the population.


Mobilization calendar

26/2 (Thursday) — March for Water in São Paulo
Location: Largo da Batata, Pinheiros
Time: 5 p.m.

20/03 (Friday) — Day of Struggle for Water
Organized by: Coletivo de Luta pela Água
Location: Vão livre do MASP
Time: 2:30 p.m.

27/03 (Friday) — 4th “Without Water São Paulo Will Stop” Demonstration
Organized by: Lute pela Água
Location: Largo da Batata, Pinheiros
Time: 6 p.m.

Let's defend the water (Conta d'Água)

24 Feb 2015

Have you been taking quick, bare-minimum showers to save water? Not flushing the toilet when it's only urine? Using the washing machine's water to clean the yard? Is your house full of tanks and buckets for storing rainwater?

Hi! We're talking to you because we're in the same situation.

Governor Geraldo Alckmin and Sabesp — who live in the kingdom of make-believe — say there is no rationing, that there is no water shortage in the city.

But – in real life – either the water goes out every day, or it goes out for many days in a row, as has already been happening in the east side of the capital.

Now the governor and Sabesp say the watersheds are recovering with the summer rains.

They want to reassure us because they are afraid of the people in the streets.

The truth is that the reservoirs, dams and rivers that supply the São Paulo metropolitan region are at the lowest levels in their history.

The rains that have been pouring down on the city are like pocket change deposited into an account already blown past its overdraft limit. Yes, because drawing on the dead volume of the Cantareira system (as is still happening) is like going into overdraft: easy to get in, hard to get out.

When the dry season begins in April, that is when things will get ugly:

A climatic drought with no water reserves means more disease; factories, shops and schools closing; unemployment.

In a word: suffering.

Worst of all, while we scrimp madly and endure interruptions in the water supply, Sabesp has rewarded 500 privileged companies with the right to receive millions of liters of drinking water every single day — and they pay a sweetheart rate, far lower than what ordinary citizens pay.

Is that fair?

The responsibility for so much mismanagement lies with the state government, which failed to make the investments needed to reduce leaks in the pipes of the supply network; which privatized part of Sabesp and handed out fat slices of the profits to shareholders on the New York stock exchange; which preferred to blame Saint Peter rather than take action; which showers the company's friends with favors.

And they still want to raise the water tariff in April!

Because we no longer want to be deceived; because the population demands the drafting of an emergency plan to deal with the drought; because we will not pay one cent more for water that Sabesp does not deliver; because we do not accept privileged access to water, we are holding a large public demonstration this Thursday (February 26).

The initiative, by the Movimento dos Trabalhadores Sem Teto (MTST), has already won the support of several social movements and environmentalists. The gathering will be at 5 p.m. at Largo da Batata, in Pinheiros. From there we will march to the Palácio dos Bandeirantes, the mansion where Governor Geraldo Alckmin lives.

We will tell him loud and clear that we will not pay the price for a crisis we did not create;

That we demand good, clean, crystal-clear water for everyone (not just the richest and most privileged);

Enough irresponsibility with the lives of the population!

After heavy rain, Cantareira system rises again, reaching 11.1% (Folha de S.Paulo)

FROM SÃO PAULO

26/02/2015 09h44

The heavy rain that hit Greater São Paulo on Wednesday afternoon (25) raised the Cantareira's level by 0.3 percentage point compared with the previous day. The system now operates at 11.1%, a level still considered critical.

Although it rose more than on previous days, the increase was not as large as it might have been because, according to meteorologists, the heavy rain that hit São Paulo did not pass over the system's catchment area, where the rainfall was more moderate.

Sabesp representatives nonetheless reiterated on Wednesday, in a session at the São Paulo City Council, that no water rotation is planned for Greater São Paulo, even with rainfall forecast below average for March.

February, according to data released by Sabesp, will close with rainfall well above the historical average. So far, the Cantareira has recorded 293 mm of rain, against a monthly average of 199.1 mm.

The Cantareira supplies 6.2 million people in the northern zone and parts of the eastern, western, central and southern zones of the capital – there were about 9 million before the crisis. The difference is now served by other systems.

Since July 2014, amid the severe water crisis, the state government has drawn on two reserves at the bottom of the reservoirs, known as the dead volume. In the Cantareira, this volume, spread across three different reservoirs, is the portion that lies below the intake pipes; to be used, it must be pumped.

The second tranche of the dead volume, 105 billion liters, began to be used in November. This Tuesday, when the system reached 10.7% of capacity, the equivalent of that tranche was recovered. The first tranche, of 182.5 billion liters, may take one or two years to recover; that will happen when the system's level reaches 29.2%.

Drawing on the dead volume, specialists say, can be compared to using an overdraft. Environmentalists also point to risks, such as the exhaustion of one of the system's technical reserves.
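The tranche sizes and index thresholds quoted above fit together arithmetically: each percentage point of the index corresponds to roughly 1% of the Cantareira's useful volume. A quick consistency check (an illustrative sketch; the per-point volume is derived from the article's own figures, not an official number):

```python
# Consistency check of the figures above: recovering the second tranche
# (105 bn liters) brought the index to 10.7%, and recovering the first
# (182.5 bn liters more) should bring it to 29.2%.

liters_per_point = 182.5 / (29.2 - 10.7)    # ~9.86 bn liters per index point
print(f"{liters_per_point:.2f} bn liters per point")        # ~9.86
print(f"second tranche: {105 / liters_per_point:.1f} pts")  # ~10.6, matching the 10.7% mark
print(f"implied useful volume: {100 * liters_per_point:.0f} bn liters")  # ~986
```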

Rubens Fernando Alencar e Pilker/Folhapress

OTHER RESERVOIRS

The Alto Tietê reservoir system, which is also suffering the consequences of the drought, operates at 18.3% of capacity, the same level recorded four days ago.

The system supplies 4.5 million people in the eastern part of the capital and in Greater São Paulo. On December 14, the Alto Tietê began drawing on its own dead volume, which added 39.5 million cubic meters of water from the Ponte Nova reservoir, in Salesópolis (97 km from São Paulo).

The level of the Guarapiranga reservoir, which supplies water to 5.2 million people in the southern and southeastern zones of the capital, advanced 1.1 percentage point and operates at 59.8% of capacity.

The Rio Grande reservoir, which serves 1.5 million people, fell 0.1 percentage point and now operates at 83.3%. The Rio Claro reservoir, which also serves 1.5 million people, advanced 0.2 point and operates at 35.7%.

The Alto Cotia system also improved, going from 36.4% to 37.7%. The reservoir supplies water to 400,000 people.

Sabesp's measurement is taken daily and covers a 24-hour period, from 7 a.m. to 7 a.m.

Drought in the Cantareira system

Nacho Doce – 4.dez.14/Reuters

Sabesp admits rotation could contaminate the water (Estadão)

Pedro Venceslau and Fabio Leite – O Estado de S. Paulo

26 February 2015 | 03h00

Director told CPI inquiry that the problem would not put users at risk; the company also admitted that network pressure is below the standard

SÃO PAULO – The risk of water contamination under an official rotation scheme, admitted this Wednesday, the 25th, by Paulo Massato, metropolitan director of the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp), is already a reality in some high-lying areas of Greater São Paulo. These are places where the network loses pressure after the street valves are closed by hand, as a senior company official admitted to the Estado at the beginning of the month.

“If a rotation is implemented, the network becomes depressurized, especially in regions of rugged topography, at points where the piping runs downhill. If the water table is contaminated, that increases the risk of contamination (of the water in the network),” Massato said on Wednesday during a session of the São Paulo City Council's CPI inquiry into Sabesp.

The result of such contamination, he said, would not put consumers' lives at risk, but it could cause dysentery, for example. “We have enough medicine today to minimize the risk to people's lives. Dysentery can be more serious or less serious, but (implementing the rotation) is a risk we want to avoid,” he added. Despite the warning, he said the utility could quickly “decontaminate” the affected water.

Hélvio Romero/Estadão

‘We are in an abnormal situation. We could not supply 6 million inhabitants if we maintained normality,’ said Massato

At the beginning of the month, a Sabesp official admitted to the Estado that in the 40% of the network where no pressure-reducing valves (PRVs) are installed, water rationing is carried out by closing valves manually, a practice the paper witnessed in Vila Brasilândia, in the capital's northern zone. According to him, the maneuver “does not completely empty” the network, but it “depressurizes the higher points.”

“The low-lying areas keep water. If there is no excessive consumption, most of the network keeps water. It does end up depressurizing the high-lying areas, that really happens. So much so that when (the valve) is opened to refill the network, the higher and more distant areas end up suffering more, going longer without water,” he said.

For engineer Antonio Giansante, professor of Water Engineering at Mackenzie, the risk of contamination when the network is shut off is high. “In the event the pipe is dry, water of uncontrolled quality – generally contaminated by the sewage collection networks – may enter Sabesp's network.”

According to aides to Governor Geraldo Alckmin (PSDB), the statement displeased the governor, since a rotation has not been ruled out. Massato had already embarrassed the government by saying, on January 27, that São Paulo could go up to five days a week without water under rationing.

Below the standard. Massato and Sabesp's president, Jerson Kelman, who also testified to the CPI, admitted to the councilors that the company keeps water pressure in the network below the level recommended by the Associação Brasileira de Normas Técnicas (ABNT), as the Estado revealed at the beginning of the month. According to the standard, at least 10 meters of water column are needed to fill all rooftop tanks.

“We are guaranteeing 1 meter of water column, preserving the distribution network. But there is not enough pressure to reach the rooftop tank,” Massato admitted. “We are below the 10 meters of water column, especially in the highest areas and those farthest from the reservoirs.”

“This is a mitigating measure to avoid something much worse for the population, which is the rotation,” Kelman said. “There are few points in the network without the pressure ABNT requires for normal conditions. This is not a choice by Sabesp. We are not in normal conditions,” he added.
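The gap between the ABNT figure and the 1 meter Sabesp says it guarantees is easy to picture with basic hydrostatics: a given head of water column at the street main can lift water roughly that many meters above the main. A minimal sketch, assuming the standard relation P = rho * g * h (the interpretation of the heights is illustrative, not a figure from the article):

```python
# Why 1 m of water column cannot fill rooftop tanks.
RHO = 1000.0  # density of water, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def head_to_kpa(head_m):
    """Gauge pressure at the main for a given water-column head, in kPa."""
    return RHO * G * head_m / 1000.0

for head_m in (10.0, 1.0):  # ABNT minimum vs. the guaranteed 1 m
    print(f"{head_m:4.0f} m of column = {head_to_kpa(head_m):5.1f} kPa; "
          f"lifts water ~{head_m:.0f} m above the main")
# A rooftop tank a few meters up is reachable with 10 m of head,
# but far out of reach with only 1 m.
```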

In December, Alckmin had said Sabesp complied “rigorously” with the technical standard. Sabesp was notified by the Agência Reguladora de Saneamento e Energia do Estado de São Paulo (Arsesp) and responded on Tuesday to the questions raised about the network maneuvers. The regulator, however, has not yet issued a ruling.

Air in the pipes. Asked about the state prosecutors' investigation into Sabesp's alleged billing for “piped air,” revealed by the Estado, company officials said the practice affected only 2% of customers. Of the 22,000 complaints registered in February about unduly high bills, 500 blamed air in the pipes. The problem occurs when water flows back into the network and pushes air into household connections, which can distort the water meter's readings. / WITH REPORTING BY RICARDO CHAPOLA

Chamber approves bill that makes the National Policy to Combat Drought law (Agência Câmara Notícias)

JC 5125, February 26, 2015

The bill lists several actions that will fall to the government, such as mapping desertification and environmental degradation processes and creating an integrated drought-alert information system

The floor of the Chamber of Deputies approved on Wednesday (25) Bill 2447/07, from the Senate, which turns the National Policy to Combat Desertification and Mitigate the Effects of Drought into law and creates the National Commission to Combat Desertification (CNCD). Because of the changes, the bill returns to the Senate.

The bill was approved in the form of a substitute text from the Environment and Sustainable Development Committee, drafted by former deputy Penna (PV-SP). The original text was by former senator Inácio Arruda.

Since 1997, Brazil has had a National Desertification Control Policy, approved by the Conselho Nacional do Meio Ambiente (Conama) and created after the ratification of the United Nations International Convention to Combat Desertification of 1996.

According to the substitute text, the national policy has several objectives, notable among them the use of mechanisms for the protection, preservation, conservation and recovery of natural resources; the promotion of research into the desertification process; the socio-environmental education of the social actors involved; and support for socio-environmentally sustainable irrigation systems in areas suited to the activity.

To meet these objectives, the government must follow several guidelines, such as integrated, participatory management by the federated entities and by the communities located in areas susceptible to desertification when the actions are drafted and implemented.

Other aspects to be observed include the incorporation and valuing of traditional knowledge about the management and sustainable use of natural resources, and coordination with other policies (poverty eradication and agrarian reform, for example).

Public actions
The substitute text lists several actions that will fall to the government, such as mapping desertification and environmental degradation processes; an integrated drought-alert information system; training rural extension technicians in good anti-desertification practices; deploying technologies for the efficient use and reuse of water in producing seedlings for reforestation; and establishing systems of parks, botanical gardens and seed banks to conserve species adapted to aridity.

An amendment approved by the floor, from deputy Moses Rodrigues (PPS-CE), also provides for the drilling of artesian wells wherever environmentally viable.

Another amendment by the same deputy concerns encouraging the creation of research centers to develop anti-desertification technologies.

An amendment by deputy Sibá Machado (PT-AC) establishes that deforestation prevention and control plans will serve as instruments of the national policy.

National commission
The national commission, which currently operates under a federal executive decree, will have a deliberative and consultative character and will be part of the regimental structure of the Ministry of the Environment.

The commission is charged with promoting the integration of strategies, monitoring and evaluating anti-desertification actions, proposing strategic actions, and identifying the need for, and proposing the creation or modification of, the instruments required to execute the national policy.

Semi-arid region
In Brazil, the main areas susceptible to desertification are the semi-arid and dry sub-humid regions found in the Brazilian Northeast and in northern Minas Gerais.

This region covers 1,201 municipalities, 16% of the national territory, and takes in 11 states: Alagoas, Bahia, Ceará, Espírito Santo, Maranhão, Minas Gerais, Paraíba, Pernambuco, Piauí, Rio Grande do Norte and Sergipe. It also concentrates 85% of the country's poverty.

During the floor debate, some deputies said the National Policy to Combat Drought could also offer a response to the water shortage scenario in the Southeast.

Full text of the bill: PL-2447/2007

(Eduardo Piovesan / Agência Câmara Notícias)

http://www2.camara.leg.br/camaranoticias/noticias/POLITICA/482281-CAMARA-APROVA-PROJETO-QUE-TORNA-LEI-A-POLITICA-NACIONAL-DE-COMBATE-A-SECA.html

Video shows how Brazil monitors natural disaster risks (MCTI/INPE)

JC 5125, February 26, 2015

Brazil's systems for monitoring natural disasters and preventing their impacts are featured in the educational video released by the INCT-MC

Natural disasters, and Brazil's systems for monitoring them and preventing their impacts, are the subject of an educational video released by the Instituto Nacional de Ciência e Tecnologia para Mudanças Climáticas (INCT-MC).

The material is part of the project to disseminate the knowledge generated by research carried out over the INCT-MC's six years of operation (2008-2014), headquartered at the Instituto Nacional de Pesquisas Espaciais (Inpe/MCTI).

Aimed at educators, high school and undergraduate students, and public policy makers, the video explains the causes behind the rise in the number of natural disasters in recent years and how the country is preparing to prevent and reduce losses across the various sectors of society. Researchers and technologists at the Centro Nacional de Monitoramento e Alertas de Desastres Naturais (Cemaden/MCTI) show how risk areas are monitored 24 hours a day. The video also presents the human dimensions: how disasters interfere with and harm people's lives, and how the emergence of new risk scenarios can and should be avoided.

By June, five more educational videos will be completed, covering topics related to the INCT for Climate Change's research: food security, energy security, water security, health and biodiversity.

Portal

The knowledge produced over six years of research under the INCT for Climate Change is being gathered in a web portal to be launched this semester. The virtual environment will offer content in language suited to its various audiences: researchers, educators, students (grouped by age) and public policy makers. The material will be organized into six major thematic areas: food security, energy security, water security, human health, biodiversity and natural disasters.

Read more.

(MCTI, via Inpe)

http://www.mcti.gov.br/noticias/-/asset_publisher/IqV53KMvD5rY/content/video-mostra-como-o-brasil-monitora-os-riscos-de-desastres-naturais

In 10 years, water shortage will affect 2.9 billion (Estadão)

The countries facing the biggest deficits will be those with the fewest resources and with young, growing populations

An international report released yesterday warns that in 15 years global demand for fresh water will exceed supply by 40%. The countries facing the biggest deficits will be those with the fewest resources and with young, growing populations. The document, from the United Nations University's Institute for Water, Environment and Health (INWEH), based in Canada, predicts that within 10 years 48 countries – with a combined population of 2.9 billion people – will be classified as “water scarce” or “water stressed.”

The full text is available at: http://digital.estadao.com.br/download/pdf/2015/02/25/A15.pdf

(O Estado de S.Paulo)

Abuse allegation topples UN ‘climate chief’ (Estadão)

Rajendra Pachauri resigns and the IPCC is left without permanent leadership at a critical moment in the negotiations for a new global agreement

Accused of sexual harassment, India's Rajendra Pachauri stepped down from the chairmanship of the UN Intergovernmental Panel on Climate Change (IPCC) at a critical moment in the negotiations for an agreement on CO2 emissions. In a statement issued yesterday, the UN accepted his resignation.

The full text is available at: http://digital.estadao.com.br/download/pdf/2015/02/25/A15.pdf

(O Estado de S.Paulo)

Cantareira recovers 2nd dead volume (Estadão)

The Cantareira System reached 10.7% of capacity yesterday

After its 19th consecutive rise, the level of the Cantareira System reached 10.7% of capacity yesterday, the mark that signals the “recovery” of the second tranche of the dead volume, according to the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp) and the Agência Nacional de Águas (ANA).

The full text is available at: http://digital.estadao.com.br/download/pdf/2015/02/25/A15.pdf

(O Estado de S.Paulo)

More on the subject in the Folha de S.Paulo – http://www1.folha.uol.com.br/fsp/cotidiano/209589-cantareira-recupera-2-parte-do-volume-morto.shtml

How to plant water (Folha de S.Paulo)

JC, 5124, February 25, 2015

The government is reacting to the water crisis above all with construction works, but it should also change its mentality toward environmental preservation, says the Folha de S.Paulo editorial

The government of the state of São Paulo is letting slip a good opportunity to change the frame through which it views the question of water supply in the metropolitan region of the capital. Immersed in the crisis and in the short term, it loses sight of the far-reaching measures it ought to take.

The full text is available at: http://www1.folha.uol.com.br/fsp/opiniao/209505-como-plantar-agua.shtml

(Folha de S.Paulo)

Nothing falls from the sky (Folha de S.Paulo)

JC, 5124, February 25, 2015

Carlos Magno

Rain in sight or not, the population needs to understand that the water can – and will – run out if preventive measures are not taken

Could the rationing that much of São Paulo's population – and that of other Brazilian cities and states – may be forced to undergo have been avoided? The question is far more complex than it may seem and should never be turned into a partisan shouting match. After all, everyone living in these areas is already affected and will be even more so.

Heat records are falling around the world. Recent data from NASA and the US National Oceanic and Atmospheric Administration (NOAA) point to 2014 as the hottest year on record. The average temperature over land and ocean rose 0.69 degrees, surpassing previous records. That sounds small, but it is not.

Every 20 or 30 years, on average, the Pacific Ocean, the planet's largest body of water, undergoes temperature swings, running warmer or colder than normal. These long-period oscillations affect winds, rainfall and temperature in many regions of the globe. In Brazil, several states are already feeling the impacts of this climatic shift.

Last summer was one of the driest and hottest in history, not only in the São Paulo capital region and its surroundings but also across much of the Southeast, especially in Minas Gerais and in the Piracicaba valley, the source of most of the water that supplies the São Paulo metropolitan region through the Cantareira system. Parts of that region recorded anomalies of up to 5 degrees in maximum temperatures in January 2014.

With little water and higher consumption because of the heat, the rivers and reservoirs feeding the system fell to the lowest levels ever recorded. In São Paulo, for example, the Cantareira has been suffering below-normal rainfall since 2012. Not even February's rains, which raised reservoir levels, are yet enough to change the drought picture.

And the forecasts are not encouraging. According to a Climatempo study, only in the summer of 2017 can we expect normal or above-average rainfall capable of contributing to a consistent recovery of the system.

Reversing the situation is a challenge. It is a matter far more of education than of meteorology or of pharaonic construction works – which, if they are necessary now, should have been planned at least ten years ago.

Since late 2013, meteorologists have been warning about this critical scenario. We already know the outlook is unfavorable, and there is little chance of change in the short term.

Yet on a planet where 1.4 billion cubic kilometers are taken up by water, human beings still seem to believe it will never run out.

Rain in sight or not, the population needs to understand that the water can – and will – run out if preventive measures are not taken. Awareness about consumption must be permanent.

Opting for reuse can be one of the solutions. Indeed, the idea of charging a surcharge on those who consume more water than normal during this period is among the good measures already taken – as good as the discounts, announced since last year, for those who save water.

In São Paulo, cleaning up the Tietê and Pinheiros rivers is also a path. But that seems a utopian scenario, especially when we recall that the idea has been floated by the government for decades.

What our authorities need to understand is that you cannot go through life counting on divine help. It is time to roll up our sleeves and prepare. There is still much to do and much to invest in. Because nothing falls from the sky – lately, not even the water has been falling.

CARLOS MAGNO, 53, is a meteorologist and the president of Climatempo Meteorologia

Signed articles do not reflect the opinion of the newspaper. They are published with the purpose of stimulating debate on Brazilian and world problems and of reflecting the various currents of contemporary thought. debates@uol.com.br

The Snapchat and The Platypus (Medium)

Scissor-testing A New Branch of the Mobile Evolutionary Tree

Andrew McLaughlin

The British Museum still has the first platypus sent back to Europe from Australia by Captain John Hunter in 1799. There are scissor marks on its duck-bill.

The first platypus specimen studied by European scientists, at the British Museum.

That’s because George Shaw, the first scientist who studied the astonishing specimen, was pretty sure it was a hoax, sewn together by pranksters or profiteers. With its webbed feet, furry pelt, venomous claw, and ducky beak, it was too freakish to be believed; moreover, London society had lately been thrilled, then crestfallen, by a wave of Franken-mermaids and other concocted exotica hawked by foreign sailors. So Shaw’s first move upon examining the platypus was to reach for his scissors, to uncover what kind of clever stitches bound the amalgamation together.

First published illustrations of a platypus, by George Shaw, “The Duck-Billed Platypus,” Naturalist’s Miscellany, Vol. X (1799).

Finding that the platypus was held together by flesh, not thread, Shaw stopped snipping and started measuring, and marveling. He published a dutiful summary of his anatomical observations, together with field notes from Australia, in the impossibly well-named Naturalist's Miscellany. Even with the benefit of several additional, later-arriving specimens, he wrote that it was “impossible not to entertain some doubts as to the genuine nature of the animal, and to surmise that there might have been practised some arts of deception in its structure.”

Which brings me to Snapchat.

When a certain kind of person — OK, an older person, where “old” equals 24 — first encounters Snapchat, the reaction is typically some mixture of mystification, disbelief, and annoyance. For people who have gotten used to the dominant evolved anatomies of mobile apps, Snapchat seems like an odd and improbable creature.

As the 32-year-old Will Oremus put it in a brilliant and entertaining screed: “Is Snapchat Really Confusing, Or Am I Just Old?”

A quick cruise through the app reveals why people born before the dawn of the Clinton Administration react so strongly to it: Snapchat’s UI is really different from what we’re used to. What we’re used to is desktop software and its lineal descendants, with their predictably located upper-margin drop-down menus, scrollable windows and swappable tabs, and logo-bearing application icons. On our mobile devices, designers have forged comfortingly similar UI elements, ever-so-slightly tweaked to work on smaller screens: scrollable feeds, sliding drawers with logically stacked navigation and option menus, all signaled by a homescreen hamburger icon.

Here are some of the ways Snapchat is different:

  • The app opens in camera mode. You don’t start with a social feed like Facebook, Twitter, LinkedIn, or Instagram, an editorial content feed like Digg, Buzzfeed, or the New York Times, a list of friends like Google Hangouts or Line, or a chronology of recent messages like FaceTime, Skype, or Slack. Instead, you start with whatever your phone’s camera is currently aimed at. Snapchat believes that you (should) want to create something — a photo, a short video — for immediate sharing. Snapchat is designed for you to create first, consume later.
  • There is no options menu. You have to navigate around the app without the crutch of a menu adorned with actual words that spell out what you can do and where you can go. But wait, you cry, there is (sometimes) a hamburger icon right there on the homescreen! Only it doesn’t do what you expect. Tapping the hamburger takes you to Snapchat Stories, a sort of expansive, broadcast-like version of the Snapchat snap. It doesn’t open a sliding drawer with a soothing hierarchical options menu. In Snapchat, navigation is done directly, via left/right/up/down thumb slides, supplemented by a handful of redundant touchable icons. People who are used to tapping well-labeled menu options are often baffled by Snapchat; but conversely, it will feel natural to someone whose first software experiences were on a mobile device, rather than a desktop.
  • Snapchat uses icons that change shape and color to signal different things. For example, a solid arrow is a sent snap (image or video); red if without audio, purple if with audio, and blue if a text chat only. The arrow becomes hollow once a friend has opened it. A solid square is a received snap or chat, with the same variations of color and hollowness. There are other icons that alert you when a friend has replayed or taken a screenshot of your snap. It’s not a complicated system, but it is esoteric and native to Snapchat; nothing about it is self-evident to new users.
  • Snapchat doesn’t pester you to keep connecting to more people. Adding friends in Snapchat is bizarrely cumbersome. If you’re used to traditional social apps, your first move will be to tap on “Add Friends” (if you can find it), import your phone’s contacts database, and then squint through the entire list, name by name, to see which ones are on Snapchat and manually add them. It’s a huge pain if you have a lot of contacts. But Snapchat conversely makes it super-easy to add a friend when you are physically together by giving you a personally-encoded, QR-like Ghostface Chillah icon that can be snapped by a friend to add you. Notably, when you first set up Snapchat, you find that you can’t import your social graph from Facebook, Twitter, Google, etc. Snapchat draws solely on your phone’s contacts database. Though to some measure driven by necessity (at some point between the introduction of Pinterest’s “Add All My Facebook Friends” feature and the launch of Snapchat, Facebook started blocking new social services from using its social graph to kickstart theirs), Snapchat’s use of the phone’s contacts database reflects its emphasis on intimate, private, person-to-person communications with people you already know (or just met). It also shows Snapchat’s determination not to be dependent on other companies for core elements of its offerings.

So Snapchat’s user interface really is different, and different in ways that turn off a lot of people habituated to the dominant mobile design vocabulary, descended from desktop applications. And yet, Snapchat’s been getting hugely popular, with somebody.

Like any social or communications application, Snapchat has grown through real-world social pathways: its users tell their friends to get on it. If your friends or colleagues don’t use it, you won’t find much value in it. As a result, social and communications services like Snapchat, WhatsApp, WeChat, KakaoTalk, Viber, Line, Kik, etc., can saturate some discrete user clusters (e.g., U.S. Hispanic teens living in Southern California, Brooklyn-based social media junkies, female Korean professionals, etc.) but be almost unknown in others.

In the U.S., for example, Snapchat’s user cohort is overwhelmingly young — younger than that of any scaled social app we’ve seen before.

From Business Insider, July 30, 2014, http://www.businessinsider.com/a-primer-on-snapchat-and-its-demographics-2014-6

But the fact that Snapchat has become hugely popular with a wide swath of 12-to-24 year-old Americans doesn’t answer Will Oremus’s basic question. At the risk of stretching my metaphor past the breaking point, it doesn’t tell you whether Snapchat is a platypus (an isolated and precarious evolutionary adaptation well-suited to a specific subcontinental ecology), a fake mermaid (an apparent evolutionary advance that falls apart upon close inspection), or something more like a killer whale (a seemingly unlikely but wildly successful branch of the mammalian tree that has become an apex predator prowling every ocean and climate).


A few weeks ago, my betaworks partners and I found ourselves arguing about Snapchat, the merits of its app interface, and the trajectory of its future path. To get some practical data, and to understand Snapchat more thoroughly, we decided to commit to it, hard, for a week. And then to do the same for other fast-rising communications apps.

To reach meaningful scale, we enforced a herd migration among betaworkers. Starting two weeks ago, we announced that all intra-betaworks communications had to happen via Snapchat. If you wanted to reach us, you had to use Snapchat.

The result has been a scissor-test of Snapchat. We still ended up with conflicting opinions about whether Snapchat is poorly or brilliantly designed (or both). But we all agreed that the experience is more intimate, more private, and more creativity-sparking than we had previously understood. (And I learned the hard way how Snapchat punishes procrastination: one morning, my partner Sam sent me a couple of questions about a pending deal; I quickly scanned them while out on the sidewalk across town; when I returned to the office and opened the app to compose a response, Sam’s chats had disappeared and I couldn’t remember what the questions were.)

There’s one part of Snapchat, though, that really does seem to be grafted on like a fake duck-bill. Snapchat Discover is a new section of the app where big media companies like CNN, ESPN, People, Cosmopolitan, and the Daily Mail post slickly-produced packages that have as much in common with the casual, rough-hewn, intimate, person-to-person snap as Air Force One has with a homemade kite. Snapchat Discover is broadcast, not interpersonal; professional, not amateur; branded, not hacked. Snapchat’s ability to drive attention may ultimately make its Discover platform a viable (native, mobile, short-form) alternative to TV. But for now, it feels like an amphibian limb sutured onto a mammalian torso.

Looking at Snapchat Discover from the perspective of Digg, as a potential someday distribution platform, I can see why Buzzfeed declined to participate, at least for now.


My conclusion from the scissor-test is that Snapchat really is a new and promising branch of the mobile evolutionary tree, but burdened with at least one surgically dubious addition.

The Snapchat week was so much fun, we’re moving on. Last week, we all dogpiled onto Line. This week, WeChat. Next up, in some order, will be WhatsApp, Kik, Viber, KakaoTalk, and so on.

More test results to come.

Why Hollywood had to Fudge The General Relativity-Based Wormhole Scenes in Interstellar (The Physics arXiv Blog)

The Physics arXiv Blog

Interstellar is the only Hollywood movie to use the laws of physics to create film footage of the most extreme regions of the Universe. Now the film’s scientific advisor, Kip Thorne, reveals why they fudged the final footage

Wormholes are tunnel-like structures that link regions of spacetime. In effect, they are shortcuts from one part of the universe to another. Theoretical physicists have studied their properties for decades but despite all this work, nobody quite knows if they can exist in our universe or whether matter could pass through them if they did.

That hasn’t stopped science fiction writers making liberal use of wormholes as a convenient form of transport over otherwise unnavigable distances. And where science fiction writers roam, Hollywood isn’t far behind. Wormholes have played starring roles in films such as Star Trek, Stargate and even Bill & Ted’s Excellent Adventure. But none of these films depicts wormholes the way they might actually look in real life.

All that has now changed thanks to the work of film director Christopher Nolan and Kip Thorne, a theoretical physicist at the California Institute of Technology in Pasadena, who collaborated on the science fiction film Interstellar, which was released in 2014.

Nolan wanted the film to be as realistic as possible and so invited Thorne, an expert on black holes and wormholes, to help create the footage. Thorne was intrigued by the possibility of studying wormholes visually, given that they are otherwise entirely theoretical. The result, he thought, could be a useful way of teaching students about general relativity.

So Thorne agreed to collaborate with a special effects team at Double Negative in London to create realistic footage. And today they publish a paper on the arXiv about the collaboration and what they learnt.

Interstellar is an epic tale. It begins with the discovery of a wormhole near Saturn and the decision to send a team of astronauts through it in search of a habitable planet that humans can populate because Earth is dying.

A key visual element of the story is the view through the wormhole of a different galaxy and the opposite view of Saturn. But what would these views look like?

One way to create computer-generated images is to trace all the rays of light in a given scene and then determine which rays enter a camera placed at a given spot. But this is hugely inefficient because most of the rays never enter the camera.

A much more efficient method is to allow time to run backwards and trace the trajectories of light rays leaving the camera and travelling back to their source. In that way, the computing power is focused only on light rays that contribute to the final image.

So Thorne derived the various equations from general relativity that would determine the trajectory of the rays through a wormhole and the team at Double Negative created a computer model that simulated this, which they could run backwards. They also experimented with wormholes of different shapes, for example with long thin throats or much shorter ones and so on.
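
To make the camera-to-source idea concrete, here is a minimal Python sketch of backward ray tracing, under deliberately toy assumptions: the rays below fly through flat space in straight lines, whereas the production renderer would update each ray at every step by integrating the geodesic equations Thorne derived. Every name and parameter here is invented for the illustration; none comes from the paper.

import numpy as np

WIDTH, HEIGHT = 32, 24   # tiny image, purely for illustration
FOV = np.pi / 3          # horizontal field of view

def pixel_ray(i, j):
    # Unit direction of the ray leaving the camera through pixel (i, j).
    x = (2 * (i + 0.5) / WIDTH - 1) * np.tan(FOV / 2)
    y = (1 - 2 * (j + 0.5) / HEIGHT) * np.tan(FOV / 2) * HEIGHT / WIDTH
    d = np.array([x, y, 1.0])
    return d / np.linalg.norm(d)

def trace_backwards(origin, direction, n_steps=2000, h=0.1):
    # March a ray backwards from the camera until it leaves the scene,
    # then report the direction it "came from". In the film's pipeline
    # this loop would instead integrate the wormhole's null-geodesic
    # equations and decide which celestial sphere (galaxy side or
    # Saturn side) the ray should sample.
    pos, vel = origin.copy(), direction.copy()
    for _ in range(n_steps):
        pos += h * vel            # flat-space stand-in: straight lines
        if np.linalg.norm(pos) > 50.0:
            break
    return vel                    # look this direction up in a sky map

camera = np.zeros(3)
sky_dirs = [[trace_backwards(camera, pixel_ray(i, j))
             for i in range(WIDTH)] for j in range(HEIGHT)]

The efficiency argument is visible in the structure: every ray the loop integrates ends at the camera by construction, so none of the work is wasted on light that misses the lens.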

The results provided some fascinating insights into the way a wormhole might appear in the universe. But it also threw up some challenges for the film makers.

One problem was that Nolan chose to use footage of the view through a short wormhole, which produced fascinatingly distorted images of the distant galaxy. However, the footage of travelling through such a wormhole was too short. “The trip was quick and not terribly interesting, visually — not at all what Nolan wanted for his movie,” say Thorne and co.

But the journey through a longer wormhole was like travelling through a tunnel and very similar to things seen in other movies. “None of the clips, for any choice of parameters, had the compelling freshness that Nolan sought,” they admit.

In particular, when travelling through a wormhole, the object at the end becomes larger, scaling up from its centre and growing in size until it fills the frame. That turns out to be hard to process visually. “Because there is no parallax or other relative motion in the frame, to the audience it looks like the camera is zooming into the center of the wormhole,” say Thorne and co.

But camera zoom was utterly unlike the impression the film-makers wanted to portray, which was the sense of travelling through a shortcut from one part of the universe to another. “To foster that understanding, Nolan asked the visual effects team to convey a sense of travel through an exotic environment, one that was thematically linked to the exterior appearance of the wormhole but also incorporated elements of passing landscapes and the sense of a rapidly approaching destination,” they say.

So for the final cut, they asked visual effects artists to add some animation that gave this sense of motion. “The end result was a sequence of shots that told a story comprehensible by a general audience while resembling the wormhole’s interior,” they say.

In other words, they had to fudge it. Nevertheless, the remarkable attention to detail is a testament to the scientific commitment of the director and his team. And Thorne is adamant that the entire process of creating the footage will be an inspiration to students of film-making and of general relativity.

Of course, whether wormholes really do look like any of this is hard to say. The current thinking is that the laws of physics probably forbid the creation of wormholes like the one in Interstellar.

However, there are several ideas that leave open the possibility that wormholes might exist. The first is that wormholes may exist on the quantum scale, in which case a sufficiently advanced technology might be able to enlarge them in some way.

The second is that our universe may be embedded in a larger multidimensional cosmos called a brane. That opens the possibility of travelling into other dimensions and then back into our own.

But the possibility that wormholes could exist in these scenarios reflects our ignorance of the physics involved rather than any important insight. Nevertheless, there’s no harm in a little speculation!

Ref: arxiv.org/abs/1502.03809 : Visualizing Interstellar’s Wormhole

Physics’s pangolin (AEON)

Trying to resolve the stubborn paradoxes of their field, physicists craft ever more mind-boggling visions of reality

by Margaret Wertheim

Illustration by Claire Scully

Margaret Wertheim is an Australian-born science writer and director of the Institute For Figuring in Los Angeles. Her latest book is Physics on the Fringe (2011).

Theoretical physics is beset by a paradox that remains as mysterious today as it was a century ago: at the subatomic level things are simultaneously particles and waves. Like the duck-rabbit illusion first described in 1899 by the Polish-born American psychologist Joseph Jastrow, subatomic reality appears to us as two different categories of being.

But there is another paradox in play. Physics itself is riven by the competing frameworks of quantum theory and general relativity, whose differing descriptions of our world eerily mirror the wave-particle tension. When it comes to the very big and the extremely small, physical reality appears to be not one thing, but two. Where quantum theory describes the subatomic realm as a domain of individual quanta, all jitterbug and jumps, general relativity depicts happenings on the cosmological scale as a stately waltz of smooth flowing space-time. General relativity is like Strauss — deep, dignified and graceful. Quantum theory, like jazz, is disconnected, syncopated, and dazzlingly modern.

Physicists are deeply aware of the schizophrenic nature of their science and long to find a synthesis, or unification. Such is the goal of a so-called ‘theory of everything’. However, to non-physicists, these competing lines of thought, and the paradoxes they entrain, can seem not just bewildering but absurd. In my experience as a science writer, no other scientific discipline elicits such contradictory responses.

In string cosmology, the totality of existing universes exceeds the number of particles in our universe by more than 400 orders of magnitude

This schism was brought home to me starkly some months ago when, in the course of a fortnight, I happened to participate in two public discussion panels, one with a cosmologist at Caltech, Pasadena, the other with a leading literary studies scholar from the University of Southern California. On the panel with the cosmologist, a researcher whose work I admire, the discussion turned to time, about which he had written a recent, and splendid, book. Like philosophers, physicists have struggled with the concept of time for centuries, but now, he told us, they had locked it down mathematically and were on the verge of a final state of understanding. In my Caltech friend’s view, physics is a progression towards an ever more accurate and encompassing Truth. My literary theory panellist was having none of this. A Lewis Carroll scholar, he had joined me for a discussion about mathematics in relation to literature, art and science. For him, maths was a delightful form of play, a ludic formalism to be admired and enjoyed; but any claims physicists might make about truth in their work were, in his view, ‘nonsense’. This mathematically based science, he said, was just ‘another kind of storytelling’.

On the one hand, then, physics is taken to be a march toward an ultimate understanding of reality; on the other, it is seen as no different in status to the understandings handed down to us by myth, religion and, no less, literary studies. Because I spend my time about equally in the realms of the sciences and arts, I encounter a lot of this dualism. Depending on whom I am with, I find myself engaging in two entirely different kinds of conversation. Can we all be talking about the same subject?

Many physicists are Platonists, at least when they talk to outsiders about their field. They believe that the mathematical relationships they discover in the world about us represent some kind of transcendent truth existing independently from, and perhaps a priori to, the physical world. In this way of seeing, the universe came into being according to a mathematical plan, what the British physicist Paul Davies has called ‘a cosmic blueprint’. Discovering this ‘plan’ is a goal for many theoretical physicists and the schism in the foundation of their framework is thus intensely frustrating. It’s as if the cosmic architect has designed a fiendish puzzle in which two apparently incompatible parts must be fitted together. Both are necessary, for both theories make predictions that have been verified to a dozen or so decimal places, and it is on the basis of these theories that we have built such marvels as microchips, lasers, and GPS satellites.

Quite apart from the physical tensions that exist between them, relativity and quantum theory each pose philosophical problems. Are space and time fundamental qualities of the universe, as general relativity suggests, or are they byproducts of something even more basic, something that might arise from a quantum process? Looking at quantum mechanics, huge debates swirl around the simplest situations. Does the universe split into multiple copies of itself every time an electron changes orbit in an atom, or every time a photon of light passes through a slit? Some say yes, others say absolutely not.

Theoretical physicists can’t even agree on what the celebrated waves of quantum theory mean. What is doing the ‘waving’? Are the waves physically real, or are they just mathematical representations of probability distributions? Are the ‘particles’ guided by the ‘waves’? And, if so, how? The dilemma posed by wave-particle duality is the tip of an epistemological iceberg on which many ships have been broken and wrecked.

Undeterred, some theoretical physicists are resorting to increasingly bold measures in their attempts to resolve these dilemmas. Take the ‘many-worlds’ interpretation of quantum theory, which proposes that every time a subatomic action takes place the universe splits into multiple, slightly different, copies of itself, with each new ‘world’ representing one of the possible outcomes.

When this idea was first proposed in 1957 by the American physicist Hugh Everett, it was considered an almost lunatic-fringe position. Even 20 years later, when I was a physics student, many of my professors thought it was a kind of madness to go down this path. Yet in recent years the many-worlds position has become mainstream. The idea of a quasi-infinite, ever-proliferating array of universes has been given further credence as a result of being taken up by string theorists, who argue that every mathematically possible version of the string theory equations corresponds to an actually existing universe, and estimate that there are 10 to the power of 500 different possibilities. To put this in perspective: physicists believe that in our universe there are approximately 10 to the power of 80 subatomic particles. In string cosmology, the totality of existing universes exceeds the number of particles in our universe by more than 400 orders of magnitude.
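
The comparison is straightforward arithmetic: dividing the estimated number of universes by the estimated number of particles gives 10^500 / 10^80 = 10^420, which is where the figure of ‘more than 400 orders of magnitude’ comes from.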

Nothing in our experience compares to this unimaginably vast number. Every universe that can be mathematically imagined within the string parameters — including ones in which you exist with a prehensile tail, to use an example given by the American string theorist Brian Greene — is said to be manifest somewhere in a vast supra-spatial array ‘beyond’ the space-time bubble of our own universe.

What is so epistemologically daring here is that the equations are taken to be the fundamental reality. The fact that the mathematics allows for gazillions of variations is seen to be evidence for gazillions of actual worlds.

Perhaps what we are encountering here is not so much the edge of reality, but the limits of the physicists’ category system

This kind of reification of equations is precisely what strikes some humanities scholars as childishly naive. At the very least, it raises serious questions about the relationship between our mathematical models of reality, and reality itself. While it is true that in the history of physics many important discoveries have emerged from revelations within equations — Paul Dirac’s formulation for antimatter being perhaps the most famous example — one does not need to be a cultural relativist to feel sceptical about the idea that the only way forward now is to accept an infinite cosmic ‘landscape’ of universes that embrace every conceivable version of world history, including those in which the Middle Ages never ended or Hitler won.

In the 30 years since I was a student, physicists’ interpretations of their field have increasingly tended toward literalism, while the humanities have tilted towards postmodernism. Thus a kind of stalemate has ensued. Neither side seems inclined to contemplate more nuanced views. It is hard to see ways out of this tunnel, but in the work of the late British anthropologist Mary Douglas I believe we can find a tool for thinking about some of these questions.

On the surface, Douglas’s great book Purity and Danger (1966) would seem to have nothing to do with physics; it is an inquiry into the nature of dirt and cleanliness in cultures across the globe. Douglas studied taboo rituals that deal with the unclean, but her book ends with a far-reaching thesis about human language and the limits of all language systems. Given that physics is couched in the language-system of mathematics, her argument is worth considering here.

In a nutshell, Douglas notes that all languages parse the world into categories; in English, for instance, we call some things ‘mammals’ and other things ‘lizards’ and have no trouble recognising the two separate groups. Yet there are some things that do not fit neatly into either category: the pangolin, or scaly anteater, for example. Though pangolins are warm-blooded like mammals and birth their young, they have armoured bodies like some kind of bizarre lizard. Such definitional monstrosities are not just a feature of English. Douglas notes that all category systems contain liminal confusions, and she proposes that such ambiguity is the essence of what is seen to be impure or unclean.

Whatever doesn’t parse neatly in a given linguistic system can become a source of anxiety to the culture that speaks this language, calling forth special ritual acts whose function, Douglas argues, is actually to acknowledge the limits of language itself. In the Lele culture of the Congo, for example, this epistemological confrontation takes place around a special cult of the pangolin, whose initiates ritualistically eat the abominable animal, thereby sacralising it and processing its ‘dirt’ for the entire society.

‘Powers are attributed to any structure of ideas,’ Douglas writes. We all tend to think that our categories of understanding are necessarily real. ‘The yearning for rigidity is in us all,’ she continues. ‘It is part of our human condition to long for hard lines and clear concepts’. Yet when we have them, she says, ‘we have to either face the fact that some realities elude them, or else blind ourselves to the inadequacy of the concepts’. It is not just the Lele who cannot parse the pangolin: biologists are still arguing about where it belongs on the genetic tree of life.

As Douglas sees it, cultures themselves can be categorised in terms of how well they deal with linguistic ambiguity. Some cultures accept the limits of their own language, and of language itself, by understanding that there will always be things that cannot be cleanly parsed. Others become obsessed with ever-finer levels of categorisation as they try to rid their system of every pangolin-like ‘duck-rabbit’ anomaly. For such societies, Douglas argues, a kind of neurosis ensues, as the project of categorisation takes ever more energy and mental effort. If we take this analysis seriously, then, in Douglas’ terms, might it be that particle-waves are our pangolins? Perhaps what we are encountering here is not so much the edge of reality, but the limits of the physicists’ category system.

In its modern incarnation, physics is grounded in the language of mathematics. It is a so-called ‘hard’ science, a term meant to imply that physics is unfuzzy — unlike, say, biology whose classification systems have always been disputed. Based in mathematics, the classifications of physicists are supposed to have a rigour that other sciences lack, and a good deal of the near-mystical discourse that surrounds the subject hinges on ideas about where the mathematics ‘comes from’.

According to Galileo Galilei and other instigators of what came to be known as the Scientific Revolution, nature was ‘a book’ that had been written by God, who had used the language of mathematics because it was seen to be Platonically transcendent and timeless. While modern physics is no longer formally tied to Christian faith, its long association with religion lingers in the many references that physicists continue to make about ‘the mind of God’, and many contemporary proponents of a ‘theory of everything’ remain Platonists at heart.

It’s a startling thought, in an age when we can read the speed of our cars from our digitised dashboards, that somebody had to discover ‘velocity’

In order to articulate a more nuanced conception of what physics is, we need to offer an alternative to Platonism. We need to explain how the mathematics ‘arises’ in the world, in ways other than assuming that it was put there by some kind of transcendent being or process. To approach this question dispassionately, it is necessary to abandon the beautiful but loaded metaphor of the cosmic book — and all its authorial resonances — and focus, not on the creation of the world, but on the creation of physics as a science.

When we say that ‘mathematics is the language of physics’, we mean that physicists consciously comb the world for patterns that are mathematically describable; these patterns are our ‘laws of nature’. Since mathematical patterns proceed from numbers, much of the physicist’s task involves finding ways to extract numbers from physical phenomena. In the 16th and 17th centuries, philosophical discussion referred to this as the process of ‘quantification’; today we call it measurement. One way of thinking about modern physics is as an ever more sophisticated process of quantification that multiplies and diversifies the ways we extract numbers from the world, thus giving us the raw material for our quest for patterns or ‘laws’. This is no trivial task. Indeed, the history of physics has turned on the question of what can be measured and how.

Stop for a moment and take a look around you. What do you think can be quantified? What colours and forms present themselves to your eye? Is the room bright or dark? Does the air feel hot or cold? Are birds singing? What other sounds do you hear? What textures do you feel? What odours do you smell? Which, if any, of these qualities of experience might be measured?

In the early 14th century, a group of scholarly monks known as the calculatores at the University of Oxford began to think about this problem. One of their interests was motion, and they were the first to recognise the qualities we now refer to as ‘velocity’ and ‘acceleration’ — the former being the rate at which a body changes position, the latter, the rate at which the velocity itself changes. It’s a startling thought, in an age when we can read the speed of our cars from our digitised dashboards, that somebody had to discover ‘velocity’.

Yet despite the calculatores’ advances, the science of kinematics made barely any progress until Galileo and his contemporaries took up the baton in the late-16th century. In the intervening time, the process of quantification had to be extracted from a burden of dreams in which it became, frankly, bogged down. For along with motion, the calculatores were also interested in qualities such as sin and grace and they tried to find ways to quantify these as well. Between the calculatores and Galileo, students of quantification had to work out what they were going to exclude from the project. To put it bluntly, in order for the science of physics to get underway, the vision had to be narrowed.

How, exactly, this narrowing was to be achieved was articulated by the 17th-century French mathematician and philosopher René Descartes. What could a mathematically based science describe? Descartes’s answer was that the new natural philosophers must restrict themselves to studying matter in motion through space and time. Maths, he said, could describe the extended realm — or res extensa. Thoughts, feelings, emotions and moral consequences, he located in the ‘realm of thought’, or res cogitans, declaring them inaccessible to quantification, and thus beyond the purview of science. In making this distinction, Descartes did not divide mind from body (that had been done by the Greeks); he merely clarified the subject matter for a new physical science.

So what else apart from motion could be quantified? To a large degree, progress in physics has been made by slowly extending the range of answers. Take colour. At first blush, redness would seem to be an ineffable and irreducible quale. In the late 19th century, however, physicists discovered that each colour in the rainbow, when diffracted through a prism, corresponds to a different wavelength of light. Red light has a wavelength of around 700 nanometres, violet light around 400 nanometres. Colour can be correlated with numbers — both the wavelength and frequency of an electromagnetic wave. Here we have one half of our duality: the wave.
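
The translation from quality to quantity here is a one-line formula: frequency is the speed of light divided by wavelength, ν = c/λ, so red light at 700 nanometres corresponds to a frequency of (3.0 × 10^8 m/s) / (700 × 10^-9 m) ≈ 4.3 × 10^14 Hz.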

The discovery of electromagnetic waves was in fact one of the great triumphs of the quantification project. In the 1820s, Michael Faraday noticed that, if he sprinkled iron filings around a magnet, the fragments would spontaneously assemble into a pattern of lines that, he conjectured, were caused by a ‘magnetic field’. Physicists today accept fields as a primary aspect of nature but at the start of the Industrial Revolution, when philosophical mechanism was at its peak, Faraday’s peers scoffed. Invisible fields smacked of magic. Yet, later in the 19th century, James Clerk Maxwell showed that magnetic and electric fields were linked by a precise set of equations — today known as Maxwell’s Laws — that enabled him to predict the existence of radio waves. The quantification of these hitherto unsuspected aspects of our world — these hidden invisible ‘fields’ — has led to the whole gamut of modern telecommunications on which so much of modern life is now staged.
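
In modern vector notation, that ‘precise set of equations’ is compact enough to quote in full. In SI units, with charge density ρ and current density J:

\nabla \cdot \mathbf{E} = \rho / \varepsilon_0, \qquad \nabla \cdot \mathbf{B} = 0, \qquad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}

In empty space these combine into a wave equation whose propagation speed, c = 1/\sqrt{\mu_0 \varepsilon_0}, comes out equal to the measured speed of light; that numerical coincidence is what allowed Maxwell to predict electromagnetic waves, radio among them, before anyone had detected one.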

Turning to the other side of our duality – the particle – with a burgeoning array of electrical and magnetic equipment, physicists in the late 19th and early 20th centuries began to probe matter. They discovered that atoms were composed from parts holding positive and negative charge. The negative electrons were found to revolve around a positive nucleus in pairs, with each member of the pair in a slightly different state, or ‘spin’. Spin turns out to be a fundamental quality of the subatomic realm. Matter particles, such as electrons, have a spin value of one half. Particles of light, or photons, have a spin value of one. In short, one of the qualities that distinguishes ‘matter’ from ‘energy’ is the spin value of its particles.

We have seen how light acts like a wave, yet experiments over the past century have shown that under many conditions it behaves instead like a stream of particles. In the photoelectric effect (the explanation of which won Albert Einstein his Nobel Prize in 1921), individual photons knock electrons out of their atomic orbits. In Thomas Young’s famous double-slit experiment, first performed in the early 1800s, light behaves simultaneously like waves and particles. Here, a stream of detectably separate photons is mysteriously guided by a wave whose effect becomes manifest over a long period of time. What is the source of this wave and how does it influence billions of isolated photons separated by great stretches of time and space? The late Nobel laureate Richard Feynman — a pioneer of quantum field theory — stated in 1965 that the double-slit experiment lay at ‘the heart of quantum mechanics’. Indeed, physicists have been debating how to interpret its proof of light’s duality for the past 200 years.

Just as waves of light sometimes behave like particles of matter, particles of matter can sometimes behave like waves. In many situations, electrons are clearly particles: we fire them from electron guns inside the cathode-ray tubes of old-fashioned TV sets and each electron that hits the screen causes a tiny phosphor to glow. Yet, in orbiting around atoms, electrons behave like three-dimensional waves. Electron microscopes put the wave-quality of these particles to work; here, in effect, they act like short wavelengths of light.

Physics is not just another story about the world: it is a qualitatively different kind of story to those told in the humanities, in myths and religions

Wave-particle duality is a core feature of our world. Or rather, we should say, it is a core feature of our mathematical descriptions of our world. The duck-rabbits are everywhere, colonising the imagery of physicists like, well, rabbits. But what is critical to note here is that however ambiguous our images, the universe itself remains whole and is manifestly not fracturing into schizophrenic shards. It is this tantalising wholeness in the thing itself that drives physicists onward, like an eternally beckoning light that seems so teasingly near yet is always out of reach.

Instrumentally speaking, the project of quantification has led physicists to powerful insights and practical gain: the computer on which you are reading this article would not exist if physicists hadn’t discovered the equations that describe the band-gaps in semiconducting materials. Microchips, plasma screens and cellphones are all byproducts of quantification and, every decade, physicists identify new qualities of our world that are amenable to measurement, leading to new technological possibilities. In this sense, physics is not just another story about the world: it is a qualitatively different kind of story to those told in the humanities, in myths and religions. No language other than maths is capable of expressing interactions between particle spin and electromagnetic field strength. The physicists, with their equations, have shown us new dimensions of our world.

That said, we should be wary of claims about ultimate truth. While quantification, as a project, is far from complete, it is an open question as to what it might ultimately embrace. Let us look again at the colour red. Red is not just an electromagnetic phenomenon, it is also a perceptual and contextual phenomenon. Stare for a minute at a green square then look away: you will see an afterimage of a red square. No red light has been presented to your eyes, yet your brain will perceive a vivid red shape. As Goethe argued in the late-18th century, and Edwin Land (who invented Polaroid film in 1932) echoed, colour cannot be reduced to purely prismatic effects. It exists as much in our minds as in the external world. To put this into a personal context, no understanding of the electromagnetic spectrum will help me to understand why certain shades of yellow make me nauseous, while electric orange fills me with joy.

Descartes was no fool; by parsing reality into the res extensa and res cogitans he captured something critical about human experience. You do not need to be a hard-core dualist to imagine that subjective experience might not be amenable to mathematical law. For Douglas, ‘the attempt to force experience into logical categories of non-contradiction’ is the ‘final paradox’ of an obsessive search for purity. ‘But experience is not amenable [to this narrowing],’ she insists, and ‘those who make the attempt find themselves led into contradictions.’

Quintessentially, the qualities that are amenable to quantification are those that are shared. All electrons are essentially the same: given a set of physical circumstances, every electron will behave like any other. But humans are not like this. It is our individuality that makes us so infuriatingly human, and when science attempts to reduce us to the status of electrons it is no wonder that professors of literature scoff.

Douglas’s point about attempting to corral experience into logical categories of non-contradiction has obvious application to physics, particularly to recent work on the interface between quantum theory and relativity. One of the most mysterious findings of quantum science is that two or more subatomic particles can be ‘entangled’. Once particles are entangled, what we do to one immediately affects the other, even if the particles are hundreds of kilometres apart. Yet this contradicts a basic premise of special relativity, which states that no signal can travel faster than the speed of light. Entanglement suggests that either quantum theory or special relativity, or both, will have to be rethought.

More challenging still, consider what might happen if we tried to send two entangled photons to two separate satellites orbiting in space, as a team of Chinese physicists, working with the entanglement theorist Anton Zeilinger, is currently hoping to do. Here the situation is compounded by the fact that what happens in near-Earth orbit is affected by both special and general relativity. The details are complex, but suffice it to say that special relativity suggests that the motion of the satellites will cause time to appear to slow down, while the effect of the weaker gravitational field in space should cause time to speed up. Given this, it is impossible to say which of the photons would be received first at which satellite. To an observer on the ground, both photons should appear to arrive at the same time. Yet to an observer on satellite one, the photon at satellite two should appear to arrive first, while to an observer on satellite two the photon at satellite one should appear to arrive first. We are in a mire of contradiction and no one knows what would in fact happen here. If the Chinese experiment goes ahead, we might find that some radical new physics is required.
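
The competing clock effects can be written down to first order. Relative to a clock on the ground, an orbiting clock runs at a fractional rate of roughly

\frac{\Delta f}{f} \approx \frac{\Delta \Phi}{c^{2}} - \frac{v^{2}}{2c^{2}}

where ΔΦ is the difference in gravitational potential (the general-relativistic speed-up) and v is the orbital speed (the special-relativistic slow-down). For GPS satellites the gravitational term dominates, and the onboard clocks run fast by roughly 38 microseconds per day, an offset the system is engineered to correct; it is the interplay of these two terms, as judged by differently placed observers, that produces the ambiguity described above.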

To say that every possible version of their equations must be materially manifest strikes me as a kind of berserk literalism

You will notice that the ambiguity in these examples focuses on the issue of time — as do many paradoxes relating to relativity and quantum theory. Time indeed is a huge conundrum throughout physics, and paradoxes surround it at many levels of being. In Time Reborn: From the Crisis in Physics to the Future of the Universe (2013) the American physicist Lee Smolin argues that for 400 years physicists have been thinking about time in ways that are fundamentally at odds with human experience and therefore wrong. In order to extricate ourselves from some of the deepest paradoxes in physics, he says, its very foundations must be reconceived. In an op-ed in New Scientist in April this year, Smolin wrote:
The idea that nature consists fundamentally of atoms with immutable properties moving through unchanging space, guided by timeless laws, underlies a metaphysical view in which time is absent or diminished. This view has been the basis for centuries of progress in science, but its usefulness for fundamental physics and cosmology has come to an end.

In order to resolve contradictions between how physicists describe time and how we experience time, Smolin says physicists must abandon the notion of time as an unchanging ideal and embrace an evolutionary concept of natural laws.

This is radical stuff, and Smolin is well-known for his contrarian views — he has been an outspoken critic of string theory, for example. But at the heart of his book is a worthy idea: Smolin is against the reflexive reification of equations. As our mathematical descriptions of time are so starkly in conflict with our lived experience of time, it is our descriptions that will have to change, he says.

To put this into Douglas’s terms, the powers that have been attributed to physicists’ structure of ideas have been overreaching. ‘Attempts to force experience into logical categories of non-contradiction’ have, she would say, inevitably failed. From the contemplation of wave-particle pangolins we have been led to the limits of the linguistic system of physicists. Like Smolin, I have long believed that the ‘block’ conception of time that physics proposes is inadequate, and I applaud this thrilling, if also at times highly speculative, book. Yet, if we can fix the current system by reinventing its axioms, then (assuming that Douglas is correct) even the new system will contain its own pangolins.

In the early days of quantum mechanics, Niels Bohr liked to say that we might never know what ‘reality’ is. Bohr used John Wheeler’s coinage, calling the universe ‘a great smoky dragon’, and claiming that all we could do with our science was to create ever more predictive models. Bohr’s positivism has gone out of fashion among theoretical physicists, replaced by an increasingly hard-core Platonism. To say, as some string theorists do, that every possible version of their equations must be materially manifest strikes me as a kind of berserk literalism, reminiscent of the old Ptolemaics who used to think that every mathematical epicycle in their descriptive apparatus must represent a physically manifest cosmic gear.

We are veering here towards Douglas’s view of neurosis. Will we accept, at some point, that there are limits to the quantification project, just as there are to all taxonomic schemes? Or will we be drawn into ever more complex and expensive quests — CERN mark two, Hubble, the sequel — as we try to root out every lingering paradox? In Douglas’s view, ambiguity is an inherent feature of language that we must face up to, at some point, or drive ourselves to distraction.

3 June 2013

Review of Thinking with Whitehead: A Free and Wild Creation of Concepts (Notre Dame Philosophical Reviews)

2012.06.21
ISABELLE STENGERS
Thinking with Whitehead: A Free and Wild Creation of Concepts
Isabelle Stengers, Thinking with Whitehead: A Free and Wild Creation of Concepts, Michael Chase (tr.), Harvard University Press, 2011, 531pp., $49.95 (hbk), ISBN 9780674048034.

Reviewed by Roland Faber, Claremont School of Theology

Isabelle Stengers’ work on Whitehead was a long time in the making — as a work on Whitehead’s work, as an outcome of her thinking with Whitehead through different instantiations of her own writing, and as a process of translation from the French original. It is an important work, unusual not only for the bold generality with which it tries to characterize Whitehead’s philosophical work in its most important manifestations, but even more importantly, for its effort to present a radical alternative mode of contemporary thinking. One is almost tempted to say that the urgency of this book’s intensity is motivated by nothing less than Stengers’ immediate feeling of the importance of Whitehead’s work for the future of (human) civilization. Since we need to make life-and-death decisions regarding the directions we might (want to) take, the explication of Whitehead’s alternatives may be vital. Hence to think with Whitehead is to think alternatives in which we “sign on in advance to an adventure that will leave none of the terms we normally use as they were.” Yet, as a rule, Stengers is “with” Whitehead not only in sorting out such alternatives, but also in his non-confrontational method of peace-making, in which nothing “will be undermined or summarily denounced as a carrier of illusion.” (24)

The two parts of the book roughly bring to light the development of Whitehead’s thought and its shifting points of gravity, circling around two of its major developments.  One of these developments could be said to be temporal, since Whitehead’s philosophical work over time can be characterized as developing from a philosophy of nature (as it was still embedded in the discussion of a philosophy of science) to a metaphysics (that included everything that a philosophy of science has excluded). The other is more spatial, since it circles around the excluded middle between the philosophy of science (excluding mind) and a general metaphysics (of all worlds), namely, a cosmology of our real universe. In an interesting twist, not so common today in any of the standard fields of discourse, we could also agree with Bruno Latour, who in his introduction suggests that both developments, the temporal — how to overcome the bifurcation of nature — and the spatial — how to understand a cosmos of creative organisms — are again (and further) de-centered by the unusual Whiteheadian reintroduction of “God.” (xiii)

The first fourteen chapters that discuss the “temporal” development of Whitehead’s thought (“From the Philosophy of Nature to Metaphysics”) begin with a hermeneutical invitation to the reader to view the Whiteheadian adventure of thought as a dislocation from all commonly held beliefs and theories about nature and the world in general because it asks “questions that will separate them from every consensus.” (7) As its major problem and point of departure, Stengers identifies Whitehead’s criticism of the “bifurcation of nature,” that is, the constitutional division of the universe into mutually exclusive sections (which are often at war with one another because of this division). One section consists of what science finds to be real, but valueless, and the other of that which constitutes mind — a setup that reduces the first section to senseless motion and the second to mere “psychic additions.” (xii) At first exploring Whitehead’s The Concept of Nature, the beginning chapters draw out the contours of Whitehead’s reformulation of the concept of nature, implying that it must not avoid “what the concept of nature designates as ultimate: knowledge.” (41) In Whitehead’s view, knowledge and conceptualization become essential to the concept of nature. While the “goal is not to define a nature that is ‘knowable’ in the philosophers’ sense,” Whitehead defines nature and knowledge “correlatively” such that “‘what’ we perceive does indeed designate nature rather than the perceiving mind.” (44) Conversely, “exactness” is no longer an ideal, but “a thickness with a plurality of experiences occurring simultaneously — like a person walking by.” (55) With Bergson, Whitehead holds that such duration — an event — is the “foothold of the mind” (67) in nature. Being a standpoint, a perspective, paying attention to the aspects of its own integration, such a characterization of an event is meant to generate Whitehead’s argument, as unfolded in Science and the Modern World, against the “fallacy of misplaced concreteness” (which excludes standpoints by introducing exactness in describing vacuous matter) and, thereby, the bifurcation of nature. (113)

On the way to the cosmology of Process and Reality — itself “a labyrinth-book, a book about which one no longer knows whether it has an author, or whether it is not rather the book that has fashioned its author” (122) — Stengers examines the two unexpected metaphysical chapters of Science and the Modern World — on Abstraction and God — as urged by the aesthetic question within a universe, which defines itself by some kind of harmony and a rationality, that is, by faith in the order of a nature, that does not exclude organisms as exhibiting “living values.” (130) As it resists bifurcation, it enables us to reconcile science and philosophy. This is the moment where, as Stengers shows, Whitehead finds himself in a place where he needs to introduce the concept of God. This move is, however, not motivated by a “preliminary affirmation of His existence,” but by a

fundamental experience of humanity . . . of which no religion can be the privileged expression, although each one develops and collapses, from epoch to epoch, according to whether its doctrines, its rites, its commands, or its definitions do or do not evoke this vision, revive it, or inhibit it, giving it or failing to give it adequate expression (133).

The second part (“Cosmology”) features mainly Process and Reality. Stengers probes the uniqueness and necessity of speculative philosophy and its “intellectual intuition” (234) by exploring its criterion of reciprocal presupposition. (237) This expresses the impossibility of any bifurcation: “the ambition of speculative coherence is to escape the norms to which experiences, isolated by the logical, moral, empiricist, religious, and other stakes that privilege them, are” at “risk of ignoring” the mutuality of “each dancer’s center of gravity” with the “dancer’s spin.” This mutuality of movement requires speculative philosophy, which, in its very production, brings to existence the possibility of a thought ‘without gravity,’ without a privileged direction. The ‘neutral’ metaphysical thought of Science and the Modern World had already risked the adventure of trusting others ‘precursively’ at the moment when one accepts that one’s “own body is put off balance.” (239)

What, in such a world, is ultimately given, then? While in The Concept of Nature the Ultimate was Mind and in Science and the Modern World it was God, in Process and Reality it becomes Creativity. (255) Creativity affirms a universe of accidents, for which God introduces a requirement of the reciprocity of these accidents (265). Creativity is, like Deleuze’s “plane of immanence”, that “which insists and demands to be thought by the philosopher, but of which the philosopher is not in any way the creator.” (268)

Stengers’ distinctive mode of thought tries to avoid common dichotomies and to always highlight Whitehead’s alternative, carved out of the always present aura of complexities that surrounds any activity of becoming, interpretation and reflection. Therefore, she introduces the meaning and function of the Whiteheadian organization of organisms — each event being a “social effort, employing the whole universe” (275) — and the organization of thought (the obligations of speculative philosophy) — correcting the initial surplus of chaotic subjectivity (277). Both these forms of organization lead to “the most risky interpretation” (277) of empiricism as that which makes things hold together, neither crushed nor torn apart. Further investigating how occasions and philosophies function together (by dealing with what has been excluded), Stengers presents us with the fundamental importance of how “feeling” (or the transformation of scars) can offer new ways for (concepts of) life that testify to that which has been eliminated or neglected: how decisions can reduce the cost and victims they require (334) and, in actual and conceptual becoming, transform the status quo. (335) Whiteheadian feeling, of course, precedes consciousness and (even prior to perception) is the unconstrained reception that creates the events of its passing.

In chapters 21 and 22, God again enters the picture, not as rule of generality (metaphysically, aesthetically, or ethically), but as “divine endowment [that] thus corresponds to an individual possibility, not to what individuals should accomplish in the name of interest that transcend them.” (390) Divine intervention responds to “what is best for this impasse” (421), a proposition whose actualization is indeterminate by definition. Here, Whitehead’s metaphysics has rejected the normal/normative in favor of the relevant/valuable. (422) This again is related to the concepts of expression and importance in chapter 23, as “the way living societies can simultaneously canalize and be infected by what lurks [from the future]: originality.” (429)

Most interestingly, Stengers describes this interstitial space as a “sacrament” — the “unique sacrament of expression” — that in its “call for a sacramental plurality” conveys Whitehead’s understanding of “the cosmic meaning he confers upon expression and importance” in order to develop “a sociology of life” (435) for which signs are not only functional, but expressive. It is in this context that “Whitehead’s metaphysical God does not recognize his own, he does not read our hearts, he does not understand us better than we do ourselves, he does not demand our recognition or our gratitude, and we shall never contemplate him in his truth.” Rather, God “celebrates my relation to my self and my belongings, to my body, to my feelings, my intentions, my possibilities and perception.” (448)

If there is, for Stengers, a divine function of salvation regarding Whitehead’s God, it is that which only opens through following Whitehead’s call for a secularization of the notion of the divine. (469, 477) Nothing (not a soul) is lost (in this new secularism), although it is only saved in “the unimaginable divine experience.” (469) This “does not make God the being to whom one may say ‘Thou,’ for he has no other value than the difference he will make in the occasional experience that will derive from him its initial aim.” (477) For Stengers, Whitehead wanted to save God from the role assigned to God by the theological propositions that make God the mere respondent to the religious vision. (479) Instead, God affirms the “full solemnity of the world” (493) for us through a neutral metaphysics in which God stands for all appetite, but impersonally so — saving what is affirmed and excluded alike. (490)

Stengers concludes with one of the most astonishing characteristics of Whitehead’s philosophy: namely, his missing ethics. Instead of viewing this as a lack, she conceives his philosophy as ethos, ethos as habit, and habit as aesthetics, (515) “celebrating the adventure of impermanent syntheses.” This ethos, for Stengers, is not “critical wakefulness,” but “the difference between dream and nightmare” — a dream, a storytelling from within the Platonic cave, together with those who live and argue within it, but also enjoy together the living values that can be received at the interstices. (516-7) In the end, as in the beginning, the adventure of alternative thinking in Whitehead asks us to walk with him in his vectors of disarming politeness — by asking polite questions that one creature may address to another creature. (518)

If there is a weakness in Stengers’ rendering of Whitehead’s work, it is of a more generic nature, demonstrating its embeddedness in a wider cultural spirit or zeitgeist. Anyone with some knowledge of the history and development of the reception of, and scholarship on, Whitehead will not fail to notice that Stengers is not the only one to have rediscovered this Whitehead, the Whitehead of the alternative adventure, within the last twenty years. Her sporadic recourse to Deleuze functions only as a fleeting spark of light that, if slowed down, would illuminate the philosophic background against which current thinkers (including Stengers) have begun to view Whitehead. Although this remains almost undetected beneath the tectonic shifts of Stengers’ reconfiguration of Whitehead’s thought, one will find Stengers’ work to be the outcome of this same tradition. As with several of these newer approaches, one of the (unfortunate) fault-lines of Stengers’ endeavor is that, when its sources remain hidden, it contradicts the Whiteheadian spirit of recollection, rediscovery, and synthesis in ever new concrescences. Originality (creativity) must not suppress the traditions on which it stands; in particular, a hundred years of Whiteheadian scholarship in process theology is here left in silence. It is sad that a rediscovery of Whitehead should narrow the creative synthesis by being dominated by such a negative prehension. Granted, from afar one might not see the inner diversity and rich potential of process theology’s rhizomatic development; but to think that naming “God” (anew) in (Whitehead’s) philosophy today is original, when it in fact rehearses positions process theology has developed over the last century, still leaves me with a question: Is freedom from the past necessarily coupled with its oblivion?

In any case, Stengers’ Thinking with Whitehead is an important contribution to the current landscape of the rediscovery of Whitehead in philosophy and adjacent disciplines. It is also a gift for addressing urgent questions of survival and the “good and better life,” the envisioning of which Whitehead sees as a function of philosophy. May Stengers’ rendering of such an alternative congregation of thought for a new future of civilization steer us toward a more peaceful, polite, and less viciously violent vision.

When Exponential Progress Becomes Reality (Medium)

Niv Dror

“I used to say that this is the most important graph in all the technology business. I’m now of the opinion that this is the most important graph ever graphed.”

Steve Jurvetson

Moore’s Law

The expectation that your iPhone keeps getting thinner and faster every two years. Happy 50th anniversary.

Components get cheaper → computers get smaller → a lot of comparison tweets.

In 1965, Intel co-founder Gordon Moore made his original observation: over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years. The prediction was specific to semiconductors and stretched out only a decade. Its demise has long been predicted, and it will eventually come to an end, but it continues to hold to this day.
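
To get a feel for what a two-year doubling cadence implies, here is a minimal back-of-the-envelope sketch in Python. The 1965 baseline of 64 components is an illustrative assumption, not a figure from this article:

```python
# Back-of-the-envelope Moore's Law projection: transistor counts under a
# strict two-year doubling, starting from an assumed 1965 baseline.

BASELINE_YEAR = 1965
BASELINE_COUNT = 64        # illustrative assumption, not a sourced figure
DOUBLING_PERIOD = 2        # years per doubling

def projected_transistors(year):
    """Transistor count projected for `year` under a strict 2-year doubling."""
    doublings = (year - BASELINE_YEAR) // DOUBLING_PERIOD
    return BASELINE_COUNT * 2 ** doublings

for year in (1965, 1975, 1995, 2015):
    print(year, f"{projected_transistors(year):,}")
# 2015 comes out around 2 billion transistors, roughly the right order of
# magnitude for a 2015-era chip.
```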

Expanding beyond semiconductors and reshaping all kinds of businesses, including those not traditionally thought of as tech.

Yes, Box co-founder Aaron Levie is the official spokesperson for Moore’s Law, and we’re all perfectly okay with that. His cloud computing company would not be around without it. He’s grateful. We’re all grateful. Moore’s Law constantly gets referenced in conversations.

It has become both a prediction and an abstraction.

Expanding far beyond its origin as a transistor-centric metric.

But Moore’s Law of integrated circuits is only the most recent paradigm in a much longer and even more profound technological trend.

Humanity’s capacity to compute has been compounding for as long as we could measure it.

5 Computing Paradigms: Electromechanical computer built by IBM for the 1890 U.S. Census → Alan Turing’s relay-based computer that cracked the Nazi Enigma → Vacuum-tube computer that predicted Eisenhower’s win in 1952 → Transistor-based machines used in the first space launches → Integrated-circuit-based personal computer

The Law of Accelerating Returns

In his 1999 book The Age of Spiritual Machines, futurist and author Ray Kurzweil (now a Director of Engineering at Google) proposed “The Law of Accelerating Returns,” according to which the rate of change in a wide variety of evolutionary systems tends to increase exponentially. A specific paradigm, a method or approach to solving a problem (e.g., shrinking transistors on an integrated circuit as an approach to making more powerful computers), provides exponential growth until the paradigm exhausts its potential. When this happens, a paradigm shift, a fundamental change in the technological approach, occurs, enabling the exponential growth to continue.

Kurzweil explains:

It is important to note that Moore’s Law of Integrated Circuits was not the first, but the fifth paradigm to provide accelerating price-performance. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to Turing’s relay-based machine that cracked the Nazi enigma code, to the vacuum tube computer that predicted Eisenhower’s win in 1952, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer.
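
One way to picture the Law of Accelerating Returns is as a chain of S-curves: each paradigm grows roughly exponentially, saturates, and hands off to a successor operating at a far higher capacity. The toy model below is only a sketch of that idea; the five capacities, growth rates, and midpoints are invented parameters, not Kurzweil’s data:

```python
import math

# Toy model of the Law of Accelerating Returns: overall capability as the
# sum of successive logistic S-curves. Each paradigm saturates, and a
# successor takes over at ~100x the capacity. All parameters are invented
# for illustration; none come from Kurzweil's dataset.

def logistic(t, capacity, rate, midpoint):
    """S-curve rising from ~0 toward `capacity`, steepest at `midpoint`."""
    return capacity / (1 + math.exp(-rate * (t - midpoint)))

# Five paradigms: capacities 10^2, 10^4, ..., 10^10, midpoints 25 units apart.
paradigms = [(100 ** (i + 1), 0.5, 25 * i + 12.5) for i in range(5)]

for t in range(0, 126, 25):
    total = sum(logistic(t, cap, rate, mid) for cap, rate, mid in paradigms)
    print(f"t={t:3d}  capability ~ 10^{math.log10(total):.1f}")
# The log of the envelope climbs roughly linearly with t: exponential growth
# persists across paradigm shifts even though each paradigm saturates.
```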

This graph, which venture capitalist Steve Jurvetson describes as the most important graph ever graphed, is Kurzweil’s 110-year version of Moore’s Law. It spans the five computing paradigms that have contributed to the exponential growth in computing.

Each dot represents the best computational price-performance device of its day, and when plotted on a logarithmic scale, the dots fit the same double exponential curve spanning more than a century. This is a very long-lasting and predictable trend, and it lets us plan for a time beyond Moore’s Law without knowing the specifics of the paradigm shift ahead. The next paradigm will advance our ability to compute on a scale that is beyond our current ability to comprehend.

The Power of Exponential Growth

Human perception is linear; technological progress is exponential. Our brains are hardwired for linear expectations because that has always been our experience. But technology now progresses so fast that the past no longer looks like the present, and the present is nowhere near the future ahead. Then, seemingly out of nowhere, we find ourselves in a reality quite different from what we would expect.

Kurzweil uses the overall growth of the internet as an example. Charted on a linear scale, internet growth looks sudden and unexpected, whereas the same data graphed on a logarithmic scale tells a very predictable story. On the logarithmic graph, internet growth doesn’t come out of nowhere; it’s just presented in a way that is more intuitive for us to comprehend.
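
The effect is easy to reproduce: plot any exponentially growing series on a linear axis and it looks like a sudden explosion, while a logarithmic axis shows a straight, predictable line. Here is a minimal matplotlib sketch, where an arbitrary 80%-per-year growth rate stands in for real internet-growth data:

```python
import matplotlib.pyplot as plt

# The same exponential series plotted two ways. The 80% annual growth
# rate is an arbitrary stand-in, not a real internet-growth dataset.
years = list(range(1985, 2001))
hosts = [1000 * 1.8 ** (y - 1985) for y in years]

fig, (ax_linear, ax_log) = plt.subplots(2, 1, figsize=(6, 6))

ax_linear.plot(years, hosts)
ax_linear.set_title("Linear scale: growth looks sudden")

ax_log.plot(years, hosts)
ax_log.set_yscale("log")   # constant exponential rate -> straight line
ax_log.set_title("Log scale: the same growth is predictable")

fig.tight_layout()
plt.show()
```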

We are still prone to underestimate the progress that is coming, because it’s difficult to internalize the reality that we’re living in a world of exponential technological change, which is itself a fairly recent development. And it’s important to get a sense of the massive scale of advancements that the technologies of the future will enable, particularly now that we’ve reached what Kurzweil calls the “Second Half of the Chessboard.”

The reference is to the old story of the inventor who asks an emperor to be paid in rice: one grain on the first square of a chessboard, two on the second, four on the third, doubling with every square. (In the end the emperor realizes that he’s been tricked by exponents and has the inventor beheaded. In another version of the story, the inventor becomes the new emperor.)

It’s important to note that as the emperor and inventor went through the first half of the chessboard, things were fairly uneventful. The inventor was first given spoonfuls of rice, then bowls, then barrels, and by the end of the first half of the chessboard he had accumulated one large field’s worth (about 4 billion grains), which is when the emperor started to take notice. It was only as they progressed through the second half of the chessboard that the situation quickly deteriorated.

# of Grains on 1st half: 4,294,967,295

# of Grains on 2nd half: 18,446,744,069,414,584,320
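
A few lines of Python reproduce both counts, and the ratio between the halves shows why it is the second half where things deteriorate:

```python
# Grains of rice on a 64-square chessboard: 2**k grains on square k
# (counting squares from k = 0).

first_half = sum(2 ** k for k in range(32))        # squares 1..32
second_half = sum(2 ** k for k in range(32, 64))   # squares 33..64

print(f"1st half: {first_half:,}")    # 4,294,967,295 (= 2**32 - 1)
print(f"2nd half: {second_half:,}")   # 18,446,744,069,414,584,320 (= 2**64 - 2**32)
print(f"ratio:    {second_half // first_half:,}x") # exactly 2**32, ~4.3 billion
```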

Mind-bending nonlinear gains in computing are about to get a lot more real in our lifetime: there have been slightly more than 32 doublings of performance since the first programmable computers were invented, which puts us at the start of the second half of the chessboard.

Kurzweil’s Predictions

Kurzweil is known for making mind-boggling predictions about the future. And his track record is pretty good.

“…Ray is the best person I know at predicting the future of artificial intelligence.” —Bill Gates

Ray’s predictions for the future may sound crazy (they do sound crazy), but it’s important to note that it’s not about the specific prediction or the exact year. What’s important is what the predictions represent. They are based on an understanding of Moore’s Law and Ray’s Law of Accelerating Returns, an awareness of the power of exponential growth, and an appreciation that information technology follows exponential trends. They may sound crazy, but they are not plucked out of thin air.

And with that being said…

Second Half of the Chessboard Predictions

“By the 2020s, most diseases will go away as nanobots become smarter than current medical technology. Normal human eating can be replaced by nanosystems. The Turing test begins to be passable. Self-driving cars begin to take over the roads, and people won’t be allowed to drive on highways.”

“By the 2030s, virtual reality will begin to feel 100% real. We will be able to upload our mind/consciousness by the end of the decade.”

Not quite there yet…

“By the 2040s, non-biological intelligence will be a billion times more capable than biological intelligence (a.k.a. us). Nanotech foglets will be able to make food out of thin air and create any object in the physical world at a whim.”

These clones are cute.

“By 2045, we will multiply our intelligence a billionfold by linking wirelessly from our neocortex to a synthetic neocortex in the cloud.”

Multiplying our intelligence a billionfold by linking our neocortex to a synthetic neocortex in the cloud — what does that actually mean?

In March 2014 Kurzweil gave an excellent talk at the TED Conference. It was appropriately called: Get ready for hybrid thinking.

These are the highlights:

Nanobots will connect our neocortex to a synthetic neocortex in the cloud, effectively extending it.

Our thinking will then be a hybrid of biological and non-biological thinking (the non-biological portion is subject to the Law of Accelerating Returns and will grow exponentially).

The frontal cortex and neocortex are not really qualitatively different, so it’s a quantitative expansion of the neocortex (like adding processing power).

The last time we expanded our neocortex was about two million years ago. That additional quantity of thinking was the enabling factor for us to take a qualitative leap and advance language, science, art, technology, etc.

We’re going to expand our neocortex again, only this time it won’t be limited by a fixed, enclosed architecture. It will be expanded without limits, by connecting our brain directly to the cloud.

We already carry a supercomputer in our pocket, with unlimited access to all the world’s knowledge at our fingertips. Keeping in mind that we are prone to underestimate technological advancements (and that 2045 is not a hard deadline), is it really that much of a stretch to imagine a future where we’re always connected, directly from our brain?

Progress is underway. We’ll be able to reverse-engineer the neocortex within five years, and Kurzweil predicts that by 2030 we’ll be able to reverse-engineer the entire brain. His latest book is called How to Create a Mind… This is the reason Google hired Kurzweil.

Hybrid Human Machines

“We’re going to become increasingly non-biological…”

“We’ll also have non-biological bodies…”

“If the biological part went away it wouldn’t make any difference…”

“They [non-biological bodies] will be as realistic as real reality.”

Impact on Society

The technological singularity, “the hypothesis that accelerating progress in technologies will cause a runaway effect wherein artificial intelligence will exceed human intellectual capacity and control, thus radically changing civilization,” is beyond the scope of this article, but these advancements will absolutely have an impact on society. Which way is yet to be determined.

There may be some regret.

Politicians will not know who/what to regulate.

Evolution may take an unexpected twist.

The rich-poor gap will expand.

The unimaginable will become reality and society will change.