Monthly archive: December 2012

Classroom of the future (Correio Braziliense)

[The biggest problem with the article: the “future” is taken here as something unproblematic; a depoliticization of education]

JC e-mail 4640, December 10, 2012

A British university has developed shared desks with touch-sensitive screens to make teaching more engaging for children. When used well by the teacher, the equipment proved capable of improving pupils’ mathematical skills.

“Is anyone doing 30 plus 31?”, little Chelsea asks the classmates in her group. Adam promptly replies: “I’m doing all the take-away ones.” Jack, at the other end of the table, protests: “I’m doing the take-away ones too!” “Then I’ll do the adding ones,” the little English girl concludes. The conversation takes place in one of four groups of children aged 8 to 10 trying to create as many mathematical expressions as they can whose result is the number set by the teacher. It is a game in which they learn without even noticing that they are studying. The students’ interaction is a perfect example of what educators call collaborative learning. In this case, however, the mathematical discussion had an essential catalyst: technology.

A group of researchers from the School of Education at Durham University, in the United Kingdom, spent three years developing and testing a set of school desks with touch-sensitive screens designed specifically to support teaching. The invention proved highly effective in the experiments, which involved roughly 100 children. The digital desks, large enough to be used by about four pupils, are networked to a tablet controlled by the teacher and to the board at the centre of the classroom, on which the content of the desks can be enlarged and discussed by the whole class. The tests used activities aimed at learning mathematical expressions, and the results were then compared with those of groups that carried out a similar activity using traditional teaching tools: chalk, pencil and paper.

Using the new system, 45% of the pupils broadened their repertoire of numerical expressions; in the other group, the figure was 16%. Both groups improved in fluency, but those exposed to “the classroom of the future” also gained in flexibility. Fluency is the ability to apply procedures or formulas to everyday situations, as when a child who has learned subtraction uses it to work out the change due on a purchase.

Flexibility, in turn, means applying a range of solutions to new problems, rather than merely understanding how and when to use the procedures learned in class. “We don’t want children simply to know how to do things; we want them to be reflective, capable of abstraction, of thinking about their own thinking. That is flexibility: being able to carry the solution to a problem over to other, new contexts,” explains engineer and psychologist Francisco Antônio Fialho, a professor at the Federal University of Santa Catarina (UFSC) who did not take part in the study.

The results of the British experiment, published in the journal Learning and Instruction, show that, using the shared tables, the children were able to work as a team in search of new ways to solve and answer problems, coming up with creative solutions. The study puts into practice what Fialho calls the archetype of the magician, in which technology is the magic wand with which the pupil can make and create unimaginable things, including bringing the enchantment back to education.

“For a child, technology is a toy. She doesn’t like mathematics, but she does like to play. There, she is taking part in a little game with her classmates.” The psychologist believes that teaching how to do things is not enough: education today needs to encourage innovation in a collaborative way. “Just compare the making of a spearhead with that of a computer mouse. We know the spearhead was made by a single craftsman, whereas a whole crowd of people is involved in making a mouse. Everything we have today comes from the exchange of ideas.”

Challenge – The study’s lead author, Emma Mercier, believes technology can be used to support complex reasoning, thinking and interaction across subjects, while also increasing the enjoyment of the activity in the short term. Over the long term, however, she considers it essential to design activities that are challenging and engaging. “This generation of children is entering a world in which using technology will be the norm, and we need to prepare them for it. But it is important that they engage in learning activities that support a deep understanding of the subjects, rather than just using one particular technological device,” Mercier stresses. For her, technology allows teachers to use a kind of social pedagogy that can support learning.

“That relationship is everyone’s dream: putting technology at the service of the activity,” says Sérgio Abranches, a professor at the Núcleo de Estudos de Hipertexto e Tecnologias na Educação of the Federal University of Pernambuco (UFPE). “It can’t just be a way of livening up lessons and motivating pupils, although those are important elements. Beyond that instrumental use, we want it to genuinely contribute to learning.”

Tablets in Brazil – The introduction of technological devices into Brazilian classrooms is still taking its first steps. One of the federal government’s most recent measures was the purchase of around five thousand tablets, which are due to reach the country’s public schools next year. Teachers will be trained to use the devices in class and, according to the ministry, will have at their disposal some 15 thousand lessons, as well as literary works and textbooks in digital form.

For Sérgio Abranches, of UFPE, the tablet is a different way of deploying the technology, one that can yield interesting results. Even though it is an individual device, Abranches explains, it is generally used collectively by the pupils; everything depends on the pedagogical approach proposed by the teacher.

“We have observed that teachers have not yet assimilated digital technology. Digital natives – as we call those who were born into a world with the internet – cannot imagine the world without it. Teachers, on the other hand, do not have that culture. They sometimes live it in their personal lives, but not in their teaching practice.” As a result, the devices often end up being used timidly, as a tool rather than as part of a technological culture. “The first challenge is precisely this: to understand things through the logic of those who were born with all this apparatus,” says Abranches.

Eight examples of where the IPCC has missed the mark on its predictions and projections (The Daily Climate)

A “king tide” leaves parts of Sausalito, Calif., flooded in 2010. Disagreement over the impact of ice-sheet melting on sea-level rise has led the Intergovernmental Panel on Climate Change to omit their influence – and thus underestimate sea-level rise – in recent reports, a pattern the panel repeats with other key findings. Photo by Yanna B./flickr.

Dec. 6, 2012

Correction appended

By Glenn Scherer
The Daily Climate

Scientists will tell you: There are no perfect computer models. All are incomplete representations of nature, with uncertainty built into them. But one thing is certain: Several fundamental projections found in Intergovernmental Panel on Climate Change reports have consistently underestimated real-world observations, potentially leaving world governments in doubt as to how to guide climate policy.

Emissions

At the heart of all IPCC projections are “emission scenarios”: low-, mid- and high-range estimates of future carbon emissions. From these “what if” estimates flow projections for temperature, sea-level rise and more.

Projection: In 2001, the IPCC offered a range of fossil fuel and industrial emissions trends, from a best-case scenario of 7.7 billion tons of carbon released each year by 2010 to a worst-case scenario of 9.7 billion tons.

Reality: In 2010, global emissions from fossil fuels alone totaled 9.1 billion tons of carbon, according to the federal government’s Earth System Research Laboratory.

Why the miss? While technically within the range, scientists never expected emissions to rise so high so quickly, said IPCC scientist Christopher Field. The IPCC, for instance, failed to anticipate China’s economic growth, or resistance by the United States and other nations to curbing greenhouse gases.

“We really haven’t explored a world in which the emissions growth rate is as rapid as we have actually seen happen,” Field said.

Temperature

IPCC models use the emission scenarios discussed above to estimate average global temperature increases by the year 2100.

Projection: The IPCC 2007 assessment projected a worst-case temperature rise of 4.3° to 11.5° Fahrenheit, with a high probability of 7.2°F.

Reality: We are currently on track for a rise of between 6.3° and 13.3°F, with a high probability of an increase of 9.4°F by 2100, according to the Massachusetts Institute of Technology. Other modelers are getting similar results, including a study published earlier this month by the Global Carbon Project consortium confirming the likelihood of a 9ºF rise.

Why the miss? IPCC emission scenarios underestimated global CO2 emission rates, which means temperature rates were underestimated too. And it could get worse: IPCC projections haven’t included likely feedbacks such as large-scale melting of Arctic permafrost and subsequent release of large quantities of CO2 and methane, a greenhouse gas 20 times more potent, albeit shorter lived, in the atmosphere than carbon dioxide.

Arctic Meltdown

Five years ago, the summer retreat of Arctic ice wildly outdistanced all 18 IPCC computer models, amazing IPCC scientists. It did so again in 2012.

Projection: The IPCC has always confidently projected that the Arctic ice pack would hold until at least 2050, and perhaps well beyond 2100.

Reality: Summer ice is thinning faster than every climate projection, and today scientists predict an ice-free Arctic in years, not decades. Last summer, Arctic sea ice extent plummeted to 1.32 million square miles, the lowest level ever recorded – 50 percent below the long-term 1979 to 2000 average.

Why the miss? For scientists, it is increasingly clear that the models are under-predicting the rate of sea ice retreat because they are missing key real-world interactions.

“Sea ice modelers have speculated that the 2007 minimum was an aberration… a matter of random variability, noise in the system, that sea ice would recover.… That no longer looks tenable,” says IPCC scientist Michael Mann. “It is a stunning reminder that uncertainty doesn’t always act in our favor.”

Ice Sheets

Greenland and Antarctica are melting, even though IPCC said in 1995 that they wouldn’t be.

Projection: In 1995, IPCC projected “little change in the extent of the Greenland and Antarctic ice sheets… over the next 50-100 years.” In 2007 IPCC embraced a drastic revision: “New data… show[s] that losses from the ice sheets of Greenland and Antarctica have very likely contributed to sea level rise over 1993 to 2003.”

Reality: Today, ice loss in Greenland and Antarctica is trending at least 100 years ahead of projections compared to IPCC’s first three reports.

Why the miss? “After 2001, we began to realize there were complex dynamics at work – ice cracks, lubrication and sliding of ice sheets,” that were melting ice sheets quicker, said IPCC scientist Kevin Trenberth. New feedbacks unknown to past IPCC authors have also been found. A 2012 study, for example, showed that the reflectivity of Greenland’s ice sheet is decreasing, causing ice to absorb more heat, likely escalating melting.

Sea-Level Rise

The fate of the world’s coastlines has become a classic example of how the IPCC, when confronted with conflicting science, tends to go silent.

Projection: In the 2001 report, the IPCC projected a sea rise of 2 millimeters per year. The worst-case scenario in the 2007 report, which looked mostly at thermal expansion of the oceans as temperatures warmed, called for up to 1.9 feet of sea-level rise by century’s end.

Reality: Observed sea-level rise has averaged 3.3 millimeters per year since 1990. By 2009, various studies that included ice melt offered drastically higher projections of between 2.4 and 6.2 feet of sea-level rise by 2100.
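A quick bit of arithmetic (mine, not the article’s) shows why that gap matters: simply extrapolating the observed rate for the rest of the century yields only about a foot of rise, far below the ice-melt-inclusive projections.

```latex
% Extrapolating the observed rate over the roughly 90 years from 2010 to 2100:
\[
3.3~\mathrm{mm/yr} \times 90~\mathrm{yr} \approx 300~\mathrm{mm}
  \approx 0.3~\mathrm{m} \approx 1~\mathrm{ft},
\]
% so projections of 2.4-6.2 ft by 2100 imply a substantial acceleration,
% driven largely by ice-sheet melt rather than thermal expansion alone.
```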

Why the miss? IPCC scientists couldn’t agree on a value for the contribution melting Greenland and Antarctic ice sheets would add to sea-level rise. So they simply left out the data to reach consensus. Science historian Naomi Oreskes calls this – one of IPCC’s biggest underestimates – “consensus by omission.”

Ocean Acidification

To its credit, the IPCC admits to vast climate change unknowns. Ocean acidification is one such impact.

Projection: Unmentioned as a threat in the 1990, 1995 and 2001 IPCC reports. First recognized in 2007, when IPCC projected acidification of between 0.14 and 0.35 pH units by 2100. “While the effects of observed ocean acidification on the marine biosphere are as yet undocumented,” said the report, “the progressive acidification of oceans is expected to have negative impacts on marine shell-forming organisms (e.g. corals) and their dependent species.”

Reality: The world’s oceans absorb about a quarter of the carbon dioxide humans release annually into the atmosphere. Since the start of the Industrial Revolution, the pH of surface ocean waters has fallen by 0.1 pH units. Since the pH scale is logarithmic, this change represents a stunning 30 percent increase in acidity.
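As a rough check on that figure (a back-of-the-envelope calculation, not taken from the IPCC reports): pH is the negative base-10 logarithm of the hydrogen-ion concentration, so a drop of 0.1 pH units multiplies that concentration by a fixed factor.

```latex
% A 0.1-unit drop in pH scales the hydrogen-ion concentration [H+] by
\[
\frac{[\mathrm{H^+}]_{\text{today}}}{[\mathrm{H^+}]_{\text{pre-industrial}}}
  = 10^{\Delta \mathrm{pH}} = 10^{0.1} \approx 1.26,
\]
% i.e. an increase of roughly 26-30 percent in acidity, consistent with the
% "30 percent" figure quoted above.
```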

Why the miss? Scientists didn’t have the data. They began studying acidification by the late 1990s, but there weren’t many papers on the topic until mid-2000, missing the submission deadline for IPCC’s 2001 report. Especially alarming are new findings that ocean temperatures and currents are causing parts of the seas to become acidic far faster than expected, threatening oysters and other shellfish.

National Oceanic and Atmospheric Administration chief Jane Lubchenco has called acidification the “equally evil twin” to global warming.

Thawing Tundra

Some carbon-cycle feedbacks that could vastly amplify climate change – especially a massive release of carbon and methane from thawing permafrost – are extremely hard to model.

Projection: In 2007, IPCC reported with “high confidence” that “methane emissions from tundra… and permafrost have accelerated in the past two decades, and are likely to accelerate further.” However, the IPCC offered no projections regarding permafrost melt.

Reality: Scientists estimate that the world’s permafrost holds 1.5 trillion tons of frozen carbon. That worries scientists: The Arctic is warming faster than anywhere else on earth, and researchers are seeing soil temperatures climb rapidly, too. Some permafrost degradation is already occurring.

Large-scale tundra wildfires in 2012 added to the concern.

Why the miss? This is controversial science, with some researchers saying the Arctic tundra is stable, others saying it will defrost only over long periods of time, and still more convinced we are on the verge of a tipping point, where the tundra thaws rapidly and catastrophically. A major 2005 study, for instance, warned that the entire top 11 feet of global permafrost could disappear by century’s end, with potentially cataclysmic climate impacts.

The U.N. Environmental Programme revealed this week that IPCC’s fifth assessment, due for release starting in September, 2013, will again “not include the potential effects of the permafrost carbon feedback on global climate.”

Tipping points

The IPCC has been silent on tipping points – non-linear “light switch” moments when the climate system abruptly shifts from one paradigm to another.

Projection: IPCC has made no projections regarding tipping-point thresholds.

Reality: The scientific jury is still out as to whether we have reached any climate thresholds – a point of no return for, say, an ice-free Arctic, a Greenland meltdown, the slowing of the North Atlantic Ocean circulation, or permanent changes in large-scale weather patterns like the jet stream, El Niño or monsoons. The trouble with tipping points is they’re hard to spot until you’ve passed one.

Why the miss? Blame the computers: these non-linear events are notoriously hard to model. But with scientists now recognizing the sizeable threat tipping points represent, some projections will be included in the 2013-14 assessment.

Correction (Dec. 6, 2012): Earlier editions incorrectly compared global carbon dioxide emissions against carbon emissions scenarios. Carbon dioxide is heavier, incorrectly skewing the comparison. Global use of fossil fuels in 2010 produced about 30 billion tons of carbon dioxide but only 9.1 billion tons of carbon, putting emissions within the extreme end of IPCC scenarios. The story has been changed to reflect that.
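For readers who want to reproduce the unit conversion behind the correction (standard chemistry, not text from the article): carbon dioxide is heavier than the carbon it contains by the ratio of their molar masses, roughly 44 to 12.

```latex
% Converting a mass of carbon to the corresponding mass of CO2 uses the
% molar-mass ratio 44/12 (CO2 relative to C):
\[
9.1~\text{billion tons C} \times \frac{44}{12}
  \approx 9.1 \times 3.67
  \approx 33~\text{billion tons CO}_2 ,
\]
% which matches the "about 30 billion tons of carbon dioxide" in the
% correction, while the IPCC scenarios are expressed in tons of carbon.
```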

© Glenn Scherer, 2012. All rights reserved.

Graphic of emissions scenario courtesy U.S. Global Change Research Program. Photo of activist warning of 6ºC warming © Adela Nistora. Graphic showing Arctic summer ice projections vs. observations by the Vancouver Observer.

Glenn Scherer is senior editor of Blue Ridge Press, a news service that has been providing environmental commentary and news to U.S. newspapers since 2007.

DailyClimate.org is a foundation-funded news service covering climate change. Contact editor Douglas Fischer at dfischer [at] dailyclimate.org

Scientists Pioneer Method to Predict Environmental Collapse (Science Daily)

Researcher Enlou Zhang takes a core sample from the bed of Lake Erhai in China. (Credit: University of Southampton)

Nov. 19, 2012 — Scientists at the University of Southampton are pioneering a technique to predict when an ecosystem is likely to collapse, which may also have potential for foretelling crises in agriculture, fisheries or even social systems.

The researchers have applied a mathematical model to a real world situation, the environmental collapse of a lake in China, to help prove a theory which suggests an ecosystem ‘flickers’, or fluctuates dramatically between healthy and unhealthy states, shortly before its eventual collapse.

Head of Geography at Southampton, Professor John Dearing explains: “We wanted to prove that this ‘flickering’ occurs just ahead of a dramatic change in a system — be it a social, ecological or climatic one — and that this method could potentially be used to predict future critical changes in other impacted systems in the world around us.”

A team led by Dr Rong Wang extracted core samples from sediment at the bottom of Lake Erhai in Yunnan province, China and charted the levels and variation of fossilised algae (diatoms) over a 125-year period. Analysis of the core sample data showed the algae communities remained relatively stable up until about 30 years before the lake’s collapse into a turbid or polluted state. However, the core samples for these last three decades showed much fluctuation, indicating there had been numerous dramatic changes in the types and concentrations of algae present in the water — evidence of the ‘flickering’ before the lake’s final definitive change of state.
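The “flickering” signal described here is essentially a rise in the variability of the proxy record in the decades before the transition. The sketch below is a generic illustration of that idea on synthetic data; it is not the authors’ code, and the window length and series are assumptions. It shows how a rolling variance (and lag-1 autocorrelation) picks up the noisier pre-collapse period.

```python
# Illustrative sketch only (not the study's analysis): early-warning
# "flickering" indicators computed as rolling variance and lag-1
# autocorrelation of a proxy time series. The series below is synthetic; the
# real study used diatom counts from dated sediment cores of Lake Erhai.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic record: a long stable period, a noisier "flickering" period,
# then an abrupt shift to a new (collapsed) state.
stable     = rng.normal(loc=10.0, scale=0.5, size=95)
flickering = rng.normal(loc=10.0, scale=2.5, size=30)
collapsed  = rng.normal(loc=3.0,  scale=0.5, size=20)
series = np.concatenate([stable, flickering, collapsed])

def rolling_indicators(x, window=15):
    """Rolling variance and lag-1 autocorrelation over a sliding window."""
    var, ac1 = [], []
    for i in range(len(x) - window + 1):
        w = x[i:i + window]
        var.append(w.var())
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

variance, autocorr = rolling_indicators(series)

# Compare windows fully inside the stable period with windows fully inside
# the flickering period.
print("rolling variance, stable period:     %.2f" % variance[:80].mean())
print("rolling variance, flickering period: %.2f" % variance[95:111].mean())
```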

Rong Wang comments: “By using the algae as a measure of the lake’s health, we have shown that its eco-system ‘wobbled’ before making a critical transition — in this instance, to a turbid state.

“Dramatic swings can be seen in other data, suggesting large external impacts on the lake over a long time period — for example, pollution from fertilisers, sewage from fields and changes in water levels — caused the system to switch back and forth rapidly between alternate states. Eventually, the lake’s ecosystem could no longer cope or recover — losing resilience and reaching what is called a ‘tipping point’ and collapsing altogether.”

The researchers hope the method they have trialled in China could be applied to other regions and landscapes.

Co-author Dr Pete Langdon comments: “In this case, we used algae as a marker of how the lake’s ecosystem was holding-up against external impacts — but who’s to say we couldn’t use this method in other ways? For example, perhaps we should look for ‘flickering’ signals in climate data to try and foretell impending crises?”

Journal Reference:

  1. Rong Wang, John A. Dearing, Peter G. Langdon, Enlou Zhang, Xiangdong Yang, Vasilis Dakos, Marten Scheffer. Flickering gives early warning signals of a critical transition to a eutrophic lake state. Nature, 2012; DOI: 10.1038/nature11655

Social Synchronicity: Research Finds a Connection Between Bonding and Matched Movements (Science Daily)

A new study finds that body-movement synchronization between two participants increases following a short session of cooperative training, suggesting that our ability to synchronize body movements is a measurable indicator of social interaction. (Credit: © Yuri Arcurs / Fotolia)

Dec. 12, 2012 — Humans have a tendency to spontaneously synchronize their movements. For example, the footsteps of two friends walking together may synchronize, although neither individual is consciously aware that it is happening. Similarly, the clapping hands of an audience will naturally fall into synch. Although this type of synchronous body movement has been observed widely, its neurological mechanism and its role in social interactions remain obscure. In a new study, led by cognitive neuroscientists at the California Institute of Technology (Caltech), researchers found that body-movement synchronization between two participants increases following a short session of cooperative training, suggesting that our ability to synchronize body movements is a measurable indicator of social interaction.

“Our findings may provide a powerful tool for identifying the neural underpinnings of both normal social interactions and impaired social interactions, such as the deficits that are often associated with autism,” says Shinsuke Shimojo, Gertrude Baltimore Professor of Experimental Psychology at Caltech and senior author of the study.

Shimojo, along with former postdoctoral scholar Kyongsik Yun, and Katsumi Watanabe, an associate professor at the University of Tokyo, presented their work in a paper published December 11 in Scientific Reports, an online and open-access journal from the Nature Publishing Group.

For their study, the team evaluated the hypothesis that synchronous body movement is the basis for more explicit social interaction by measuring the amount of fingertip movement between two participants who were instructed to extend their arms and point their index fingers toward one another — much like the famous scene in E.T. between the alien and Elliott. They were explicitly instructed to keep their own fingers as stationary as possible while keeping their eyes open. The researchers simultaneously recorded the neuronal activity of each participant using electroencephalography, or EEG, recordings. Their finger positions in space were recorded by a motion-capture system.

The participants repeated the task eight times; the first two rounds were called pretraining sessions and the last two were posttraining sessions. The four sessions in between were the cooperative training sessions, in which one person — a randomly chosen leader — made a sequence of large finger movements, and the other participant was instructed to follow the movements. In the posttraining sessions, finger-movement correlation between the two participants was significantly higher compared to that in the pretraining sessions. In addition, socially and sensorimotor-related brain areas were more synchronized between the brains, but not within the brain, in the posttraining sessions. According to the researchers, this experiment, while simple, is novel in that it allows two participants to interact subconsciously while the amount of movement that could potentially disrupt measurement of the neural signal is minimized.
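A minimal sketch of the basic quantity the study reports: body-movement synchrony measured as the correlation between two participants’ movement traces, compared before and after training. The traces below are synthetic stand-ins, and the sampling rate and noise levels are assumptions, not values from the paper.

```python
# Minimal sketch (not the study's analysis code): quantify movement synchrony
# as the Pearson correlation between two fingertip-position traces, before and
# after a cooperative training session. Traces here are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 6000)               # 60 s at an assumed 100 Hz

drift = 0.5 * np.sin(2 * np.pi * 0.1 * t)  # slow shared postural sway

# Pre-training: mostly independent jitter around the shared sway.
pre_a = drift + rng.normal(scale=0.4, size=t.size)
pre_b = drift + rng.normal(scale=0.4, size=t.size)

# Post-training: the two traces track each other more closely (less noise).
post_a = drift + rng.normal(scale=0.1, size=t.size)
post_b = drift + rng.normal(scale=0.1, size=t.size)

def sync(x, y):
    """Pearson correlation between two 1-D movement traces."""
    return np.corrcoef(x, y)[0, 1]

print(f"pre-training synchrony:  {sync(pre_a, pre_b):.2f}")
print(f"post-training synchrony: {sync(post_a, post_b):.2f}")
```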

“The most striking outcome of our study is that not only the body-body synchrony but also the brain-brain synchrony between the two participants increased after a short period of social interaction,” says Yun. “This may open new vistas to study the brain-brain interface. It appears that when a cooperative relationship exists, two brains form a loose dynamic system.”

The team says this information may be potentially useful for romantic or business partner selection.

“Because we can quantify implicit social bonding between two people using our experimental paradigm, we may be able to suggest a more socially compatible partnership in order to maximize matchmaking success rates, by preexamining body synchrony and its increase during a short cooperative session,” explains Yun.

As part of the study, the team also surveyed the subjects to rank certain social personality traits, which they then compared to individual rates of increased body synchrony. For example, they found that the participants who expressed the most social anxiety showed the smallest increase in synchrony after cooperative training, while those who reported low levels of anxiety had the highest increases in synchrony. The researchers plan to further evaluate the nature of the direct causal relationship between synchronous body movement and social bonding. Further studies may explore whether a more complex social interaction, such as singing together or being teamed up in a group game, increases synchronous body movements among the participants.

“We may also apply our experimental protocol to better understand the nature and the neural correlates of social impairment in disorders where social deficits are a common symptom, as in schizophrenia or autism,” says Shimojo.

The title of the Scientific Reports paper is “Interpersonal body and neural synchronization as a marker of implicit social interaction.” Funding for this research was provided by the Japan Science and Technology Agency’s CREST and the Tamagawa-Caltech gCOE (global Center Of Excellence) programs.

Journal Reference:

  1. Kyongsik Yun, Katsumi Watanabe, Shinsuke Shimojo. Interpersonal body and neural synchronization as a marker of implicit social interaction. Scientific Reports, 2012; DOI: 10.1038/srep00959

Dead Guts Spill History of Extinct Microbes: Fecal Samples from Archeological Sites Reveal Evolution of Human Gut Microbes (Science Daily)

This shows microbiomes across time and populations. (Credit: Tito RY, Knights D, Metcalf J, Obregon-Tito AJ, Cleeland L, et al. (2012) Insights from Characterizing Extinct Human Gut Microbiomes. PLoS ONE 7(12): e51146. doi:10.1371/journal.pone.0051146)

Dec. 12, 2012 — Extinct microbes in fecal samples from archaeological sites across the world resemble those found in present-day rural African communities more than they resemble the microbes found in the gut of cosmopolitan US adults, according to research published December 12 in the open-access journal PLOS ONE by Cecil Lewis and colleagues from the University of Oklahoma.

The researchers analyzed 1,400- to 8,000-year-old fecal samples preserved at three archaeological sites: natural mummies from Caserones in northern Chile, and samples from Hinds Cave in the southern US and Rio Zape in northern Mexico. They also used samples from Otzi the Iceman and a soldier frozen on a glacier for nearly a century. They compared the now-extinct microbes in these samples to microbes present in current-day soil and compost, as well as to the microbes present in the mouths, guts and skin of people in rural African communities and of cosmopolitan US adults.

The authors discovered that the extinct human microbes from natural mummies closely resembled compost samples, while one sample from Mexico was found to match that from a rural African child. Overall, the extinct microbial communities were more similar to those from present rural populations than those from cosmopolitan ones. The study concludes, “These results suggest that the modern cosmopolitan lifestyle resulted in a dramatic change to the human gut microbiome.”

As Lewis explains, “It is becoming accepted that modern aseptic and antibiotic practices are often beneficial but come with a price, such as compromising the natural development of our immune system through changing the relationship we had with microbes ancestrally. What is unclear is what that ancestral state looked like. This paper demonstrates that we can use ancient human biological samples to learn about these ancestral relationships, despite the challenges of subsequent events like degradation and contamination.”

Journal Reference:

  1. Raul Y. Tito, Dan Knights, Jessica Metcalf, Alexandra J. Obregon-Tito, Lauren Cleeland, Fares Najar, Bruce Roe, Karl Reinhard, Kristin Sobolik, Samuel Belknap, Morris Foster, Paul Spicer, Rob Knight, Cecil M. Lewis. Insights from Characterizing Extinct Human Gut Microbiomes. PLoS ONE, 2012; 7 (12): e51146 DOI: 10.1371/journal.pone.0051146

Do We Live in a Computer Simulation Run by Our Descendants? Researchers Say Idea Can Be Tested (Science Daily)

The conical (red) surface shows the relationship between energy and momentum in special relativity, a fundamental theory concerning space and time developed by Albert Einstein, and is the expected result if our universe is not a simulation. The flat (blue) surface illustrates the relationship between energy and momentum that would be expected if the universe is a simulation with an underlying cubic lattice. (Credit: Martin Savage)

Dec. 10, 2012 — A decade ago, a British philosopher put forth the notion that the universe we live in might in fact be a computer simulation run by our descendants. While that seems far-fetched, perhaps even incomprehensible, a team of physicists at the University of Washington has come up with a potential test to see if the idea holds water.

The concept that current humanity could possibly be living in a computer simulation comes from a 2003 paper published in Philosophical Quarterly by Nick Bostrom, a philosophy professor at the University of Oxford. In the paper, he argued that at least one of three possibilities is true:

  • The human species is likely to go extinct before reaching a “posthuman” stage.
  • Any posthuman civilization is very unlikely to run a significant number of simulations of its evolutionary history.
  • We are almost certainly living in a computer simulation.

He also held that “the belief that there is a significant chance that we will one day become posthumans who run ancestor simulations is false, unless we are currently living in a simulation.”

With current limitations and trends in computing, it will be decades before researchers will be able to run even primitive simulations of the universe. But the UW team has suggested tests that can be performed now, or in the near future, that are sensitive to constraints imposed on future simulations by limited resources.

Currently, supercomputers using a technique called lattice quantum chromodynamics and starting from the fundamental physical laws that govern the universe can simulate only a very small portion of the universe, on the scale of one 100-trillionth of a meter, a little larger than the nucleus of an atom, said Martin Savage, a UW physics professor.

Eventually, more powerful simulations will be able to model on the scale of a molecule, then a cell and even a human being. But it will take many generations of growth in computing power to be able to simulate a large enough chunk of the universe to understand the constraints on physical processes that would indicate we are living in a computer model.

However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.

The supercomputers performing lattice quantum chromodynamics calculations essentially divide space-time into a four-dimensional grid. That allows researchers to examine what is called the strong force, one of the four fundamental forces of nature and the one that binds subatomic particles called quarks and gluons together into neutrons and protons at the core of atoms.

“If you make the simulations big enough, something like our universe should emerge,” Savage said. Then it would be a matter of looking for a “signature” in our universe that has an analog in the current small-scale simulations.

Savage and colleagues Silas Beane of the University of New Hampshire, who collaborated while at the UW’s Institute for Nuclear Theory, and Zohreh Davoudi, a UW physics graduate student, suggest that the signature could show up as a limitation in the energy of cosmic rays.

In a paper they have posted on arXiv, an online archive for preprints of scientific papers in a number of fields, including physics, they say that the highest-energy cosmic rays would not travel along the edges of the lattice in the model but would travel diagonally, and they would not interact equally in all directions as they otherwise would be expected to do.
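An order-of-magnitude sketch of the idea (my illustration, not the paper’s detailed bound): on a cubic lattice of spacing b, momenta are cut off at roughly π/b, so the most energetic cosmic rays ever observed would bound how fine such a lattice must be.

```latex
% On a lattice of spacing b the largest representable momentum is about
% pi/b (natural units), so a cutoff at the highest observed cosmic-ray
% energies, E ~ 10^20 eV (the GZK scale), would correspond to
\[
b \;\lesssim\; \frac{\pi\hbar c}{E}
  \;\approx\; \frac{\pi \times 197~\mathrm{MeV\,fm}}{10^{14}~\mathrm{MeV}}
  \;\approx\; 6 \times 10^{-12}~\mathrm{fm}
  \;\approx\; 6 \times 10^{-27}~\mathrm{m}.
\]
% A coarser lattice would cut off and distort the cosmic-ray spectrum and
% its angular distribution, which is the kind of signature described above.
```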

“This is the first testable signature of such an idea,” Savage said.

If such a concept turned out to be reality, it would raise other possibilities as well. For example, Davoudi suggests that if our universe is a simulation, then those running it could be running other simulations as well, essentially creating other universes parallel to our own.

“Then the question is, ‘Can you communicate with those other universes if they are running on the same platform?'” she said.

Journal References:

  1. Silas R. Beane, Zohreh Davoudi, Martin J. Savage. Constraints on the Universe as a Numerical Simulation. arXiv, 2012 [link]
  2. Nick Bostrom. Are You Living in a Computer Simulation? Philosophical Quarterly, 2003; Vol. 53, No. 211, pp. 243-255 [link]

‘Missing’ Polar Weather Systems Could Impact Climate Predictions (Science Daily)

Intense but small-scale polar storms could make a big difference to climate predictions according to new research. (Credit: NEODAAS / University of Dundee)

Dec. 16, 2012 — Intense but small-scale polar storms could make a big difference to climate predictions, according to new research from the University of East Anglia and the University of Massachusetts.

Difficult-to-forecast polar mesoscale storms occur frequently over the polar seas; however, they are missing in most climate models.

Research published Dec. 16 in Nature Geoscience shows that their inclusion could paint a different picture of climate change in years to come.

Polar mesoscale storms are capable of producing hurricane-strength winds which cool the ocean and lead to changes in its circulation.

Prof Ian Renfrew, from UEA’s School of Environmental Sciences, said: “These polar lows are typically under 500 km in diameter and over within 24-36 hours. They’re difficult to predict, but we have shown they play an important role in driving large-scale ocean circulation.

“There are hundreds of them a year in the North Atlantic, and dozens of strong ones. They create a lot of stormy weather, strong winds and snowfall — particularly over Norway, Iceland, and Canada, and occasionally over Britain, such as in 2003 when a massive dump of snow brought the M11 to a standstill for 24 hours.

“We have shown that adding polar storms into computer-generated models of the ocean results in significant changes in ocean circulation — including an increase in heat travelling north in the Atlantic Ocean and more overturning in the Sub-polar seas.

“At present, climate models don’t have a high enough resolution to account for these small-scale polar lows.

“As Arctic Sea ice continues to retreat, polar lows are likely to migrate further north, which could have consequences for the ‘thermohaline’ or northward ocean circulation — potentially leading to it weakening.”

Alan Condron from the University of Massachusetts said: “By simulating polar lows, we find that the area of the ocean that becomes denser and sinks each year increases and causes the amount of heat being transported towards Europe to intensify.

“The fact that climate models are not simulating these storms is a real problem because these models will incorrectly predict how much heat is being moved northward towards the poles. This will make it very difficult to reliably predict how the climate of Europe and North America will change in the near-future.”

Prof Renfrew added: “Climate models are always improving, and there is a trade-off between the resolution of the model, the complexity of the model, and the number of simulations you can carry out. Our work suggests we should put some more effort into resolving such storms.”

‘The impact of polar mesoscale storms on Northeast Atlantic ocean circulation’ by Alan Condron from the University of Massachusetts (US) and Ian Renfrew from UEA (UK), is published in Nature Geoscience on December 16, 2012.

Journal Reference:

  1. Alan Condron, Ian A. Renfrew. The impact of polar mesoscale storms on northeast Atlantic Ocean circulation. Nature Geoscience, 2012; DOI: 10.1038/ngeo1661

Physicist Happens Upon Rain Data Breakthrough (Science Daily)

John Lane looks over data recorded from his laser system as he refines his process and formula to calibrate measurements of raindrops. (Credit: NASA/Jim Grossmann)

Dec. 3, 2012 — A physicist and researcher who set out to develop a formula to protect Apollo sites on the moon from rocket exhaust may have happened upon a way to improve weather forecasting on Earth.

Working in his backyard during rain showers and storms, John Lane, a physicist at NASA’s Kennedy Space Center in Florida, found that the laser and reflector he was developing to track lunar dust also could determine accurately the size of raindrops, something weather radar and other meteorological systems estimate, but don’t measure.

The particular quantity measured by the laser system is called the “second moment of the size distribution,” which corresponds to the average cross-sectional area of the raindrops passing through the laser beam.
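For context, here is the standard definition of that quantity (textbook background, not a formula quoted from NASA): the n-th moment of a drop-size distribution N(D) weights each drop diameter D by D^n, and the second moment is what ties the laser measurement to cross-sectional area.

```latex
% n-th moment of a drop-size distribution N(D) (number of drops per unit
% volume per unit diameter D), and the resulting mean cross-sectional area:
\[
M_n = \int_0^{\infty} N(D)\,D^{\,n}\,\mathrm{d}D ,
\qquad
\bar{A} = \frac{\pi}{4}\,\frac{M_2}{M_0}.
\]
% Weather radar reflectivity, by contrast, is proportional to the sixth
% moment, which is one reason radar can only estimate drop sizes rather
% than measure them.
```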

“It’s not often that you’re studying lunar dust and it ends up producing benefits in weather forecasting,” said Phil Metzger, a physicist who leads the Granular Mechanics and Regolith Operations Lab, part of the Surface Systems Office at Kennedy.

Lane said the additional piece of information would be useful in filling out the complex computer calculations used to determine the current conditions and forecast the weather.

“We may be able to refine (computer weather) models to make them more accurate,” Lane said. “Weather radar data analysis makes assumptions about raindrop size, so I think this could improve the overall drop size distribution estimates.”

The breakthrough came because Metzger and Lane were looking for a way to calibrate a laser sensor to pick up the fine particles of blowing lunar dust and soil. It turns out that rain is a good stand-in for flying lunar soil.

“I was pretty skeptical in the beginning that the numbers would come out anywhere close,” Lane said. “Anytime you do something new, it’s a risk that you’re just wasting your time.”

The genesis of the research was the need to find out how much damage would be done by robotic landers getting too close to the six places on the moon where Apollo astronauts landed, lived and worked.

NASA fears that dust and soil particles thrown up by the rocket exhaust of a lander will scour and perhaps puncture the metal skin of the lunar module descent stages and experiment hardware left behind by the astronauts from 1969 to 1972.

“It’s like sandblasting, if you have something coming down like a rocket engine, and it lifts up this dust, there’s not air, so it just keeps going fast,” Lane said. “Some of the stuff can actually reach escape velocity and go into orbit.”

Such impacts to those materials could ruin their scientific value to researchers on Earth who want to know what happens to human-made materials left on another world for more than 40 years.

“The Apollo sites have value scientifically and from an engineering perspective because they are a record of how these materials on the moon have interacted with the solar system over 40 years,” Metzger said. “They are witness plates to the environment.”

There also are numerous bags of waste from the astronauts lying up there that biologists want to examine simply to see if living organisms can survive on the moon for almost five decades where there is no air and there is a constant bombardment of cosmic radiation.

“If anybody goes back and sprays stuff on the bags or touches the bags, they ruin the experiment,” Metzger said. “It’s not just the scientific and engineering value. They believe the Apollo sites are the most important archaeological sites in the human sphere, more important than the pyramids because it’s the first place humans stepped off the planet. And from a national point of view, these are symbols of our country and we don’t want them to be damaged by wanton ransacking.”

Current thinking anticipates placing a laser sensor on the bottom of one of the landers taking part in the Google Lunar X Prize competition. The sensor should be able to pick up the blowing dust and soil and give researchers a clear set of results so they can formulate restrictions for other landers, such as how far away from the Apollo sites new landers can touch down.

As research continues into the laser sensor, Lane expects the work to continue on the weather forecasting side of the equation, too. Lane already presented some of his findings at a meteorological conference and is working on a research paper to detail the work. “This is one of those topics that span a lot of areas of science,” Lane said.

Water Resources Management and Policy in a Changing World: Where Do We Go from Here? (Science Daily)

Nov. 26, 2012 — Visualize a dusty place where stream beds are sand and lakes are flats of dried mud. Are we on Mars? In fact, we’re on arid parts of Earth, a planet where water covers some 70 percent of the surface.

How long will water be readily available to nourish life here?

Scientists funded by the National Science Foundation’s (NSF) Dynamics of Coupled Natural and Human Systems (CNH) program are finding new answers.

NSF-supported CNH researchers will address water resources management and policy in a changing world at the fall meeting of the American Geophysical Union (AGU), held in San Francisco from Dec. 3-7, 2012.

In the United States, more than 36 states face water shortages. Other parts of the world are faring no better.

What are the causes? Do the reasons lie in climate change, population growth or still other factors?

Among the topics to be covered at AGU are sociohydrology, patterns in coupled human-water resource systems and the resilience of coupled natural and human systems to global change.

Researchers will report, for example, that human population growth in the Andes outweighs climate change as the culprit in the region’s dwindling water supplies. Does the finding apply in other places, and perhaps around the globe?

Scientists presenting results are affiliated with CHANS-Net, an international network of researchers who study coupled natural and human systems.

NSF’s CNH program supports CHANS-Net, with coordination from the Center for Systems Integration and Sustainability at Michigan State University.

CHANS-Net facilitates communication and collaboration among scientists, engineers and educators striving to find sustainable solutions that benefit the environment while enabling people to thrive.

“For more than a decade, NSF’s CNH program has supported projects that explore the complex ways people and natural systems interact with each other,” says Tom Baerwald, NSF CNH program director.

“CHANS-Net and its investigators represent a broad range of projects. They’re developing a new, better understanding of how our planet works. CHANS-Net researchers are finding practical answers for how people can prosper while maintaining environmental quality.”

CNH and CHANS-Net are part of NSF’s Science, Engineering and Education for Sustainability (SEES) investment. NSF’s Directorates for Geosciences; Social, Behavioral and Economic Sciences; and Biological Sciences support the CNH program.

“CHANS-Net has grown to more than 1,000 members who span generations of natural and social scientists from around the world,” says Jianguo “Jack” Liu, principal investigator of CHANS-Net and Rachel Carson Chair in Sustainability at Michigan State University.

“CHANS-Net is very happy to support another 10 CHANS Fellows–outstanding young scientists–to attend AGU, give presentations there, and learn from leaders in CHANS research and build professional networks. We’re looking forward to these exciting annual CHANS-Net events.”

Speakers at AGU sessions organized by CHANS-Net will discuss such subjects as the importance of water conservation in the 21st century; the Gila River and whether its flows might reduce the risk of water shortages in the Colorado River Basin; and historical evolution of the hydrological functioning of the old Lake Xochimilco in the southern Mexico Basin.

Other topics to be addressed include water conflicts in a changing world; system modeling of the Great Salt Lake in Utah to improve the hydro-ecological performance of diked wetlands; and integrating economics into water resources systems analysis.

“Of all our natural resources, water has become the most precious,” wrote Rachel Carson in 1962 in Silent Spring. “By a strange paradox, most of the Earth’s abundant water is not usable for agriculture, industry, or human consumption because of its heavy load of sea salts, and so most of the world’s population is either experiencing or is threatened with critical shortages.”

Fifty years later, more than 100 scientists will present research reflecting Rachel Carson’s conviction that “seldom if ever does nature operate in closed and separate compartments, and she has not done so in distributing Earth’s water supply.”

Go With the Flow in Flood Prediction (Science Daily)

Dec. 3, 2012 — Floods have once again wreaked havoc across the country and climate scientists and meteorologists suggest that the problem is only going to get worse with wetter winters and rivers bursting their banks becoming the norm. A team based at Newcastle University and their colleagues in China have developed a computer model that can work out how the flood flow will develop and where flooding will be worst based on an understanding of fluid dynamics and the underlying topology of a region.

Writing in the journal Progress in Computational Fluid Dynamics, Newcastle civil engineer Qiuhua Liang and colleagues Chi Zhang of Dalian University of Technology and Junxian Yin of the China Institute of Water Resources and Hydropower Research in Beijing explain how they have developed an adaptive computer model that could provide accurate and efficient predictions about the flow of water as a flood occurs. Such a model might provide environmental agencies and authorities with a more precise early-warning system for residents and businesses in a region at risk of flood. It could also be used by insurance companies to determine the relative risk of different areas within a given region and so make their underwriting of the risk economically viable.

The model is based on a numerical solution to the hydrodynamic equations of fluid flow. This allows the researchers to plot the likely movement of water during a dam break or flash flood over different kinds of terrain and around obstacles even when flood waves are spreading rapidly. The researchers have successfully tested their model on real-world flood data.
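The article does not spell out which hydrodynamic equations are solved; flood models of this kind typically solve the shallow-water (Saint-Venant) equations, shown here in one-dimensional form for reference and not necessarily in the exact formulation used in the paper.

```latex
% 1-D shallow-water (Saint-Venant) equations for water depth h(x,t),
% velocity u(x,t), bed elevation z(x), gravity g and friction slope S_f:
\[
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^{2} + \tfrac{1}{2}gh^{2}\right)
  = -\,gh\,\frac{\partial z}{\partial x} - g h S_f .
\]
% S_f is usually estimated from an empirical relation such as Manning's
% formula.
```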

The team points out that flood disasters have become a major threat to human lives and assets. “Flood management is therefore an important task for different levels of governments and authorities in many countries”, the researchers explain. “The availability of accurate and efficient flood modelling tools is vital to assist engineers and managers charged with flood risk assessment, prevention and alleviation.”

Journal Reference:

  1. Chi Zhang, Qiuhua Liang, Junxian Yin. A first-order adaptive solution to rapidly spreading flood waves. Progress in Computational Fluid Dynamics, An International Journal, 2013; 13 (1): 1 DOI: 10.1504/PCFD.2013.050645

Blame, Responsibility and Demand for Change Following Floods (Science Daily)

Nov. 25, 2012 — New research shows concerns about governmental failure to act effectively and fairly in the aftermath of extreme weather events can affect the degree to which residents are willing to protect themselves.

Published in the journal Nature Climate Change, the findings of a team led by scientists at the University of Exeter could prove key to establishing how society should evolve to cope with more turbulent weather and more frequent mega storms.

The team examined attitudes in Cumbria in north west England and Galway in western Ireland, which were both hit by heavy flooding in November 2009. Record rainfall was recorded in both countries, resulting in a number of deaths, properties being severely damaged and economic disruption.

Professor Neil Adger, of the Geography department at the University of Exeter, who led the research, said: “The flooding of 2009 was devastating to both communities. Our study is the first to track the impacts of floods across two countries and how communities and individuals demand change after such events. When people in both studies felt that government had fallen short of their expectations, we found that the resulting perception of helplessness leads to an unwillingness to take personal action to prevent flooding in future.”

Scientists at the University of Exeter worked with colleagues at the National University of Ireland Maynooth and the Tyndall Centre for Climate Change Research at the University of East Anglia, which also provided funding for the study.

Researchers surveyed 356 residents in both areas eight months after the flooding. They measured perceptions of governments’ performances in dealing with the aftermath, as well as perceptions of fairness in that response and the willingness of individuals to take action.

Dr Irene Lorenzoni of the Tyndall Centre comments: “Residents in Galway were significantly more likely to believe that their property would be flooded again than those in Cumbria. Yet it was Cumbrians who believed they had more personal responsibility to adapt to reduce future incidents.

“Whether people felt responses were fair also diverged. In our survey in Cumbria three quarters of respondents agreed that everyone in their community had received prompt help following the flooding, while in Galway it was less than half.”

Dr Conor Murphy of the National University of Ireland, Maynooth said: “The strong perception in Galway that authorities failed to deliver on the expectations of flooded communities in late 2009 is a wakeup call. Given the high exposure of development in flood prone areas it is clear that both England and Ireland need to make major investments in building flood resilience with changing rainfall patterns induced by climate change. Political demand for those investments will only grow.”

Professor Adger says: “Our research shows that climate change is likely to lead to a series of crises which will cause major disruption as instant short-term solutions are sought. We need to consider the implicit contract between citizens and government agencies when planning for floods, to enable fairer and smoother processes of adaptation.”

Journal Reference:

  1. W. Neil Adger, Tara Quinn, Irene Lorenzoni, Conor Murphy, John Sweeney. Changing social contracts in climate-change adaptation. Nature Climate Change, 2012; DOI: 10.1038/nclimate1751

Dogs Can Accurately Sniff out ‘Superbug’ Infections (Science Daily)

A new study finds that dogs can sniff out Clostridium difficile (the infective agent that is responsible for many of the dreaded “hospital acquired infections”) in stool samples and even in the air surrounding patients in hospital with a very high degree of accuracy. (Credit: © CallallooAlexis / Fotolia)

Dec. 13, 2012 — Dogs can sniff out Clostridium difficile (the infective agent that is responsible for many of the dreaded “hospital acquired infections”) in stool samples and even in the air surrounding patients in hospital with a very high degree of accuracy, finds a study in the Christmas issue published on bmj.com today.

The findings support previous studies of dogs detecting various types of cancer and could have great potential for screening hospital wards to help prevent C. difficile outbreaks, say the researchers.

C. difficile infection most commonly occurs in older people who have recently had a course of antibiotics in hospital, but it can also start in the community, especially in care homes. Symptoms can range from mild diarrhoea to a life-threatening inflammation of the bowel.

Early detection is vital to prevent transmission, but diagnostic tests can be expensive and slow, which can delay treatment for up to a week.

Diarrhoea due to C. difficile has a specific smell, and dogs have a superior sense of smell compared with humans. This prompted researchers in the Netherlands to investigate whether a dog could be trained to detect C. difficile.

A two-year old male beagle (called Cliff) was trained by a professional instructor to identify C. difficile in stool samples and in patients with C. difficile infection. He was taught to indicate the presence of the specific scent by sitting or lying down.

The dog had not been trained for detection purposes before.

After two months of training, the dog’s detection abilities were formally tested on 50 C. difficile-positive and 50 C. difficile-negative stool samples. He correctly identified all 50 positive samples and 47 out of 50 negative samples.

This equates to 100% sensitivity and 94% specificity (sensitivity measures the proportion of positives correctly identified, while specificity measures the proportion of negatives correctly identified).

The dog was then taken onto two hospital wards to test his detection abilities in patients. He correctly identified 25 out of 30 cases (sensitivity 83%) and 265 out of 270 negative controls (specificity 98%).
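The quoted percentages follow directly from the standard definitions; the small script below (mine, not from the paper) reproduces them from the raw counts.

```python
# Quick check of the detection-dog figures quoted above, using the standard
# definitions: sensitivity = TP / (TP + FN), specificity = TN / (TN + FP).
def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    return true_neg / (true_neg + false_pos)

# Stool-sample test: 50/50 positives and 47/50 negatives correctly identified.
print(f"samples:  sensitivity {sensitivity(50, 0):.0%}, "
      f"specificity {specificity(47, 3):.0%}")

# Ward test: 25/30 cases and 265/270 controls correctly identified.
print(f"patients: sensitivity {sensitivity(25, 5):.0%}, "
      f"specificity {specificity(265, 5):.0%}")
```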

The researchers add that the dog was quick and efficient, screening a complete hospital ward for the presence of patients with C. difficile infection in less than 10 minutes.

They point to some study limitations, such as the unpredictability of using an animal as a diagnostic tool and the potential for spreading infections via the dog, and say some unanswered questions remain.

However, they say their study demonstrates that a detection dog can be trained to identify C. difficile infection with a high degree of accuracy, both in stool samples and in hospitalised patients. “This could have great potential for C. difficile infection screening in healthcare facilities and thus contribute to C. difficile infection outbreak control and prevention,” they conclude.

Journal Reference:

  1. M. K. Bomers, M. A. van Agtmael, H. Luik, M. C. van Veen, C. M. J. E. Vandenbroucke-Grauls, Y. M. Smulders. Using a dog’s superior olfactory sensitivity to identify Clostridium difficile in stools and patients: proof of principle study. BMJ, 2012; 345 (dec13 8): e7396 DOI: 10.1136/bmj.e7396

We Are Basically Honest – Except When We Are at Work, Study Suggests (Science Daily)

Dec. 14, 2012 — A new study has revealed we are more honest than you might think. The research by the University of Oxford and the University of Bonn suggests that it pains us to tell lies, particularly when we are in our own homes. It appears that being honest is hugely important to our sense of who we are. However, while it might bother us to tell lies at home, we are less circumspect at work where we are probably more likely to bend the truth, suggests the study.

The researchers conducted simple honesty tests by ringing people in their own homes in Germany and asking them to flip a coin. The study participants were asked over the phone to report on how it landed. The catch to this test was that each of the individuals taking part was given a strong financial incentive to lie without the fear of being found out. The study participants were told that if the coin landed tails up, they would receive 15 euros or a gift voucher; while if the coin landed heads up, they would receive nothing.

Using randomly generated home phone numbers, the researchers contacted 658 people who agreed to take part. Although they could not directly observe the behaviour of the individuals in their own homes, the aggregated reports show a remarkably high level of honesty. Over half of the study participants (55.6 per cent) reported that the coin landed heads-up, which meant they would receive nothing. Only 44.4 per cent reported tails up, collecting their financial reward as a result.

A second, similar test was done over the phone with 94 participants. This time they were asked to report the results of four consecutive coin tosses, with the promise of five euros for every time the coin landed tails up. Despite a potential maximum pay-off of 20 euros, the reports the researchers received matched the distribution expected of a fair coin, which would land tails up around 50 per cent of the time.
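To see what an honest sample should look like in this four-toss test, here is a small illustrative calculation of my own (not from the study): with a fair coin, the number of tails each person reports should follow a binomial distribution.

    # Illustrative only: the benchmark distribution for four fair coin tosses.
    from math import comb

    tosses = 4
    for tails in range(tosses + 1):
        prob = comb(tosses, tails) * 0.5 ** tosses   # P(exactly this many tails)
        payoff = 5 * tails                           # 5 euros per reported tail
        print(f"{tails} tails: probability {prob:.4f}, payoff {payoff} euros")

    # Honest respondents should report four tails (the 20-euro maximum) only
    # about 6% of the time; widespread lying would push reports well above that.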

All those taking part in the experiments answered questions about their own gender, age, views on honesty and their religious background. The study suggests, however, that personal attributes play no part here as the overall level of honesty demonstrated in both experiments was high.

This latest study can be compared with previous similar studies, which were conducted with students in tightly controlled laboratory situations. In those studies around 75 per cent of participants reported tails-up, which the researchers suggest could indicate that people are more honest when they are in their own homes.

Dr Johannes Abeler, from the Department of Economics at the University of Oxford, said: ‘The fact that the financial incentive to lie was outweighed by the perceived cost of lying shows just how honest most people are when they are in their own homes. One theory is that being honest is at the very core of how we want to perceive ourselves and is very important to our sense of self-identity. Why is it so important? It may be to do with the social norms we have been given about what is right and wrong from the moment we could walk and talk.

‘This study has implications for policy-makers. For instance, if they want to catch those involved in fraudulent behaviour, perhaps the forms and questionnaires could be designed to reveal more about our personal lives and sense of self-identity. Our experiments showed that if people plainly see that to lie in a given situation would be fraudulent, they shy away from it. However, if people are given “wriggle room,” they can convince themselves that their behaviour is not fraudulent and this does not attack their sense of who they are.’

The computer-assisted telephone interviews were carried out by the Institute for Applied Social Sciences (infas), a private, well-known German research institute. They were conducted between November 2010 and February 2011. Telephone numbers were selected using a random digit dialling technique with numbers randomly based on a data set of all potential landline telephone numbers in Germany. Part of the study consisted of questions relating to the participants’ social background, age and education, their economic and political preferences, their religious beliefs, their attitudes to crime, and their beliefs about other people’s behaviour in the experiment.

Schizophrenia Linked to Social Inequality (Science Daily)

Dec. 14, 2012 — Higher rates of schizophrenia in urban areas can be attributed to increased deprivation, increased population density and an increase in inequality within a neighbourhood, new research reveals. The research, led by the University of Cambridge in collaboration with Queen Mary University of London, was published today in the journal Schizophrenia Bulletin.

Dr James Kirkbride, lead author of the study from the University of Cambridge, said: “Although we already know that schizophrenia tends to be elevated in more urban communities, it was unclear why. Our research suggests that more densely populated, more deprived and less equal communities experience higher rates of schizophrenia and other similar disorders. This is important because other research has shown that many health and social outcomes also tend to be optimal when societies are more equal.”

The scientists used data from a large population-based incidence study (the East London first-episode psychosis study directed by Professor Jeremy Coid at the East London NHS Foundation Trust and Queen Mary, University of London) conducted in three neighbouring inner city, ethnically diverse boroughs in East London: City & Hackney, Newham, and Tower Hamlets.

A total of 427 people aged 18-64 were included in the study, all of whom experienced a first episode of psychotic disorder in East London between 1996 and 2000. The researchers assessed their social environment through measures of the neighbourhood in which they lived at the time they first presented to mental health services because of a psychotic disorder. Using the 2001 census, they estimated the population aged 18-64 in each neighbourhood, and then compared the incidence rate between neighbourhoods.

The incidence of schizophrenia (and other similar disorders where hallucinations and delusions are the dominant feature) still showed variation between neighbourhoods after taking into account age, sex, ethnicity and social class. Three environmental factors predicted risk of schizophrenia — increased deprivation (which includes employment, income, education and crime), increased population density, and an increase in inequality (the gap between the rich and poor).

Results from the study suggested that a percentage point increase in either neighbourhood inequality or deprivation was associated with an increase in the incidence of schizophrenia and other similar disorders of around 4%.
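To make the size of that effect concrete, here is a rough illustration of my own, treating the 4% figure as a multiplicative effect on the incidence rate (an assumption about how to read the summary, not something stated in the paper itself):

    # Rough illustration only; assumes the ~4% figure acts multiplicatively.
    effect_per_point = 1.04   # ~4% higher incidence per percentage-point increase

    for gap in (1, 5, 10):
        ratio = effect_per_point ** gap
        print(f"{gap}-point difference in deprivation or inequality -> "
              f"incidence about {ratio:.2f} times higher")
    # 1 point -> 1.04x, 5 points -> ~1.22x, 10 points -> ~1.48x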

Dr Kirkbride added: “Our research adds to a wider and growing body of evidence that inequality seems to be important in affecting many health outcomes, now possibly including serious mental illness. Our data seems to suggest that both absolute and relative levels of deprivation predict the incidence of schizophrenia.

“East London has changed substantially over recent years, not least because of the Olympic regeneration. It would be interesting to repeat this work in the region to see if the same patterns were found.”

The study also found that risk of schizophrenia in some migrant groups might depend on the ethnic composition of their neighbourhood. For black African people, the study found that rates tended to be lower in neighbourhoods where there were a greater proportion of other people of the same background. By contrast, rates of schizophrenia were lower for the black Caribbean group when they lived in more ethnically-integrated neighbourhoods. These findings support the possibility that the socio-cultural composition of our environment could positively or negatively influence risk of schizophrenia and other similar disorders.

Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust said: “This research reminds us that we must understand the complex societal factors as well as the neural mechanisms that underpin the onset of mental illness, if we are to develop appropriate interventions.”

Journal Reference:

  1. J. B. Kirkbride, P. B. Jones, S. Ullrich, J. W. Coid. Social Deprivation, Inequality, and the Neighborhood-Level Incidence of Psychotic Syndromes in East London. Schizophrenia Bulletin, 2012; DOI: 10.1093/schbul/sbs151

The Opportunistic Apocalypse (Savage Minds)

by Clare A. Sammells on December 14th, 2012

The third in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first two posts are here and here.

There are opportunities in the apocalypse.  The end of the world has been commodified.  A few are seriously investing in bunkers, boats, and survival supplies. Tourism is up, not only to Mayan archaeological sites, but also to places like Bugarach, France and Mt. Rtanj, Serbia.  But even those of us on a budget can afford at least a book, a T-shirt or a handbag.

There are opportunities here for academics, too. Many scholars have been quoted in the press lately saying that nothing will happen on Dec 21, in addition to those who have written comprehensive books and articles discrediting the impending doom. Obviously publishing helps individual careers, and that does not detract from our collective responsibility to debunk ideas that might lead people to physical or financial harm.  But neither can we divorce our work from its larger social implications.

It is telling that the main scholarly players in debunking the Mayan Apocalypse in the U.S. are NASA (which is facing budget cuts) and anthropologists.  Both groups feel the need to prove they are relevant because our collective jobs depend on it. I don’t need to go into great detail with this crowd about academia’s current situation. Academia has gone from being a well-respected, stable job to one where most classes are taught by underpaid, uninsured part-time adjuncts, and many Ph.D.s never find work in academia at all. Tuition fees for undergraduates have skyrocketed while full-time faculty salaries have stagnated.

Among the public (too often talked about as being in “the real world,” as if academics were somehow immune to taxes or swine flu), there seems to be a general distrust of intellectuals. That, combined with the current economic situation, has translated into a loss of research funding, such as cuts to the Fulbright program and NSF. Some public officials specifically state that science and engineering are worth funding, but anthropology is not.  To add insult to injury, the University of California wants to move away from that whole “reading” thing and rebrand itself as a web startup.

Articles, books with general readership, being quoted in the newspaper, and yes, blogging are all concrete ways to show funding agencies and review committees that what we do matters. The way to get exposure among those general audiences is to engage with what interests them — like the end of the world.  Dec. 21, 2012 has become an internet meme. Many online references to it are debunkings or tongue-in-cheek. Newspaper articles on unrelated topics make passing references in jest, stores offer just-in-case-it’s-real sales, people are planning parties.  There seems to be more written to discredit the apocalypse, or make fun of it, than to prepare for it.

We need to remember that this non-believer attention has a purpose, and that purpose is not just (or even primarily) about convincing believers that nothing is going to happen. Rather, it serves to demonstrate something about non-believers themselves.  “We” are sensible and logical, while “they” are superstitious and credulous. “We” value science and data, while “they” turn to astrology, misreadings of ancient texts, and esoteric spirituality.  “We” remember the non-apocalypses of the past, while “they” have forgotten.

I would argue that discrediting the Mayan Apocalypse is part of an ongoing process of creating western modernity (cue Latour). That modernity requires an “other,” and here that “other” is defined primarily by religious/spiritual belief in the Mayan apocalypse.  The more “other” these Apocalypse believers are, the more clearly they reflect the modernity of non-believers.  (Of course, there are also the “others” of the Maya themselves, and I’ll address that issue in my next post.)

This returns us to the difference I drew in my first post between “Transitional Apocalyptic Expectations” (TAE) and “Catastrophic Apocalyptic Expectations” (CAE).  I suspect the majority of believers are expecting something like a TAE-type event, but media attention focuses on discrediting CAE beliefs, such as a rogue planet hitting the Earth or massive floods. These would be dire catastrophes, but they are also far easier to disprove. We will all notice if a planet does or does not hit the Earth next week, but many of us — myself included — will miss a transformation in human consciousness among the enlightened.

By providing the (very real) scientific data to discredit the apocalypse, scholars are incorporated into this project of modernity.  Much of the scholarly work on this phenomenon is fascinating and subtle, but the press picks up on two main themes.  One is scientific proof that the apocalypse will not happen, such as astronomical data that Earth is not on a collision course with another planet, Mayan epigraphy that shows the Long Count does not really end, and ethnography that suggests most Maya themselves are not worried about any of this.  The other scholarly theme the press circulates is the long history of apocalyptic beliefs in the west.  In the logic of the metanarrative of western progress, this connects contemporary Apocalypse believers to the past, nonmodernity and “otherness.”

I now find myself in an uncomfortable position, although it is an intellectually interesting corner to be backed into. I agree with my colleagues that the world will not end, that Mayan ideas have been misappropriated, and that we have a responsibility to address public concerns.  At the same time, I can’t help but feel we are being drawn, either reluctantly or willingly, into a larger project that extends far beyond next week.

*   *   *

2012, the movie we love to hate

by Clare A. Sammells on December 11th, 2012

The second in a guest series about the “Mayan Apocalypse” predicted for Dec. 21, 2012.  The first post is here.

Last summer, I traveled to Philadelphia to visit the Penn Museum exhibit “Maya: the Lords of Time.” It was, as one might expect given the museum collection and the scholars involved, fantastic.  I want to comment on just the beginning of the exhibit, however. On entering, one is immediately greeted by a wall crowded with TV screens, all showing different clips of predicted disasters and people talking fearfully about the end of the world. The destruction, paranoia, and cacophony create an ambiance of chaos and uncertainty. Turning the corner, these images are replaced by widely spaced Mayan artifacts and stelae. The effect is striking.  One moves from media-induced insanity to serenity, from endless disturbing jump-cuts to the well-lit, quiet contemplation of beautiful art.

Among these images were scenes from Director Roland Emmerich’s blockbuster film 2012 (2009). This over-the-top disaster film is well used in that context.  Still, it is interesting how often 2012 is mentioned by academics and other debunkers — almost as often as they mention serious alternative thinkers about the Mayan calendar, such as Jose Arguelles (although the film receives less in-depth coverage than he does).

I find this interesting because 2012 is clearly not trying to convince us to stockpile canned goods or build boats to prepare for the end of the Maya Long Count, any more than Emmerich’s previous films were meant to prepare us for alien invasion (Independence Day, 1996) or the effects of global climate change (The Day After Tomorrow, 2004).  Like Emmerich’s previous films, 2012 is a chance to watch the urban industrialized world burn (in that way, it has much in common with the currently popular zombie film genre). If you want to see John Cusack survive increasingly implausible crumbling urban landscapes, this film is for you.

The Maya, however, are barely mentioned in 2012. There are no Mayan characters, no one travels to Mesoamerica, there is no mention of the Long Count.  Emmerich’s goal for 2012 was, in his own words (here and here), “a modern retelling of Noah’s Ark.” In fact, he claims that the movie originally had nothing to do with the 2012 phenomenon at all.  Instead, he was convinced – reluctantly – to include the concept because of public interest in the Maya calendar.

This explains why the Maya only receive two passing mentions in 2012 — one is a brief comment that even “they” had been able to predict the end of the world, the other a short news report on a cult suicide in Tikal. The marketing aspect of the film emphasized these Maya themes (all of the film footage about the Maya is in the trailer, the movie website starts with a rotating image of the Maya calendar, and there are related extras on the DVD), but the movie itself had basically nothing to do with the Maya, the Mayan Long Count, or Dec 21.

Nevertheless, this film’s impact on public interest in Dec 21 is measurable.  Google Trends, which gives data on the number of times particular search terms are used, gives us a sense of the impact of this $200,000,000 film. I looked at a number of related terms, but have picked the ones that show the general pattern: There is a spike of interest in 2012 apocalyptic ideas when the 2012 marketing campaign starts (November 2008), a huge spike when the film is released (November 2009), and a higher baseline of interest from then until now. Since January, interest in the Mayan calendar/apocalypse has been steadily climbing (and in fact, is higher every time I check this link; it automatically updates). In other words, the 2012 movie both responded to, and reinforced, public interest in the 2012 phenomenon.
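For anyone wanting to reproduce this kind of comparison today, a rough sketch (not from the original post) using the unofficial pytrends package might look like the following; the search terms are illustrative guesses, since the post does not list the ones actually used.

    # Sketch only: pulling Google Trends interest over time with the unofficial
    # pytrends package. The keywords are illustrative guesses, not the post's terms.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["mayan calendar", "2012 apocalypse"],
                           timeframe="2008-01-01 2012-12-31")
    interest = pytrends.interest_over_time()   # pandas DataFrame indexed by date
    print(interest.head())                     # spikes should appear in late 2008 and late 2009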

Here I return to Michael D. Gordin’s The Pseudoscience Wars (2012).  This delightful book deals with the scientific response to Velikovsky, who believed that the miracles of the Old Testament and other ancient myths documented the emergence of a comet from Jupiter, its traumatic interactions with Earth, and its eventual settling into the role of the planet Venus. (The final chapter also discusses the 2012 situation.)  Gordin’s main focus is understanding why Velikovsky — unlike others labeled “crackpots” before him — stirred the public ire of astronomers and physicists. Academics’ real concern was not Velikovsky’s ideas per se, but how much attention he received by being published by Macmillan — a major publisher of science textbooks — which implied the book had scientific legitimacy. Velikovsky’s “Worlds in Collision” was a major bestseller when it was released in 1950, and academics felt the ideas had to be addressed so that the public would not be misled.

With the Mayan Apocalypse, no major academic publisher is lending legitimacy to these theories.  Books about expected events of 2012 (mainly TAE ideas) are published by specialty presses that focus on the spiritual counterculture, such as Evolver Editions, Inner Traditions/Bear & Company, Shambhala, and John Hunt Publishing.  Instead, film media has become the battleground for public attention (perhaps because reading is declining?). The immense amount of money put into movies, documentaries, and TV shows about the Mayan Apocalypse is creating public interest today, and in some ways this parallels what Macmillan did for Velikovsky in the 1950s.

One example of this is the viral marketing campaign for 2012 conducted in November 2008.  Columbia Pictures created webpages that were not clearly marked as advertising (these no longer appear to be available), promoting the idea that scientists really did know the world would end and were preparing.  This type of advertising was not unique to this film, but in this case it reinforced already existing fears that the end really was nigh.  NASA began responding to public fears about 2012 as a result of this marketing campaign, and many of the academics interested in addressing these concerns also published after this time.

Academics are caught in something of a bind here.  Do we respond to public fears, in the hopes of debunking them, but no doubt also increasing the public interest in the very ideas we wish to discredit?  Should we respond in the hopes of selling a few more books or receiving a few more citations, thus generating interest in the rest of what our discipline does?  As anthropologists we are not immune to the desires of public interest, certainly (obviously I’m not — here I am, blogging away), nor should we be.  Perhaps something good can come of the non-end-of-the-world.  I’ll turn to this question next time.

*   *   *

The End is Nigh. Start blogging.

by Clare A. Sammells on December 4th, 2012

Savage Minds welcomes guest blogger Clare A. Sammells.

My thanks to the editors of Savage Minds for allowing me to guest blog this month. Hopefully I will not be among the last of Savage Minds’ guests, given that the End of the World is nigh.

You hadn’t heard? On or around Dec 21, 2012, the Maya Long Count will mark the end of a 5125-year cycle. Will this be a mere calendrical turn, no more inherently eventful than the transition from Dec 31, 2012 to Jan 1, 2013? Will this be a moment of astronomical alignments, fiery conflagrations, and social upheavals? Or will there be a shift in human consciousness, an opportunity for the prepared to improve their lives and achieve enlightenment?

I am going to bet with the house: I do not think the world is going to end in a few weeks.  That way, either the world doesn’t end — another victory for predictive anthropology! — or the world does end, and nothing I write here will matter much anyway. (More seriously, I don’t think our world is destined to end with a bang).

I am not a Mayanist, an archaeologist, or an astronomer. I won’t be discussing conflicting interpretations of Maya long count dates, astronomical observations, or Classical-era Maya stela inscriptions. Books by David Stuart, Anthony Aveni, and Matthew Restall and Amara Solari all provide detailed arguments using those data, and analyze the current phenomenon in light of the long history of western fascinations with End Times.  Articles by John Hoopes, Kevin Whitesides, and Robert Sitler, among others, address “New Age” interpretations of the Maya.  Many ethnographers have considered how Maya peoples understand their complex interactions with “New Age” spiritualists and tourists, among them Judith Maxwell, Quetzil Casteneda, and Walter Little.

My own interest lies in how indigenous timekeeping is interpreted in the Andes. I conducted ethnographic research focusing on tourism in Tiwanaku, Bolivia — a pre-Incan archaeological site near Lake Titicaca, and a contemporary Aymara village.  One of the first things I noticed was that every tour guide tells visitors about multiple calendars inscribed in the stones of the site, most famously in the Puerta del Sol.  These calendrical interpretations are meaningful to Bolivian visitors, foreign tourists, and local Tiwanakenos for understanding the histories, ethnicities, and politics centered in this place. I took a stab at addressing some of these ideas in a recent article, where I considered how interconnected archaeological theories and political projects of the 1930s fed into what is today accepted conventional knowledge about Tiwanakota calendars.  I’m now putting together a book manuscript about temporal intersections in Tiwanaku.  The parallels between that situation and the Maya 2012 phenomenon led me to consider the prophecies, expectations, YouTube videos, blog posts, scholarly debunkings, and tourist travels motivated by the end of the Maya Long Count.

A survey by the National Geographic Channel suggested that 27% of those in the United States think the Maya may have predicted a catastrophe for December 21.  But it is important to note that there is no agreement, even among believers, about what will happen. I tend to think of these beliefs as collecting into two broad (and often overlapping) camps.

Many believe that “something” will happen on (or around) Dec 21, 2012, but do not anticipate world destruction. I think of these beliefs as “Transitional Apocalyptic Expectations” (TAE). Writers such as José Argüelles and John Major Jenkins, for example, believe that there will be a shift in human consciousness, and tend to view the end of the 13th baktun as an opportunity for human improvement.

On the other hand, there are those who believe that the world will end abruptly, in fire, flood, cosmic radiation, or collision with other planets. I think of these beliefs as “Catastrophic Apocalyptic Expectations” (CAE).  While some share my belief that the number of serious CAE-ers is small, there are panics and survivalists reported by the press in Russia, France, and Los Angeles.  Tragically, there has been at least one suicide.  And of course, there has been a major Hollywood movie (“2012”), which I’ll be discussing more in my next post.

As anthropologists, we certainly should respond to public fears.  But we should also wonder why this fear, out of so many possible fears, is the one to capture public imagination.  Beliefs in paranormal activities, astrology, and the like are historically common, although the specifics change over time.  Michael D. Gordin’s excellent book The Pseudoscience Wars (2012) convincingly suggests that there are larger societal reasons why some fringe theories attract scholarly and public attention while others go ignored.  The Mayan Apocalypse has certainly attracted massive attention, from scholarly rebuttals by anthropologists, NASA, and others, to numerous popular parodies such as GQ’s survival tips, LOLcats, and my personal favorite, an advertisement for Mystic Mayan Power Cloaks.

There seems to be a general fascination with the Mayan calendar — even among those who know relatively little about the peoples that label refers to.  Some are anxiously watching the calendar count down, others are trying to reassure them, and many more are simply watching, cracking jokes, or even selling supplies.  But there is something interesting about the fact that so many in the United States and Europe are talking about it at all.  I look forward to exploring these questions further with all of you.

Clare A. Sammells is Assistant Professor of Anthropology at Bucknell University. She is currently living in Madrid, where she is writing about concepts of time in Tiwanaku and conducting ethnographic research on food among Bolivian migrants.  She is not stockpiling canned goods.

Moral Injuries and the Environment: Healing the Soul Wounds of the Body Politic (Science & Environmental Health Network)

By Carolyn Raffensperger – December 6th, 2012

I have a hypothesis about the lack of public support for environmental action. I suspect that many people suffer from a sense of moral failure over environmental matters. They know that we are in deep trouble, that their actions are part of it, but there is so little they or anyone can do individually. Anne Karpf writing about climate change in the Guardian said this: “I now recycle everything possible, drive a hybrid car and turn down the heating. Yet somewhere in my marrow I know that this is just a vain attempt to exculpate myself – it wasn’t me, guv.”

To fully acknowledge our complicity in the problem but to be unable to act at the scale of the problem creates cognitive dissonance. Renee Aron Lertzman describes this as “environmental melancholia”, a form of hopelessness.  It is not apathy.  It is sorrow. The moral failure and the inability to act lead to what some now identify in other spheres as a moral injury, which is at the root of some post-traumatic stress disorders or ptsd.

The US military has been investigating the causes of soldiers’ ptsd because early interpretations of it as fear-based didn’t match what psychologists were hearing from the soldiers themselves. What psychologists heard wasn’t fear, but sorrow and loss. Soldiers suffering from ptsd expressed enormous grief over things like killing children and civilians or over not being able to save a fellow soldier. They discovered that at the core of much of ptsd was a moral injury, which author Ed Tick calls a soul wound.

According to the U.S. Department of Veterans Affairs, “[e]vents are considered morally injurious if they ‘transgress deeply held moral beliefs and expectations’. Thus, the key precondition for moral injury is an act of transgression, which shatters moral and ethical expectations that are rooted in religious or spiritual beliefs, or culture-based, organizational, and group-based rules about fairness, the value of life, and so forth.”

The moral injury stemming from our participation in destruction of the planet has two dimensions: knowledge of our role and an inability to act. We know that we are causing irreparable damage. We are both individually and collectively responsible. But we are individually unable to make systemic changes that actually matter. The moral injury isn’t so much a matter of the individual psyche, but a matter of the body politic. Our culture lacks the mechanisms for taking account of collective moral injuries and then finding the vision and creativity to address them.  The difference between a soldier’s moral injury and our environmental moral injuries is that environmental soul wounds aren’t a shattering of moral expectations but a steady, grinding erosion, a slow-motion relentless sorrow.

My environmental lawyer friend Bob Gough says that he suffers from pre-traumatic stress disorder. Pre-traumatic stress disorder is shorthand for the fact that he is fully aware of the future trauma, the moral injury that we individually and collectively suffer, the effects on the Earth of that injury and our inability to act in time.  Essentially pre-traumatic stress disorder, the environmentalist’s malady, is a result of our inability to prevent harm.

James Hillman once wrote a book with Michael Ventura called “We’ve Had a Hundred Years of Psychotherapy and the World’s Getting Worse.” In it Hillman said that for years people would go into a therapist and say “the traffic in L.A. is making me crazy” and the therapist would say “let’s deal with your mother issues.” Hillman said “deal with the traffic in L.A.”

So much of environmental or health messaging speaks to us as individuals.  “Stop smoking, get more exercise, change your light bulbs.”  We take on the individual responsibility for the moral failure.  Sure, we need to do all that we can as individuals–that is part of preventing any further damage to the planet or our own souls.  But that isn’t enough.  We all know it.  We have to overcome our assumption that the problem is our mother issues (or the equivalent) and deal with the traffic in L.A., climate change, the loss of the pollinators.  These are not things we can address individually.  We have to do them together.

Healing the moral injury we suffer individually and collectively from our participation in destruction of the planet will require strong intervention in all spheres of life. Actions like creating a cabinet level office of the guardian of future generations or 350.org’s campaign for colleges to divest of oil stocks, or revamping public transportation are beginning steps. Can we think of a hundred more bold moves to make reparations and give future generations a sporting chance? Our moral health, our sanity—and our survival—depend on it.

Monbiot: The Gift of Death (The Guardian)

December 10, 2012

Pathological consumption has become so normalised that we scarcely notice it.

By George Monbiot, published in the Guardian 11th December 2012

There’s nothing they need, nothing they don’t own already, nothing they even want. So you buy them a solar-powered waving queen; a belly button brush; a silver-plated ice cream tub holder; a “hilarious” inflatable zimmer frame; a confection of plastic and electronics called Terry the Swearing Turtle; or – and somehow I find this significant – a Scratch Off World wall map.

They seem amusing on the first day of Christmas, daft on the second, embarrassing on the third. By the twelfth they’re in landfill. For thirty seconds of dubious entertainment, or a hedonic stimulus that lasts no longer than a nicotine hit, we commission the use of materials whose impacts will ramify for generations.

Researching her film The Story of Stuff, Annie Leonard discovered that of the materials flowing through the consumer economy, only 1% remain in use six months after sale(1). Even the goods we might have expected to hold onto are soon condemned to destruction through either planned obsolescence (breaking quickly) or perceived obsolescence (becoming unfashionable).

But many of the products we buy, especially for Christmas, cannot become obsolescent. The term implies a loss of utility, but they had no utility in the first place. An electronic drum-machine t-shirt; a Darth Vader talking piggy bank; an ear-shaped i-phone case; an individual beer can chiller; an electronic wine breather; a sonic screwdriver remote control; bacon toothpaste; a dancing dog: no one is expected to use them, or even look at them, after Christmas Day. They are designed to elicit thanks, perhaps a snigger or two, and then be thrown away.

The fatuity of the products is matched by the profundity of the impacts. Rare materials, complex electronics, the energy needed for manufacture and transport are extracted and refined and combined into compounds of utter pointlessness. When you take account of the fossil fuels whose use we commission in other countries, manufacturing and consumption are responsible for more than half of our carbon dioxide production(2). We are screwing the planet to make solar-powered bath thermometers and desktop crazy golfers.

People in eastern Congo are massacred to facilitate smart phone upgrades of ever diminishing marginal utility(3). Forests are felled to make “personalised heart-shaped wooden cheese board sets”. Rivers are poisoned to manufacture talking fish. This is pathological consumption: a world-consuming epidemic of collective madness, rendered so normal by advertising and the media that we scarcely notice what has happened to us.

In 2007, the journalist Adam Welz records, 13 rhinos were killed by poachers in South Africa. This year, so far, 585 have been shot(4). No one is entirely sure why. But one answer is that very rich people in Vietnam are now sprinkling ground rhino horn on their food or snorting it like cocaine to display their wealth. It’s grotesque, but it scarcely differs from what almost everyone in industrialised nations is doing: trashing the living world through pointless consumption.

This boom has not happened by accident. Our lives have been corralled and shaped in order to encourage it. World trade rules force countries to participate in the festival of junk. Governments cut taxes, deregulate business, manipulate interest rates to stimulate spending. But seldom do the engineers of these policies stop and ask “spending on what?”. When every conceivable want and need has been met (among those who have disposable money), growth depends on selling the utterly useless. The solemnity of the state, its might and majesty, are harnessed to the task of delivering Terry the Swearing Turtle to our doors.

Grown men and women devote their lives to manufacturing and marketing this rubbish, and dissing the idea of living without it. “I always knit my gifts”, says a woman in a television ad for an electronics outlet. “Well you shouldn’t,” replies the narrator(5). An advertisement for Google’s latest tablet shows a father and son camping in the woods. Their enjoyment depends on the Nexus 7’s special features(6). The best things in life are free, but we’ve found a way of selling them to you.

The growth of inequality that has accompanied the consumer boom ensures that the rising economic tide no longer lifts all boats. In the US in 2010 a remarkable 93% of the growth in incomes accrued to the top 1% of the population(7). The old excuse, that we must trash the planet to help the poor, simply does not wash. For a few decades of extra enrichment for those who already possess more money than they know how to spend, the prospects of everyone else who will live on this earth are diminished.

So effectively have governments, the media and advertisers associated consumption with prosperity and happiness that to say these things is to expose yourself to opprobrium and ridicule. Witness last week’s Moral Maze programme, in which most of the panel lined up to decry the idea of consuming less, and to associate it, somehow, with authoritarianism(8). When the world goes mad, those who resist are denounced as lunatics.

Bake them a cake, write them a poem, give them a kiss, tell them a joke, but for god’s sake stop trashing the planet to tell someone you care. All it shows is that you don’t.

http://www.monbiot.com

1. http://www.storyofstuff.org/movies-all/story-of-stuff/

2. It’s 57%. See http://www.monbiot.com/2010/05/05/carbon-graveyard/

3. See the film Blood in the Mobile. http://bloodinthemobile.org/

4. http://e360.yale.edu/feature/the_dirty_war_against_africas_remaining_rhinos/2595/

5. http://www.youtube.com/watch?v=i7VE2wlDkr8&list=UU25QbTq58EYBGf2_PDTqzFQ&index=9

6. http://www.ubergizmo.com/2012/07/commercial-for-googles-nexus-7-tablet-revealed/

7. Emmanuel Saez, 2nd March 2012. Striking it Richer: the Evolution of Top Incomes in the United States (Updated with 2009 and 2010 estimates). http://elsa.berkeley.edu/~saez/saez-UStopincomes-2010.pdf

8. http://www.bbc.co.uk/programmes/b01p424r

A luta pela sobrevivência dos Guaranis-Kaiowás (BBC Brasil)

Mônica Vasconcelos – Da BBC Brasil em Londres

12 de dezembro, 2012 – 09:58 (Brasília) 11:58 GMT

Uma série de fotos feitas pela fotógrafa paulistana Rosa Gauditano mostra a luta pela sobrevivência de índios Guarani-Kaiowá na beira das estradas de Mato Grosso do Sul.

Há hoje mais de 30 acampamentos indígenas nas rodovias do Estado, habitados, em grande parte, por Kaiowás.

“Fazem isso por desespero, mas também como uma forma de protesto”, disse a fotógrafa.

“Eu fotografo povos indígenas há 20 anos e nunca havia visto situação de penúria tão grande. O que está acontecendo no Brasil é um genocídio silencioso”.

“Em algum momento, os índios, os fazendeiros, o governo e a sociedade brasileira como um todo terão de chegar a um consenso e resolver a situação desse povo. São 43 mil pessoas que precisam de sua terra para viver com dignidade”.

“E se a solução é indenizar os fazendeiros que geram riqueza para o Brasil e que adquiriram a terra por meios legais, que seja”.

Nas imagens, feitas ao longo dos últimos três anos, o povo da segunda maior etnia indígena brasileira também é visto acampado provisoriamente em fazendas onde há disputa pela propriedade da terra ou vivendo em reservas demarcadas – às vezes, à custa de sangue derramado.

Gauditano começou a fotografar povos indígenas no Brasil em 1991. Desde então, vem documentando a cultura de diversas etnias indígenas, publicando livros e realizando exposições sobre o tema, no Brasil e no exterior (ela expôs seu trabalho no centro cultural South Bank, em Londres, Grã-Bretanha, em 2010).

Ao lado de representantes da etnia Xavante, Gauditano é também co-fundadora da ONG Nossa Tribo, que tenta ampliar a comunicação entre povos indígenas e o resto da população.

Suicídios

Segundo dados do último censo, há hoje 896,9 mil índios no Brasil. Os cerca de 43 mil Kaiowás são naturais da região onde hoje ficam o Estado de Mato Grosso do Sul e o Paraguai.

Em outubro, o caso de uma comunidade dessa tribo, acampada na fazenda Cambará, no município de Iguatemi, MS, causou comoção no Brasil.

Após uma ordem de despejo emitida pela Justiça Federal, os 170 índios do acampamento, em um local conhecido como Pyelito Kue, escreveram uma carta que dizia: “Pedimos ao Governo e à Justiça Federal para não decretar a ordem de despejo/expulsão, mas solicitamos para decretar a nossa morte coletiva e para enterrar nós todos aqui”.

A carta, divulgada pelo Conselho Indigenista Missionário (Cimi), foi interpretada como uma ameaça de suicídio coletivo. Ela circulou pelas redes sociais e deu origem a uma grande campanha em defesa dos índios, com protestos em vários pontos do país.

Como resultado, um tribunal decidiu pela permanência dos índios no local. Mas a situação do grupo ainda não está regularizada.

Em entrevista à BBC Brasil, Rosa Gauditano explicou por que a carta da comunidade de Pyelito Kue foi interpretada como uma ameaça de suicídio.

“Isso foi mal interpretado, por causa do histórico de mortes por suicídio entre os Kaiowás”, explicou. “Não disseram que iam fazer um suicídio coletivo. A intenção era dizer ao governo federal que dali só sairiam mortos”.

O índice de suicídios entre os Kaiowás começou a crescer a partir da década de 80, quando mais e mais fazendeiros passaram a adquirir terras na região do Mato Grosso do Sul, ou receberam concessões de terras do governo. Desde então, a região se dedica à produção intensiva de soja, milho, cana de açúcar e gado.

Removidos da terra, os Guaranis-Kaiowás – que ocupavam tradicionalmente a vasta região – começaram a ser levados para reservas demarcadas pelas autoridades.

“Essas reservas hoje têm uma população muito grande e as pessoas não conseguem viver ali do modo tradicional, não conseguem plantar ou caçar”, disse a fotógrafa.

Segundo o antropólogo do Centro de Estudos Ameríndios da Universidade de São Paulo Spensy Pimentel, que estuda a etnia Guarani-Kaiowá e trabalha com Gauditano, há 42 mil hectares de terras demarcadas pelo governo no Estado. “Essas são as áreas efetivamente disponíveis”, disse Pimentel à BBC Brasil. “Há mais uns 50 mil hectares demarcados, mas tudo embargado pela Justiça”.

À primeira vista, o território disponível parece grande. Mas se fosse dividido entre a população Kaiowá, cada índio receberia pouco menos do que um hectare de terra – 10.000 m2 (100m x 100m). Ali, ele teria de viver e dali tirar seu sustento – algo impossível para qualquer agricultor.

Pimentel lembrou, no entanto, que esse tipo de cálculo usa critérios que não se aplicam à cultura indígena. “A Constituição brasileira assegura aos índios o direito às suas terras tradicionalmente ocupadas segundo seus próprios critérios”.

Espremidos em reservas superpovoadas, os índios vivem sob estresse físico e mental. O alcoolismo e o uso de drogas são comuns.

Segundo o Ministério da Saúde, de 2000 a 2011 houve 555 suicídios de índios, a maioria Guaranis-Kaiowás. E o Estado de Mato Grosso do Sul é o campeão em número de suicídios no Brasil.

Esse comportamento não é parte da “tradição” da etnia, explicou o antropólogo.

“Os indígenas mais velhos são unânimes em afirmar que, por mais que possam entender a decisão de uma pessoa que toma essa opção, eles não viram mais que um ou dois casos de suicídios antes dos anos 80”, disse. “Nesse sentido, os suicídios não podem ser vistos fora do contexto do confinamento dos Guarani-Kaiowá que foi produzido pelo Estado brasileiro. Foi dentro das reservas superlotadas e diante da falta de perspectiva de vida para os jovens que os suicídios se transformaram em uma epidemia”.

Beira de Estrada

Outra resposta para essa situação de estresse intolerável – explicou Gauditano – foi abandonar as reservas e ir para a estrada.

Fotógrafa experiente, Gauditano se confessou chocada ao se deparar com os acampamentos nas estradas que cercam a cidade de Dourados, um dos polos econômicos do Estado de Mato Grosso do Sul.

“As famílias vão para as estradas, fazem acampamentos em um espaço de 30 m que fica entre a cerca da fazenda e a beira da estrada. Ficam vivendo ali durante anos. Às vezes, se mudam de um ponto para outro se são pressionados. Não têm água potável nem banheiro, não podem fazer uma roça, não têm comida, escola, nada. E fazem as casas com plástico preto. A temperatura dentro dessas cabanas chega a 50 graus durante o dia, não dá pra ficar ali dentro”.

“Crianças, velhos, famílias inteiras ficam acampadas na beira da estrada. É um desespero. E há muitos atropelamentos, porque aquilo é um corredor de auto-estrada, onde passam ônibus, caminhões, carros”.

Uma das fotos mostra a situação dentro de uma cabana à beira da estrada. Quando chove, a água alaga as cabanas, explicou a fotógrafa.

“Uma vez, choveu muito e eles passaram quatro meses com 50 cm de água dentro das cabanas”.

“O que você vê na foto é barro. A cama está suspensa porque tem barro dentro da cabana, então eles puseram pedras para poderem andar ali dentro. Se você pisa entre as pedras, seu pé afunda”.

“É como um lodo, tem até um sapo ali. Eu fiz a foto e na hora não vi, porque não tinha janela”.

Violência

A fotógrafa disse ter ficado marcada pelo olhar dos índios.

“O olhar. As pessoas têm um olhar tão triste que você fica incomodado. Bebezinhos, crianças e velhos te olham e parece que estão olhando para o nada.”

O que as fotos de Gauditano não mostram, no entanto, é a violência que permeia as vidas do povo Guarani-Kaiowá.

“Gerações de líderes são assassinadas e você não acha os corpos. Há uma violência latente, muito grande, por baixo do pano.”

Esperança

Em meio ao sofrimento que observou em suas expedições ao MS para fotografar os Guaranis-Kaiowás, Gauditano disse também ter encontrado serenidade e leveza.

Na aldeia Guaiviry, no município de Aral Moreira, a fotógrafa registrou imagens de crianças que cantavam e dançavam.

“A cena me passa esperança. O que segura o povo indígena é sua história, sua língua, sua religião e seus rituais”, disse. “E criança sempre tem um bom astral. Sentem a barra pesada, mas estão sempre brincando e pulando”.

O ano passado deve ter sido traumático para as crianças de Guaiviry. O cacique da tribo, Nísio Gomes, foi assassinado em novembro de 2011.

A terra da comunidade foi demarcada, mas a demarcação foi contestada e o caso está sendo julgado pela Justiça.

Em outra cena de aparente tranquilidade, uma Guarani-Kaiowá é vista rodeada de porquinhos.

Mas a relativa paz e contentamento em que vivem a índia e sua família, em uma pequena reserva demarcada – a aldeia de Piracuá, no município de Bela Vista -, tiveram um custo alto. Em 1983, um líder indígena que vivia na região, Marçal de Souza, também foi assassinado.

“Hoje, as famílias vivem bem ali, com sua terra, podendo fazer pequenas plantações de subsistência. Tem escola, assistência do governo, uma mata nativa”, explicou Gauditano.

‘Comunicação é Poder’

Mas se por um lado os Kaiowás anseiam por viver em paz em seus territórios – e eles entendem que as reservas ocuparão apenas uma parte da terra que um dia foi deles -, a comunidade também abraça a modernidade, disse Gauditano.

“A tecnologia é muito importante para os índios hoje, principalmente o vídeo, os celulares e a internet”.

Segundo a fotógrafa, esses recursos permitem a comunicação não apenas dentro das próprias comunidades, mas entre as comunidades e o mundo lá fora.

“A tecnologia e as mídias sociais tiveram um papel fundamental na divulgação do drama dos Kaiowás despejados da aldeia em Pyelito Kue.”

“Nunca vi uma mobilização tão grande da população brasileira em defesa de uma comunidade indígena como a que aconteceu em outubro”, disse Gauditano.

“Isso me fez perceber o potencial imenso de mídias sociais, como o Facebook, para a causa indígena. Realmente, comunicação é poder!”.

Um dos acampamentos fotografados por Rosa Gauditano, o Laranjeira, ficava na BR 163, nas imediações de Dourados, MS. Desde que as fotos foram feitas – em 2010 – os índios conseguiram as terras que reivindicavam, no município de Rio Brilhante. Entraram nas terras, mas ainda vivem em situação provisória, aguardando que a Fundação Nacional do Índio (Funai) identifique formalmente as terras – processo burocrático demorado, feito com base em pareceres de antropólogos.

Guarani Kaiowá março 2010

Guerreiro Guarani Kaiowá recebe representantes da Secretaria de Direitos Humanos da Presidência da República, no antigo acampamento Laranjeira, na BR 163. O grupo ainda vive em situação provisória, aguardando que a Funai identifique formalmente sua terra. Foto: Rosa Gauditano/Studio R

Índios Guarani Kaiowá

O Kaiowá mostra para a câmera uma espiga de milho tradicional, de uma variedade que vem sendo cultivada pela tribo há séculos. ‘Ele não queria ser fotografado. A tristeza no olhar dele e a pobreza da comunidade são evidentes’, comenta Gauditano. Em outubro de 2012, uma comunidade Kaiowá escreveu uma polêmica carta, interpretada por alguns como uma ameaça de suicídio coletivo. O resultado foi uma grande campanha em defesa dos índios, com protestos em vários pontos do país. Foto: Rosa Gauditano/Studio R

Ver todas as fotos aqui.

You Can Give a Boy a Doll, but You Can’t Make Him Play With It (The Atlantic)

By Christina Hoff Sommers

DEC 6 2012, 11:29 AM ET

The logistical and ethical problems with trying to make toys gender-neutral

Is it discriminatory and degrading for toy catalogs to show girls playing with tea sets and boys with Nerf guns? A Swedish regulatory group says yes. The Reklamombudsmannen (RO) has reprimanded Top-Toy, a licensee of Toys”R”Us and one of the largest toy companies in Northern Europe, for its “outdated” advertisements and has pressured it to mend its “narrow-minded” ways. After receiving “training and guidance” from RO equity experts, Top-Toy introduced gender neutrality in its 2012 Christmas catalogue. The catalog shows little boys playing with a Barbie Dream House and girls with guns and gory action figures. As its marketing director explains, “For several years, we have found that the gender debate has grown so strong in the Swedish market that we have had to adjust.”

Swedes can be remarkably thorough in their pursuit of gender parity. A few years ago, a feminist political party proposed a law requiring men to sit while urinating—less messy and more equal. In 2004, the leader of Sweden’s Left Party Feminist Council, Gudrun Schyman, proposed a “man tax”—a special tariff to be levied on men to pay for all the violence and mayhem wrought by their sex. In April 2012, following the celebration of International Women’s Day, the Swedes formally introduced the genderless pronoun “hen” to be used in place of he and she (han and hon).

Egalia, a new state-sponsored pre-school in Stockholm, is dedicated to the total obliteration of the male and female distinction. There are no boys and girls at Egalia—just “friends” and “buddies.” Classic fairy tales like Cinderella and Snow White have been replaced by tales of two male giraffes who parent abandoned crocodile eggs. The Swedish Green Party would like Egalia to be the norm: It has suggested placing gender watchdogs in all of the nation’s preschools. “Egalia gives [children] a fantastic opportunity to be whoever they want to be,” says one excited teacher. (It is probably necessary to add that this is not an Orwellian satire or a right-wing fantasy: This school actually exists.)

The problem with Egalia and gender-neutral toy catalogs is that boys and girls, on average, do not have identical interests, propensities, or needs. Twenty years ago, Hasbro, a major American toy manufacturing company, tested a playhouse it hoped to market to both boys and girls. It soon emerged that girls and boys did not interact with the structure in the same way. The girls dressed the dolls, kissed them, and played house. The boys catapulted the toy baby carriage from the roof. A Hasbro manager came up with a novel explanation: “Boys and girls are different.”

They are different, and nothing short of radical and sustained behavior modification could significantly change their elemental play preferences. Children, with few exceptions, are powerfully drawn to sex-stereotyped play. David Geary, a developmental psychologist at the University of Missouri, told me in an email this week, “One of the largest and most persistent differences between the sexes are children’s play preferences.” The female preference for nurturing play and the male propensity for rough-and-tumble hold cross-culturally and even cross-species (with a few exceptions—female spotted hyenas seem to be at least as aggressive as males). Among our close relatives such as vervet and rhesus monkeys, researchers have found that females play with dolls far more than their brothers, who prefer balls and toy cars. It seems unlikely that the monkeys were indoctrinated by stereotypes in a Top-Toy catalog. Something else is going on.

Biology appears to play a role. Several animal studies have shown that hormonal manipulation can reverse sex-typed behavior. When researchers exposed female rhesus monkeys to male hormones prenatally, these females later displayed male-like levels of rough-and-tumble play. Similar results are found in human beings. Congenital adrenal hyperplasia (CAH) is a genetic condition that results when the female fetus is subjected to unusually large quantities of male hormones—adrenal androgens. Girls with CAH tend to prefer trucks, cars, and construction sets over dolls and play tea sets. As psychologist Doreen Kimura reported in Scientific American, “These findings suggest that these preferences were actually altered in some way by the early hormonal environment.” They also cast doubt on the view that gender-specific play is primarily shaped by socialization.

Professor Geary does not have much hope for the new gender-blind toy catalogue: “The catalog will almost certainly disappear in a few years, once parents who buy from it realize their kids don’t want these toys.” Most little girls don’t want to play with dump trucks, as almost any parent can attest. Including me: When my granddaughter Eliza was given a toy train, she placed it in a baby carriage and covered it with a blanket so it could get some sleep.

Androgyny advocates like our Swedish friends have heard such stories many times, and they have an answer. They acknowledge that sex differences have at least some foundation in biology, but they insist that culture can intensify or diminish their power and effect. Even if Eliza is prompted by nature to interact with a train in a stereotypical female way, that is no reason for her parents not to energetically correct her. Hunter College psychologist Virginia Valian, a strong proponent of Swedish-style re-genderization, wrote in the book Why So Slow? The Advancement of Women, “We do not accept biology as destiny … We vaccinate, we inoculate, we medicate… I propose we adopt the same attitude toward biological sex differences.”

Valian is absolutely right that we do not have to accept biology as destiny. But the analogy is ludicrous: We vaccinate, inoculate, and medicate children against disease. Is being a gender-typical little boy or girl a pathology in need of a cure? Failure to protect children from smallpox, diphtheria, or measles places them in harm’s way. I don’t believe there is any such harm in allowing male/female differences to flourish in early childhood. As one Swedish mother, Tanja Bergkvist, told the Associated Press, “Different gender roles aren’t problematic as long as they are equally valued.” Gender neutrality is not a necessary condition for equality. Men and women can be different—but equal. And for most human beings, the differences are a vital source for meaning and happiness. Since when is uniformity a democratic ideal?

Few would deny that parents and teachers should expose children to a wide range of toys and play activities. But what the Swedes are now doing in some of their classrooms goes far beyond encouraging children to experiment with different toys and play styles—they are requiring it. And toy companies who resist the gender neutrality mandate face official censure. Is this kind of social engineering worth it? Is it even ethical?

To succeed, the Swedish parents, teachers and authorities are going to have to police—incessantly—boys’ powerful attraction to large-group rough-and-tumble play and girls’ affinity for intimate theatrical play. As Geary says, “You can change some of these behaviors with reinforcement and monitoring, but they bounce back once this stops.” But this constant monitoring can also undermine children’s healthy development.

Anthony Pellegrini, a professor of early childhood education at the University of Minnesota, defines the kind of rough-and-tumble play that boys favor as a behavior that includes “laughing, running, smiling, jumping, open-hand beating, wrestling, play fighting, chasing and fleeing.” This kind of play is often mistakenly regarded as aggression, but according to Pellegrini, it is the very opposite. In cases of schoolyard aggression, the participants are unhappy, they part as enemies, and there are often tears and injuries. Rough-and-tumble play brings boys together, makes them happy, and is a critical part of their social development.

Researchers Mary Ellin Logue (University of Maine) and Hattie Harvey (University of Denver) agree, and they have documented the benefits of boys’ “bad guy” superhero action narratives. Teachers tend not to like such play, say Logue and Harvey, but it improves boys’ conversation, creative writing skills, and moral imagination. Swedish boys, like American boys, are languishing far behind girls in school. In a 2009 study Logue and Harvey ask an important question the Swedes should consider: “If boys, due to their choices of dramatic play themes, are discouraged from dramatic play, how will this affect their early language and literacy development and their engagement in school?”

What about the girls? Nearly 30 years ago, Vivian Gussin Paley, a beloved kindergarten teacher at the Chicago Laboratory Schools and winner of a MacArthur “genius” award, published a classic book on children’s play entitled Boys & Girls: Superheroes in the Doll Corner. Paley wondered if girls are missing out by not partaking in boys’ superhero play, but her observations of the “doll corner” allayed her doubts. Girls, she learned, are interested in their own kind of domination. Boys’ imaginative play involves a lot of conflict and imaginary violence; girls’ play, on the other hand, seems to be much gentler and more peaceful. But as Paley looked more carefully, she noticed that the girls’ fantasies were just as exciting and intense as the boys’—though different. They were full of conflict, pesky characters and imaginary power struggles. “Mothers and princesses are as powerful as any superheroes the boys can devise.” Paley appreciated the benefits of gendered play for both sexes, and she had no illusions about the prospects for its elimination: “Kindergarten is a triumph of sexual self-stereotyping. No amount of adult subterfuge or propaganda deflects the five-year-old’s passion for segregation by sex.”

But subterfuge and propaganda appear to be the order of the day in Sweden. In their efforts to free children from the constraints of gender, the Swedish reformers are imposing their own set of inviolate rules, standards, and taboos. Here is how Slate author Nathalie Rothchild describes a gender-neutral classroom:

One Swedish school got rid of its toy cars because boys “gender-coded” them and ascribed the cars higher status than other toys. Another preschool removed “free playtime” from its schedule because, as a pedagogue at the school put it, when children play freely “stereotypical gender patterns are born and cemented. In free play there is hierarchy, exclusion, and the seed to bullying.” And so every detail of children’s interactions gets micromanaged by concerned adults, who end up problematizing minute aspects of children’s lives, from how they form friendships to what games they play and what songs they sing.

The Swedes are treating gender-conforming children the way we once treated gender-variant children. Formerly called “tomboy girls” and “sissy boys” in the medical literature, these kids are persistently attracted to the toys of the opposite sex. They will often remain fixated on the “wrong” toys despite relentless, often cruel pressure from parents, doctors, and peers. Their total immersion in sex-stereotyped culture—a non-stop Toys”R”Us indoctrination—seems to have little effect on their passion for the toys of the opposite sex. There was a time when a boy who displayed a persistent aversion to trucks and rough play and a fixation on frilly dolls or princess paraphernalia would have been considered a candidate for behavior modification therapy. Today, most experts encourage tolerance, understanding, and acceptance: just leave him alone and let him play as he wants. The Swedes should extend the same tolerant understanding to the gender identity and preferences of the vast majority of children.

Why Sandy Has Meteorologists Scared, in 4 Images (The Atlantic)

By Alexis Madrigal

OCT 28 2012, 12:23 PM ET

She’s huge. She’s strong and might get stronger. She’s strange. She’s directing the might of her storm surge right at New York City.

Update 10/29, 4:49pm: The Eastern seaboard has battened down the hatches. Hurricane Sandy is expected to make landfall in New Jersey in the next few hours, but flooding has already been reported in Atlantic City and parts of New York during this morning’s high tide cycle. The Metropolitan Transportation Authority has already shut down rail, bus, and subway service in NYC, as have Washington, DC’s authorities. All eyes are on the 8 o’clock hour, when the storm surge from Sandy will combine with a very high tide to create maximum water levels. In the worst case scenario, the storm surge will hit precisely at the moment the tide peaks, at 8:53pm. In that scenario, New York City, in particular, could sustain substantial damage, especially to its transportation infrastructure.
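
The worst case described above is essentially additive: the surge rides on top of the astronomical tide, so the peak water level is roughly the predicted tide plus the surge. A toy Python calculation (all numbers below are illustrative placeholders, not the actual forecast) shows why the timing of landfall matters so much:

# Toy illustration only: invented tide and surge heights, not forecast values.
# Peak water level is approximately astronomical tide + storm surge, so the
# same surge arriving at high tide is far more dangerous than at low tide.
predicted_tide_ft = {"low tide (mid-afternoon)": 0.5, "high tide (8:53pm)": 5.5}
storm_surge_ft = 6.0  # hypothetical surge height

for when, tide_ft in predicted_tide_ft.items():
    total_ft = tide_ft + storm_surge_ft
    print(f"surge arriving at {when}: roughly {total_ft:.1f} ft of water")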

The good news, if there is any, is that the forecast hasn’t worsened much. It is what it has been, which is grim. Meteorologist Jeff Masters put it in simple terms. “As the core of Sandy moves ashore, the storm will carry with it a gigantic bulge of water that will raise water levels to the highest storm tides ever seen in over a century of record keeping, along much of the coastline of New Jersey and New York,” Masters wrote today. “The peak danger will be between 7 pm – 10 pm, when storm surge rides in on top of the high tide.”

Here’s the latest map of the prospective storm surge tonight. You can compare it to the image at the bottom, which shows what the forecast was yesterday.

[Image: storm surge probability map, Oct. 29 forecast]

* * *

Hurricane Sandy has already caused her first damage in New York: the subway system will be shut as of 7pm tonight. Meteorologists are scared, so city planners are scared.

For many, the hullabaloo raises memories of Irene, which despite causing $15.6 billion worth of damages in the United States, did not live up to its pre-arrival hype.

By almost all measures, this storm looks like it could be worse: higher winds, a path through a more populated area, worse storm surge, and a greater chance it’ll linger. The atmospherics, you might say, all point to this being the worst storm in recent history.

I’ve been watching weather nerds freak out about a few different graphs over the last several days, which they’ve sent around like sports fans would tweet a particularly vicious hit in the NFL. You don’t want to look, but you also can’t help it.

Dr. Ryan Maue, a meteorologist at WeatherBELL, put out this animated GIF of the storm’s approach yesterday. “This is unprecedented –absolutely stunning upper-level configuration pinwheeling #Sandy on-shore like ping-pong ball,” he tweeted. It shows how cold air to the north and west of the storm spins Sandy into the mid-Atlantic coastline. (Nota bene: his models also show very high winds at skyscraper altitudes.)

 

[Animated GIF: Ryan Maue’s visualization of Sandy’s approach]

This morning, the Wall Street Journal’s Eric Holthaus (@WSJweather) tweeted the following map. “Oh my…. I have never seen so much purple on this graphic. By far. Never,” he said. “Folks, please take this storm seriously.” The storm is strong *and* huge. And when it encounters the cold air from the north and west, it will develop renewed strength thanks to that interaction, a process known as “baroclinic enhancement.”


I created this last graphic from National Oceanic and Atmospheric Administration data, and it has weather watchers worried. It shows the probability of a storm surge greater than six feet in and around New York City. Hurricane Irene, by comparison, caused a four-foot surge.

[Image: probability of a greater-than-six-foot storm surge around New York City]

Note that the highest probabilities are focused tightly around New York City, which also happens to be the most densely populated area in the country. That’s a very bad combination. Jeff Masters, author of the must-read storm blog Wunderground, laid out the general problem.

“[According to last night’s forecast], the destructive potential of the storm surge was exceptionally high: 5.7 on a scale of 0 to 6,” he wrote. “This is a higher destructive potential than any hurricane observed between 1969 – 2005, including Category 5 storms like Katrina, Rita, Wilma, Camille, and Andrew.”

Specifically, New York City’s infrastructure may take an unprecedented hit. The subway narrowly escaped flooding during Irene, and Sandy (for all the reasons above) is expected to be worse. So…

“According to the latest storm surge forecast for NYC from NHC, Sandy’s storm surge is expected to be several feet higher than Irene’s. If the peak surge arrives near Monday evening’s high tide at 9 pm EDT, a portion of New York City’s subway system could flood, resulting in billions of dollars in damage,” Masters concluded. “I give a 50% chance that Sandy’s storm surge will end up flooding a portion of the New York City subway system.”

Update 1:06pm: To get a taste of how forecasters are feeling, here is The Weather Channel’s senior meteorologist, Stu Ostro:

History is being written as an extreme weather event continues to unfold, one which will occupy a place in the annals of weather history as one of the most extraordinary to have affected the United States.

On Twitter, Alan Robinson pointed out that I left out another scary map, the rainfall forecast, which shows the storm “sitting over the Delaware and Susquehanna watersheds.” Much of the damage that Irene caused came from flooding rivers. However, there is one key factor militating against similar damage, Jeff Masters of Wunderground says. Irene hit when the ground was already very wet. Sandy is striking when ground moisture is roughly average. Here’s Masters’ whole statement:

Hurricane Irene caused $15.8 billion in damage, most of it from river flooding due to heavy rains. However, the region most heavily impacted by Irene’s heavy rains had very wet soils and very high river levels before Irene arrived, due to heavy rains that occurred in the weeks before the hurricane hit. That is not the case for Sandy; soil moisture is near average over most of the mid-Atlantic, and is in the lowest 30th percentile in recorded history over much of Delaware and Southeastern Maryland. One region of possible concern is the Susquehanna River Valley in Eastern Pennsylvania, where soil moisture is in the 70th percentile, and river levels are in the 76th – 90th percentile. This area is currently expected to receive 3 – 6 inches of rain (Figure 4), which is probably not enough to cause catastrophic flooding like occurred for Hurricane Irene. I expect that river flooding from Sandy will cause less than $1 billion in damage.

More than 30% of indigenous lands in the Amazon will be affected by hydroelectric dams (Agência Brasil)

JC e-mail 4639, December 7, 2012

The assessment comes from prosecutor Felício Pontes, of the Federal Public Prosecutor’s Office (MPF) in Pará.

More than 30% of indigenous lands in the Amazon will suffer some kind of impact from the construction of the hydroelectric plants planned for the region. In the assessment of prosecutor Felício Pontes, of the Federal Public Prosecutor’s Office (MPF) in Pará, the Brazilian government’s plan, which calls for 153 projects over the next 20 years, will also affect the lives of nearly all of the region’s traditional populations.

“We learned this in the worst possible way,” Pontes said, pointing to the case of Tucuruí, in Pará. The construction of the hydroelectric plant in that municipality, in 1984, caused economic and social changes in several communities near the dam. In the municipality of Cametá, for example, fishermen estimate that local production has fallen from 4,700 tons of fish per year to 200 tons since the plant was built.

Pontes recalled that both Brazilian law and Convention 169 of the International Labour Organization (ILO) require the authorities to consult local communities whenever there is a possibility of impacts caused by private-sector or government decisions. According to him, however, this process has not been carried out properly.

For Pontes, the Brazilian government needs to take a position on these communities and the planned infrastructure investments. In the prosecutor’s assessment, that position will come when the Supreme Federal Court (STF) rules next year on a case concerning the lack of prior consultation with traditional communities before the construction of the Belo Monte complex. “The STF will define Brazil’s position,” he said, arguing that the consent of indigenous communities and traditional peoples should be required before works begin.

The infrastructure projects the government has planned for the Amazon region dominate the debates at the Fórum Amazônia Sustentável (Sustainable Amazon Forum), under way in Belém, Pará. Since Wednesday (5), representatives of environmental organizations and a handful of business leaders have been discussing solutions to the impasses between the infrastructure the private sector says it needs and the return of those investments to local communities.

“We have already lived through several different cycles in the Amazon, and we are reproducing the old view of the Amazon as a provider of resources for the development of the country and the world, not always attending to the development needs of the region itself,” said Adriana Ramos, coordinator of the event and of the Instituto Socioambiental (ISA).

According to her, the forum’s goal is to arrive at a “debate about how to do it,” since the movements recognize that the government will not back away from the projects. “It is possible to reconcile different development models in the Amazon, but even large infrastructure built to meet external demands can be more or less damaging. Unfortunately, we are still doing it in the most damaging way,” she lamented.

Adriana Ramos criticized the lack of up-front investment in projects such as Belo Monte. In her view, the government should have anticipated population growth and, consequently, the pressure for more public services, such as sanitation and health care, in municipalities like Altamira, in Pará.

“Besides being carried out without this concern, there is an effort by these sectors to deregulate such activities, with changes like those to the Forest Code and to the licensing rules,” she added, explaining that agencies such as the Fundação Nacional do Índio and the Fundação Palmares now have 90 days to say whether a given project affects indigenous land. “If they do not respond, the licensing process proceeds as if there were no impact on indigenous land. These kinds of legal changes signal that there is no will to find the right path, only a will to push ahead any way at all. It is discouraging,” she lamented.

The forum ends today (7) with a document that will guide the environmental organizations’ debates and actions from next year onward, on topics such as land-tenure regularization in the region, the debate over transportation and sustainable cities, and the sharing and sustainable use of forest resources.

Natural Step: the Science of Sustainability (Yes Magazine)

Dr. Karl-Henrik Robèrt had an epiphany about the conditions required to sustain life – this epiphany catalyzed a consensus among Sweden’s top scientists about the scientific foundations for sustainability

by Dr. Karl-Henrik Robèrt

http://www.yesmagazine.org

posted Aug 30, 1998

What do cells need to sustain life? How can human systems of production be a sustainable part of the natural world? These questions catalyzed a consensus among Sweden’s top scientists about the scientific foundations for sustainability.

Dr. Karl-Henrik Robèrt, a Swedish cancer doctor and medical researcher, founded The Natural Step to inject some science into the environmental debate – and provide a solid foundation for action. He spoke to YES! executive editor Sarah van Gelder during his recent trip to the US.

SARAH: How did you go from being a doctor to taking on this large question of sustainability?

KARL: My career centered on my work as a medical doctor heading a cancer ward in a university hospital, the largest one outside of Stockholm. I was concerned with the environment as a private human being, but I didn’t know what I could do except to pay my dues to Greenpeace and other NGOs.

My epiphany came one day when I was studying cells from cancer patients. It hit me that cells are the unifying unit of all living things. The difference between our cells and the cells of plants is so minor that it’s almost embarrassing; the makeup is almost identical all the way down to the molecular level.

You can’t argue with them or negotiate with them. You can’t ask them to do anything they can’t do. And their complexity is just mind blowing!

Since politicians and business people also are constituted of cells, I had a feeling that a broad understanding of these cells might help us reach a consensus on the basic requirements for the continuation of life.

Most people are not aware that it took living cells about 3.5 billion years to transform the virgin soup of the atmosphere – which was a toxic, chaotic mixture of sulfurous compounds, methane, carbon dioxide, and other substances – into the conditions that could support complex life.

In just the last decades humans have reversed this trend. First we found concentrated energy like fossil fuels and nuclear power. As a result, we can create such a high throughput of resources that natural processes no longer have the time to process the waste and build new resources.

Dispersed junk is increasing in the system as we lose soils, forests, and species. So we have reversed evolution. The Earth is running back towards the chaotic state it came from at a tremendous speed.

On an intuitive level, everyone knows that the natural environment is also the habitat for our economy, and if it goes down the drain, so does the economy.

Despite that, the green movement attacks business, and business reacts defensively. So much of the debate focuses on the details – so much is like monkeys chattering among the leaves of the tree while the trunk and roots die.

I thought we could go beyond that stalemate if we could begin to build a consensus based on much more solid, comprehensive thinking.

SARAH: What did you do with this insight? What was your plan for getting beyond the stalemate in the environmental debate?

KARL: I had a daydream that I could write a consensus statement with other scientists about the conditions that are essential to life. Instead of asking them what environmental issues they disagreed on, I could ask them where there was agreement and use that as a basis for a consensus that would serve as a platform for sounder decision-making in society.

In August 1988, when I wrote the first effort to frame a consensus, I believed that my colleagues would agree wholeheartedly with what I had written, it was so well thought through. Actually, it took 21 iterations to reach a consensus among this group of 50 ecologists, chemists, physicists, and medical doctors.

I was able to raise funds to mail this consensus statement as a booklet with an audio cassette to all 4.3 million households in Sweden. This statement describes how badly we are performing with respect to the natural systems around us and how dangerous the situation is. It makes the point that debating about policy is not bad in itself – but it is bad when the debate is based on misunderstandings and poor knowledge. It doesn’t matter if you are on the left or the right – the consensus platform takes us beyond arguments about what is and is not true. That was the start of The Natural Step.

SARAH: Karl, could you explain briefly the Natural Step system conditions?

KARL: The four system conditions describe the principles that make a society sustainable. The first two system conditions have to do with avoiding concentrations of pollutants from synthetic substances and from substances mined or pumped from the Earth’s crust to ensure that they aren’t systematically increasing in nature.

The third condition says we must avoid overharvesting and displacing natural systems.

Finally, system condition number four says we must be efficient when it comes to satisfying human needs by maximizing the benefit from the resources used.

Today, society is well outside the framework set by these conditions, and as a result, we are running towards increasing economic problems as we run out of fresh and non-polluted resources.

SARAH: So if we follow these conditions we can avoid the reverse evolution you mentioned earlier – we can quit dispersing persistent substances into the biosphere and make it possible for nature to continue to provide us with the basic resources we need to live – soil, air, a stable climate, water, and so on. In other words, these conditions will help us judge whether our actions are sustainable. Is this an approach that businesses and government officials find compelling?

KARL: I think most people in business understand that we are running into a funnel of declining resources globally.

We will soon be 10 billion people on Earth – at the same time as we are running out of forests, crop land, and fisheries. We need more and more resource input for the same crop or timber yield. At the same time, pollution is increasing systematically and we have induced climate change. All that together creates a resource funnel.

By decreasing your dependence on activities that violate the system conditions, you move towards the opening of the resource funnel. You can do this by reducing, step by step, your dependence on:

• heavy metals and fossil fuels that dissipate into the environment (condition #1)

• persistent unnatural compounds like bromine-organic antiflammables or persistent pesticides (condition #2)

• wood and food from ecologically maltreated land and materials that require long-distance transportation (condition #3)

• wasting resources (condition #4).

Any organization that directs its investments towards the opening of the funnel through complying with these system conditions will do better in business than their ignorant competitors. This is due to inevitable changes at the wall of the funnel in the form of increased costs for resources, waste management, insurance, loans, international business agreements, taxes, and public fear. In addition, there is the question of competition from those who direct their investments more skillfully towards the opening of the funnel – thus avoiding those costs – and sooner or later getting rewarded by their customers.

Once we have understood the funnel, the rest is a matter of timing. And time is now running out. Many corporations have already run into the wall of the funnel as a result of violating the system conditions. And today many companies are getting relatively stronger in comparison with others as a result of previous investments in line with the system conditions. Of course there are a large number of companies who still benefit in the short term from violating the principles of the common good, but in the long run, they have no future.

So if you ask business people, “Do you think that this could possibly influence tomorrow’s market?” they get embarrassed, because they all understand it will. The issue is to foresee the nature of that influence, because if you do, you will prosper from it.

SARAH: I want to ask you about the fourth condition because it seems as though that’s the one that has been most controversial. Perhaps that is because it is based on human systems more than natural systems.

KARL: The fourth principle is about the internal resource flows in a society, but it is still a logical first-order principle that follows as a conclusion from the first three. The reason people regard the fourth principle as a separate value is the word “fairness,” which is part of the fourth principle.

Most people understand that the first three principles set a frame for societal behavior. If matter from the Earth’s crust is no longer going to systematically increase in concentration, nor man-made compounds, and if we are going to live from the interest of what nature gives us – not use up nature’s capital – the first-order conclusion is that we must be much more efficient about how we meet our needs.

Fairness is an efficiency parameter if we look at the whole global civilization. It is not an efficient way of meeting human needs if one billion people starve while another billion have excess. It would be more efficient to distribute resources so that at least vital needs were met everywhere. Otherwise, for example, if kids are starving somewhere, dad goes out to slash and burn the rain forest to feed them – and so would I if my kids were dying. And this kind of destruction is everyone’s problem, because we live in the same ecosphere.

SARAH: I realize you reached consensus among the scientists on the foundations for sustainability, but has your approach been controversial in the larger society?

KARL: No. The business community found it refreshing to be involved in a dialogue that did not involve someone pointing fingers at them and telling them what they should do.

This dialogue was the opposite of that; it involved a group of scientists describing the situation with regards to the environment and then asking for advice about how to remove the obstacles to sustainability. The business community, municipalities, and farmers actually enjoyed being part of it.

SARAH: Why do companies choose to adopt The Natural Step? Is it that they understand the science and want to contribute to a more sustainable world? Or do they see TNS primarily as a winning business strategy?

KARL: It is a mixture of both, and it is hard to evaluate which is most important. My feeling is that top people in business have a tough image that they display in board rooms. Privately, after the board meeting, they would much rather do well by doing good, than doing well by contributing to the destruction of our habitat. Because of the rational economic and strategic thinking of the system conditions, they can endorse TNS principles without losing face in front of their tough peers. But as time goes on, the “soft” values become more and more important.

SARAH: In the research I’ve done on Green Plans in the Netherlands, I found that Dutch businesses were concerned that they would be less competitive if they were holding to higher environmental standards than businesses from other countries. How have you dealt with the issue of competitiveness in The Natural Step?

KARL: If you look at the countries where business is very successful, it is not the countries where the standards are low – it is the countries where they have set high goals for what they want to achieve. In the long run, you get competitiveness from increasing standards.

SARAH: Can you give me some examples of things in Sweden that have been done differently out of this understanding?

KARL: The Natural Step introduces a shared mental model that is intellectually strict, but still simple to understand. These are the rules of sustainability; you can plug them into decision-making about any product.

The first thing that happens is that this stimulates creativity, because people enter a much smarter dialogue if they have a shared framework for their goals. We have written books of case studies about how people together found smart and flexible solutions to problems that seemed impossible to solve, including new products, logistics, suppliers, energy sources, and fuels.

A strict shared mental model can really get people working together.

SARAH: You mentioned that this approach requires thinking beyond the short term, and yet especially in the United States, so many CEOs are rewarded based on this quarter’s profits, not on how well they are positioning the company for the next five or ten years. How can companies in that kind of an environment take on this kind of a challenge?

KARL: If you are audited at quarterly intervals and you can be sued for failing to earn the last buck possible, it is more difficult. But you can still develop a future scenario for your company in which it meets principles that make it ecologically, socially, and economically sustainable – because it is not economically sustainable to rely on behaviors that have no future.

Once you’ve developed that scenario, you look back from this imagined future and ask yourself how those sustainability principles might have been met and what you might do today to get there.

The strategy for business is to select as the first steps toward sustainability those that fulfill two criteria: they must be flexible to build on in the future, and they must provide a return on investments relatively soon; like, for instance, an attractive car that can run on renewable energy as well as gasoline.

SARAH: What do you see as the trends for the coming years, in terms of a switch to more sustainable practices?

KARL: A deepening intellectual understanding is a good starting point for change of values. Today, it is considered “rational” to think about economic growth only, whereas a focus on the true underlying reason for people living together in societies is considered non-rational. The TNS approach demonstrates that their present paradigm is, in fact, irrational and that we need new economic tools.

My belief is that free will of individuals and firms will not be sufficient to make sustainable practices widespread – legislation is a crucial part of the walls of the funnel, particularly if we want to make the transition in time.

But this is a dynamic process. The more examples we get of businesses entering the transition out of free will, the easier it will be for proactive politicians. In a democracy, there must be a “market” for proactive decisions in politics, and that market can be created by proactive businesses in dialogue with proactive customers. For example, in Sweden, some of these proactive business leaders are lobbying for green taxes. In that triangle of dialogue: business-market-politicians, a new culture may evolve, with an endorsement of the values we share but have forgotten how to pay attention to.

So, the flow goes: intellectual understanding, some practice and experience, deeper understanding with some change in attitude, preparedness for even more radical change, some more experience, even deeper understanding, and, eventually, an endorsement of the value systems that are inherent in the human constitution.

SARAH: What worries you the most about the future? You mentioned when you were in Seattle that you anticipate some very difficult times for the world in the years ahead – perhaps even a collapse. Could you explain what you meant and what you think might cause such a collapse?

KARL: What worries me the most is the systematic social battering of people all around the world, leading to more and more desperate people who don’t feel any partnership with society because of alienation, poverty, dissolving cultural structures, more and more “molecular” violence (unorganized and self-destructive violence that pops up everywhere without any meaning at all).

The response of the establishment is too superficial, with more and more imprisonment and money spent on defense against those feared, leading to a vicious cycle.

If this goes on long enough, a constructive and new sustainable paradigm in the heads of governments and business leaders will not necessarily help us in time. We will have more and more people who are so hungry to meet their vital human needs that it will be hard to reach them.

SARAH: What keeps you energized in the face of these enormous challenges? What are your sources of hope?

KARL: My vision is that we develop a mainstream understanding that nobody wins from destroying our habitat, and that people will see that you do better in business if you work as though society will become sustainable and as though different cultures will survive, because cultural diversity is also essential.

To maintain hope, we cannot only focus on the dark things that are going on. Once in a while if you get a “bird’s eye” perspective, you see all sorts of good examples, and they comfort you. You see more and more people who understand and who are making concrete contributions to the transition to this new understanding.

 

When data prediction is a game, the experts lose out (New Scientist)

Specialist Knowledge Is Useless and Unhelpful

Posted Saturday, Dec. 8, 2012, at 7:45 AM ET


Jeremy Howard founded email company FastMail and the Optimal Decisions Group, which helps insurance companies set premiums. He is now president and chief scientist of Kaggle, which has turned data prediction into sport.

Peter Aldhous: Kaggle has been described as “an online marketplace for brains.” Tell me about it.
Jeremy Howard: It’s a website that hosts competitions for data prediction. We’ve run a whole bunch of amazing competitions. One asked competitors to develop algorithms to mark students’ essays. One that finished recently challenged competitors to develop a gesture-learning system for the Microsoft Kinect. The idea was to show the controller a gesture just once, and the algorithm would recognize it in future. Another competition predicted the biological properties of small molecules being screened as potential drugs.

PA: How exactly do these competitions work?
JH: They rely on techniques like data mining and machine learning to predict future trends from current data. Companies, governments, and researchers present data sets and problems, and offer prize money for the best solutions. Anyone can enter: We have nearly 64,000 registered users. We’ve discovered that creative data scientists can solve problems in every field better than experts in those fields can.

PA: These competitions deal with very specialized subjects. Do experts enter?
JH: Oh yes. Every time a new competition comes out, the experts say: “We’ve built a whole industry around this. We know the answers.” And after a couple of weeks, they get blown out of the water.

PA: So who does well in the competitions?
JH: People who can just see what the data is actually telling them without being distracted by industry assumptions or specialist knowledge. Jason Tigg, who runs a pretty big hedge fund in London, has done well again and again. So has Xavier Conort, who runs a predictive analytics consultancy in Singapore.

PA: You were once on the leader board yourself. How did you get involved?
JH: It was a long and strange path. I majored in philosophy in Australia, worked in management consultancy for eight years, and then in 1999 I founded two start-ups—one an email company, the other helping insurers optimize risks and profits. By 2010, I had sold them both. I started learning Chinese and building amplifiers and speakers because I hadn’t made anything with my hands. I travelled. But it wasn’t intellectually challenging enough. Then, at a meeting of statistics users in Melbourne, somebody told me about Kaggle. I thought: “That looks intimidating and really interesting.”

PA: How did your first competition go?
JH: Setting my expectations low, my goal was to not come last. But I actually won it. It was on forecasting tourist arrivals and departures at different destinations. By the time I went to the next statistics meeting I had won two out of the three competitions I entered. Anthony Goldbloom, the founder of Kaggle, was there. He said: “You’re not Jeremy Howard, are you? We’ve never had anybody win two out of three competitions before.”

PA: How did you become Kaggle’s chief scientist?
JH: I offered to become an angel investor. But I just couldn’t keep my hands off the business. I told Anthony that the site was running slowly and rewrote all the code from scratch. Then Anthony and I spent three months in America last year, trying to raise money. That was where things got really serious, because we raised $11 million. I had to move to San Francisco and commit to doing this full-time.

PA: Do you still compete?
JH: I am allowed to compete, but I can’t win prizes. In practice, I’ve been too busy.

PA: What explains Kaggle’s success in solving problems in predictive analytics?
JH: The competitive aspect is important. The more people who take part in these competitions, the better they get at predictive modeling. There is no other place in the world I’m aware of, outside professional sport, where you get such raw, harsh, unfettered feedback about how well you’re doing. It’s clear what’s working and what’s not. It’s a kind of evolutionary process, accelerating the survival of the fittest, and we’re watching it happen right in front of us. More and more, our top competitors are also teaming up with each other.

PA: Which statistical methods work best?
JH: One that crops up again and again is called the random forest. This takes multiple small random samples of the data and makes a “decision tree” for each one, which branches according to the questions asked about the data. Each tree, by itself, has little predictive power. But take an “average” of all of them and you end up with a powerful model. It’s a totally black-box, brainless approach. You don’t have to think—it just works.
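
To make the “average of many weak trees” idea concrete, here is a minimal sketch in Python using scikit-learn on synthetic data (nothing here comes from Kaggle or the interview): a single shallow decision tree predicts poorly on its own, while a forest of such trees, each grown on random samples of the rows and random subsets of the features, typically averages out into a much stronger model.

# Minimal sketch, assuming scikit-learn is installed; synthetic data only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# One small tree: weak on its own.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X_train, y_train)
print("single shallow tree:", tree.score(X_test, y_test))

# Many such trees, each fit to a bootstrap sample and a random feature
# subset, then averaged: the "random forest" Howard describes.
forest = RandomForestClassifier(n_estimators=500, max_depth=3, random_state=0)
forest.fit(X_train, y_train)
print("forest of 500 trees: ", forest.score(X_test, y_test))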

PA: What separates the winners from the also-rans?
JH: The difference between the good participants and the bad is the information they feed to the algorithms. You have to decide what to abstract from the data. Winners of Kaggle competitions tend to be curious and creative people. They come up with a dozen totally new ways to think about the problem. The nice thing about algorithms like the random forest is that you can chuck as many crazy ideas at them as you like, and the algorithms figure out which ones work.
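
A hedged sketch of what “chucking ideas at the model” can look like in practice, again in Python with scikit-learn and entirely invented data and feature names: derive a pile of candidate features, fit a random forest, and read its feature importances to see which ideas carried any signal.

# Sketch with invented data and feature names; only the general pattern
# (derive many candidate features, let the forest rank them) is the point.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "passengers_last_year": rng.integers(100, 10_000, size=500),
    "month": rng.integers(1, 13, size=500),
    "fuel_price": rng.uniform(0.5, 2.0, size=500),
})
# Invented target, loosely in the spirit of a tourism-forecasting contest.
df["passengers_next_year"] = (
    1.05 * df["passengers_last_year"]
    + 200 * np.sin(df["month"] / 12 * 2 * np.pi)
    + rng.normal(0, 300, size=500)
)

# "Crazy ideas": derived features, some plausibly useful, some probably not.
df["log_passengers"] = np.log(df["passengers_last_year"])
df["is_summer"] = df["month"].isin([6, 7, 8]).astype(int)
df["fuel_times_month"] = df["fuel_price"] * df["month"]

features = [c for c in df.columns if c != "passengers_next_year"]
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(df[features], df["passengers_next_year"])

# Higher importance means the feature did more to reduce prediction error.
for name, score in sorted(zip(features, model.feature_importances_),
                          key=lambda pair: -pair[1]):
    print(f"{name:22s} {score:.3f}")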

PA: That sounds very different from the traditional approach to building predictive models. How have experts reacted?
JH: The messages are uncomfortable for a lot of people. It’s controversial because we’re telling them: “Your decades of specialist knowledge are not only useless, they’re actually unhelpful; your sophisticated techniques are worse than generic methods.” It’s difficult for people who are used to that old type of science. They spend so much time discussing whether an idea makes sense. They check the visualizations and noodle over it. That is all actively unhelpful.

PA: Is there any role for expert knowledge?
JH: Some kinds of experts are required early on, for when you’re trying to work out what problem you’re trying to solve. The expertise you need is strategy expertise in answering these questions.

PA: Can you see any downsides to the data-driven, black-box approach that dominates on Kaggle?
JH: Some people take the view that you don’t end up with a richer understanding of the problem. But that’s just not true: The algorithms tell you what’s important and what’s not. You might ask why those things are important, but I think that’s less interesting. You end up with a predictive model that works. There’s not too much to argue about there.

Reading history through genetics (Columbia University)

5-Dec-2012, by Holly Evarts

New method analyzes recent history of Ashkenazi and Masai populations, paving the way to personalized medicine

New York, NY—December 5, 2012—Computer scientists at Columbia’s School of Engineering and Applied Science have published a study in the November 2012 issue of The American Journal of Human Genetics (AJHG) that demonstrates a new approach used to analyze genetic data to learn more about the history of populations. The authors are the first to develop a method that can describe in detail events in recent history, over the past 2,000 years. They demonstrate this method in two populations, the Ashkenazi Jews and the Masai people of Kenya, who represent two kinds of histories and relationships with neighboring populations: one that remained isolated from surrounding groups, and one that grew from frequent cross-migration across nearby villages.

“Through this work, we’ve been able to recover very recent and refined demographic history, within the last few centuries, in contrast to previous methods that could only paint broad brushstrokes of the much deeper past, many thousands of years ago,” says Computer Science Associate Professor Itsik Pe’er, who led the research. “This means that we can now use genetics as an objective source of information regarding history, as opposed to subjective written texts.”

Pe’er’s group uses computational genetics to develop methods to analyze DNA sequence variants. Understanding the history of a population, knowing which populations had a shared origin and when, which groups have been isolated for a long time, or resulted from admixture of multiple original groups, and being able to fully characterize their genetics is, he explains, “essential in paving the way for personalized medicine.”

For this study, the team developed the mathematical framework and software tools to describe and analyze the histories of the two populations and discovered that, for instance, Ashkenazi Jews are descendants of a small number—in the hundreds—of individuals from the late medieval times, and since then have remained genetically isolated while their population has expanded rapidly to several millions today.
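
As a rough sense of what that expansion implies, here is a back-of-envelope calculation (the founder count “in the hundreds” follows the study’s description; today’s population size and the generation time are illustrative assumptions, not figures from the paper):

# Back-of-envelope sketch with assumed round numbers, for intuition only.
founders = 350                  # "in the hundreds" of founders (per the article)
population_today = 10_000_000   # assumed order of magnitude today
years = 700                     # roughly late medieval times to the present
generations = years // 28       # ~28 years per generation -> about 25 generations

growth = (population_today / founders) ** (1 / generations)
print(f"about {generations} generations, ~{growth:.2f}x growth per generation")
# Roughly a 1.5x increase per generation, sustained for centuries.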

“Knowing that the Ashkenazi population has expanded so recently from a very small number has practical implications,” notes Pe’er. “If we can obtain data on only a few hundreds of individuals from this population, a perfectly feasible task in today’s technology, we will have effectively collected the genomes of millions of current Ashkenazim.” He and his team are now doing just that, and have already begun to analyze a first group of about 150 Ashkenazi genomes.

The genetic data of the Masai, a semi-nomadic people, indicates the village-by-village structure of their population. Unlike the isolated Ashkenazi group, the Masai live in small villages but regularly interact and intermarry across village boundaries. The ancestors of each village therefore typically come from many different places, and a single village hosts an effective gene pool that is much larger than the village itself.

Previous work in population genetics focused on mutations that occurred very long ago, say the researchers, and was therefore able to describe only population changes that occurred on that timescale, typically before the agricultural revolution. Pe’er’s research has changed that, enabling scientists to learn more about recent changes in populations and start to figure out, for instance, how to pinpoint severe mutations in personal genomes of specific individuals—mutations that are more likely to be associated with disease.

“This is a thrilling time to be working in computational genetics,” adds Pe’er, citing the speed in which data acquisition has been accelerating; much faster than the ability of computing hardware to process such data. “While the deluge of big data has forced us to develop better algorithms to analyze them, it has also rewarded us with unprecedented levels of understanding.”

###

Pe’er’s team worked closely on this research with study co-authors Ariel Darvasi, PhD, of the Hebrew University of Jerusalem, who was responsible for collecting most of the study samples, and Todd Lencz, PhD, of the Feinstein Institute for Medical Research, who handled genotyping of the DNA samples. The team’s computing and analysis took place in the Columbia Initiative in Systems Biology (CISB).

This research is supported by the National Science Foundation (NSF). The computing facility of CISB is supported by the National Institutes of Health (NIH).