Tag archive: Technological mediation

Scientists point to problems in press coverage of climate change (Fapesp)

Experts gathered in São Paulo to debate risk management of climate extremes voice concern about the difficulties journalists face in dealing with the complexity of the subject (Wikimedia)

21/08/2012

By Fábio de Castro

Agência FAPESP – In the assessment of experts gathered in São Paulo to discuss the management of risks from climate extremes and disasters, adequately managing the impacts of these events requires informing society – including public policy makers – about the findings of climate science.

However, researchers are concerned about the difficulties they encounter in communicating with society. The complexity of climate studies tends to generate distortions in journalistic coverage of the subject, and the result can be a threat to public trust in science.

The assessment was made by participants in the workshop “Managing the risks of climate extremes and disasters in Central and South America – what can we learn from the IPCC Special Report on extremes?”, held last week in the city of São Paulo.

The event was intended to debate the conclusions of the Special Report on Managing the Risks of Climate Extremes and Disasters (SREX) – recently prepared and published by the Intergovernmental Panel on Climate Change (IPCC) – and to discuss options for managing the impacts of climate extremes, especially in South and Central America.

The workshop was organized by FAPESP and the Instituto Nacional de Pesquisas Espaciais (Inpe), in partnership with the IPCC, the Overseas Development Institute (ODI) and the Climate and Development Knowledge Network (CDKN), both based in the United Kingdom, with support from the Climate and Pollution Agency of the Norwegian Ministry of Foreign Affairs.

During the event, the topic of communication was debated by IPCC-SREX authors, specialists in climate extremes, managers and leaders of disaster-prevention institutions.

According to Vicente Barros, of the Center for Sea and Atmosphere Research at the University of Buenos Aires, the IPCC, of which he is a member, began a restructuring process three years ago that includes a change in its communication strategy.

“Starting in 2009, the IPCC came under violent attack and we were not prepared for that, because our role was to disseminate the knowledge we had acquired, not to translate it for the press. We now have a group of journalists who try to do that mediation, but we cannot dilute the information too much, and the final word on how the communication is framed always rests with the executive committee, because what the panel says carries great political weight,” Barros said.

Language is a major problem, according to Barros. If it is too complex, it does not reach the public. If it is too simplified, it tends to distort the conclusions and to spread views that do not correspond to reality.

“The IPCC deals with very complex problems and we admit that we cannot produce outreach that reaches everyone. That is a problem. I believe communication should remain in the hands of journalists, but perhaps we need to invest in training initiatives for these professionals,” he said.

Fábio Feldman, of the Fórum Paulista de Mudanças Climáticas (São Paulo Forum on Climate Change), expressed concern about scientists’ difficulties in communicating with the public, which, in his view, allow “skeptic” researchers – that is, those who deny human influence on climate change – to gain ever more space in the media and in public debate.

“I am concerned about the growing space given to denialists in public debate. The press believes it must necessarily apply the principle of hearing both sides, giving equal space and weight to the different positions in the debate,” he said.

According to Feldman, scientists – especially those linked to the IPCC – should take a more proactive stance in countering the “skeptics” in public debate.

Different positions

For Reynaldo Luiz Victoria, of the coordination of the FAPESP Research Program on Global Climate Change, it is important that the press treat the different positions more equitably.

“There are specific cases in which the press treats issues in a less than equitable – and occasionally sensationalist – way, but I think that we, as researchers, are not obliged to react. The press should seek us out to provide the counterpoint and enlighten the public,” Victoria told Agência FAPESP.

Victoria stressed, however, the importance of also hearing the “skeptics.” “Some are serious scientists and deserve equitable treatment. They certainly cannot be ignored, but when they make contestable claims, the press should look for someone who can offer a counterpoint. Journalists need to come to us, not the other way around,” he said.

Overall, press coverage of climate change is satisfactory, according to Victoria. “The good newspapers publish accurate articles and there are very serious journalists producing high-quality material,” he noted.

For Luci Hidalgo Nunes, a professor in the Department of Geography at the Universidade Estadual de Campinas (Unicamp), denialists gain space because polemical discourse often has more media appeal than the complexity of scientific knowledge.

“A scientist may have a well-grounded argument that the public nonetheless finds tedious. Meanwhile, a researcher with poorly structured arguments can deliver a message that is simplified, and therefore attractive to the public, and polemical, which makes headlines,” she told Agência FAPESP.

Although good science carries an inherent disadvantage in public debate because of its complexity, Nunes believes it is important for the press to remain pluralistic. She published a study analyzing one year of coverage of climate change in the newspaper O Estado de S. Paulo. According to Nunes, one of the main strengths observed was that the coverage gave voice to the different positions.

“I am in favor of the press doing its job and laying out all the parameters so that there can be a democratic debate. I think this is being done well, and the press itself is open to giving us more space. But we need to speak up in order to create those opportunities,” she said.

Nunes also considers that press coverage of climate change has, in general, been satisfactory, albeit uneven. “The subject gains prominence at certain moments, but it does not stay on the news agenda permanently,” she said.

According to her, the subject stood out especially in 2007, with the publication of the IPCC’s Fourth Assessment Report, and in 2012 during Rio+20.

“In 2007, coverage was intense, but the popularization of the subject also gave rise to distortions and exaggerations. Sensationalism is bad for science, because it pushes the topic into the headlines for a while, but in the medium term the effect is the opposite: people notice the exaggerations and begin to view scientific results in general with distrust,” she said.

EBay bans sale of spells and hexes (CNN)

By Erin Kim @CNNMoneyTech August 16, 2012: 4:27 PM ET

Starting in September, eBay is blocking the sale of potions and other magical goods.

NEW YORK (CNNMoney) — Sorry, love spell vendors: eBay is cracking down on the sale of magical wares.

Beginning in September, the site is banning the sale of “advice, spells, curses, hexing, conjuring, magic, prayers, blessing services, magic potions, [and] healing sessions,” according to a policy update.

The company is also eliminating its category listings for psychic readings and tarot card sessions.

The update is a part of a “multi-year effort…to build trust in the marketplace and support sellers,” eBay wrote in its company blog.

Has anyone actually been buying magic on eBay? It seems so: The site’s “spells and potions” category currently has more than 6,000 active listings and happy feedback from quite a few satisfied buyers.

“Best spell caster on Ebay,” one customer wrote after a recent purchase.

“Wonderful post-spells communication!” another raved. “We bought 4 spells! Highly Recommend!”

Spells and hexes aside, eBay is rolling out a long list of rule tweaks, as it does several times a year. For example, buyers will now be required to contact sellers before getting eBay involved with any issues regarding a purchase. Sellers will also be subject to a fee for ending an auction earlier than planned.

EBay also banned the sale of “work from home businesses & information,” a category that is often abused by scammers.

EBay isn’t the only online marketplace culling its listings. Etsy, a platform for homemade goods, also recently prohibited the sale of various items, including drug paraphernalia and body parts.

First Published: August 16, 2012: 4:27 PM ET

*   *   *

Etsy blocks sales of drugs and human remains

By Erin Kim @CNNMoneyTech August 10, 2012: 5:55 PM ET

NEW YORK (CNNMoney) — Etsy has become the go-to spot for homemade jewelry, knickknacks and household goods. Apparently, some have also been using the online marketplace to sell everything from drugs to human remains.

Now Etsy is cracking down.

The online marketplace recently revised its policies, excluding from its list of sellable items such products as tobacco, hazardous materials and body parts. (Hair and teeth are still OK).

“Odd as it may sound, we’ve spent long hours over the past several months extensively researching some offbeat and fascinating topics, from issues surrounding the sale of human bones to the corrosive and toxic properties of mercury,” the company wrote on its official blog on Wednesday.

Etsy says the changes are made in order to comply with legal rules and restrictions.

“But beyond that, when it comes right down to it, some things just aren’t in the spirit of Etsy,” the online company wrote. “While we understand that it is possible for certain items to be carefully and legally bought and sold, Etsy is just not the right venue for them.”

The new policy prohibits the sale of human body parts, including but not limited to “things such as skulls, bones, articulated skeletons, bodily fluids, preserved tissues or organs, and other similar products.”

Etsy banned most drug paraphernalia, though the company said it is not explicitly banning the sale of medical drugs. Instead, it’s asking that sellers remove any claims of “cure or relief of a health condition or illness.”

That set off a slew of angry posts from Etsy sellers in the company’s public forums.

“Now I need to change near[ly] a quarter of my listings or remove them,” wrote Etsy user Chrissy-jo, who operates an online store called KindredImages. “How am I going [to] explain the use of a salve or even an aromatherapy eye pillow without making the claim that it aids in healing wounds or it helps relieve migraines?”

Another Etsy user named Irina, who runs PheonixBotanicals, wrote: “As an herbal crafter, I find the idea of being banned from listing traditional uses and folklore of plants quite disheartening.”

Sellers on Etsy operate their own shops, where they vend goods that are usually homemade. The online store plans to reach out to individual sellers to ask them to either remove a problematic listing or make changes to align with the company’s policy.

First Published: August 10, 2012: 4:10 PM ET

Rooting out Rumors, Epidemics, and Crime — With Math (Science Daily)

ScienceDaily (Aug. 10, 2012) — A team of EPFL scientists has developed an algorithm that can identify the source of an epidemic or information circulating within a network, a method that could also be used to help with criminal investigations.

Investigators are well aware of how difficult it is to trace an unlawful act to its source. The job was arguably easier with old, Mafia-style criminal organizations, as their hierarchical structures more or less resembled predictable family trees.

In the Internet age, however, the networks used by organized criminals have changed. Innumerable nodes and connections escalate the complexity of these networks, making it ever more difficult to root out the guilty party. EPFL researcher Pedro Pinto of the Audiovisual Communications Laboratory and his colleagues have developed an algorithm that could become a valuable ally for investigators, criminal or otherwise, as long as a network is involved. The team’s research was published August 10, 2012, in the journal Physical Review Letters.

Finding the source of a Facebook rumor

“Using our method, we can find the source of all kinds of things circulating in a network just by ‘listening’ to a limited number of members of that network,” explains Pinto. Suppose you come across a rumor about yourself that has spread on Facebook and been sent to 500 people — your friends, or even friends of your friends. How do you find the person who started the rumor? “By looking at the messages received by just 15-20 of your friends, and taking into account the time factor, our algorithm can trace the path of that information back and find the source,” Pinto adds. This method can also be used to identify the origin of a spam message or a computer virus using only a limited number of sensors within the network.
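
To make the idea concrete, here is a minimal, self-contained sketch of this kind of back-tracing. It is not Pinto’s published estimator, only the intuition behind it, and the graph, observer nodes and arrival times are all made up for illustration: each candidate source is scored by how well its graph distances to a few observers line up with the times at which those observers received the message.

```python
# Toy sketch of source localization from a few observers (not the published
# estimator, just its intuition): assume the rumor spreads outward at a
# roughly constant rate, so an observer's arrival time grows with its graph
# distance from the source. Each candidate source is scored by how well its
# shortest-path distances to the observers fit the observed arrival times.
from collections import deque

def bfs_distances(graph, start):
    """Hop distance from `start` to every reachable node."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def estimate_source(graph, arrival_times):
    """arrival_times maps a few observer nodes to when the message reached them."""
    best, best_score = None, float("inf")
    for candidate in graph:
        dist = bfs_distances(graph, candidate)
        if not all(obs in dist for obs in arrival_times):
            continue
        # Fit arrival_time ~ offset + rate * distance, then score the residuals.
        pairs = [(dist[obs], t) for obs, t in arrival_times.items()]
        n = len(pairs)
        mean_d = sum(d for d, _ in pairs) / n
        mean_t = sum(t for _, t in pairs) / n
        var_d = sum((d - mean_d) ** 2 for d, _ in pairs) or 1e-9
        rate = sum((d - mean_d) * (t - mean_t) for d, t in pairs) / var_d
        offset = mean_t - rate * mean_d
        score = sum((t - (offset + rate * d)) ** 2 for d, t in pairs)
        if score < best_score:
            best, best_score = candidate, score
    return best

# Tiny example: a path graph 0-1-2-3-4-5 with the rumor started at node 2.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
observed = {0: 2.1, 4: 2.0, 5: 2.9}   # hypothetical arrival times at 3 observers
print(estimate_source(graph, observed))  # -> 2
```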

Trace the propagation of an epidemic

Out in the real world, the algorithm can be employed to find the primary source of an infectious disease, such as cholera. “We tested our method with data on an epidemic in South Africa provided by EPFL professor Andrea Rinaldo’s Ecohydrology Laboratory,” says Pinto. “By modeling water networks, river networks, and human transport networks, we were able to find the spot where the first cases of infection appeared by monitoring only a small fraction of the villages.”

The method would also be useful in responding to terrorist attacks, such as the 1995 sarin gas attack in the Tokyo subway, in which poisonous gas released in the city’s subterranean tunnels killed 13 people and injured nearly 1,000 more. “Using this algorithm, it wouldn’t be necessary to equip every station with detectors. A sample would be sufficient to rapidly identify the origin of the attack, and action could be taken before it spreads too far,” says Pinto.

Identifying the brains behind a terrorist attack

Computer simulations of the telephone conversations that could have occurred during the terrorist attacks on September 11, 2001, were used to test Pinto’s system. “By reconstructing the message exchange inside the 9/11 terrorist network extracted from publicly released news, our system spit out the names of three potential suspects — one of whom was found to be the mastermind of the attacks, according to the official enquiry.”

The validity of this method thus has been proven a posteriori. But according to Pinto, it could also be used preventatively — for example, to understand an outbreak before it gets out of control. “By carefully selecting points in the network to test, we could more rapidly detect the spread of an epidemic,” he points out. It could also be a valuable tool for advertisers who use viral marketing strategies by leveraging the Internet and social networks to reach customers. For example, this algorithm would allow them to identify the specific Internet blogs that are the most influential for their target audience and to understand how information in these articles spreads throughout the online community.

NOAA Raises Hurricane Season Prediction Despite Expected El Niño (Science Daily)

ScienceDaily (Aug. 10, 2012) — This year’s Atlantic hurricane season got off to a busy start, with 6 named storms to date, and may have a busy second half, according to the updated hurricane season outlook issued Aug. 9, 2012 by NOAA’s Climate Prediction Center, a division of the National Weather Service. The updated outlook still indicates a 50 percent chance of a near-normal season, but increases the chance of an above-normal season to 35 percent and decreases the chance of a below-normal season to only 15 percent from the initial outlook issued in May.

Satellite image of Hurricane Ernesto taken on Aug. 7, 2012 in the Gulf of Mexico. (Credit: NOAA)

Across the entire Atlantic Basin for the season — June 1 to November 30 — NOAA’s updated seasonal outlook projects a total (which includes the activity-to-date of tropical storms Alberto, Beryl, Debby, Florence and hurricanes Chris and Ernesto) of:

  • 12 to 17 named storms (top winds of 39 mph or higher), including:
  • 5 to 8 hurricanes (top winds of 74 mph or higher), of which:
  • 2 to 3 could be major hurricanes (Category 3, 4 or 5; winds of at least 111 mph)

The numbers are higher than in the initial outlook issued in May, which called for 9-15 named storms, 4-8 hurricanes and 1-3 major hurricanes. Based on a 30-year average, a normal Atlantic hurricane season produces 12 named storms, six hurricanes, and three major hurricanes.

“We are increasing the likelihood of an above-normal season because storm-conducive wind patterns and warmer-than-normal sea surface temperatures are now in place in the Atlantic,” said Gerry Bell, Ph.D., lead seasonal hurricane forecaster at the Climate Prediction Center. “These conditions are linked to the ongoing high activity era for Atlantic hurricanes that began in 1995. Also, strong early-season activity is generally indicative of a more active season.”

However, NOAA seasonal climate forecasters also announced today that El Niño will likely develop in August or September.

“El Niño is a competing factor, because it strengthens the vertical wind shear over the Atlantic, which suppresses storm development. However, we don’t expect El Niño’s influence until later in the season,” Bell said.

“We have a long way to go until the end of the season, and we shouldn’t let our guard down,” said Laura Furgione, acting director of NOAA’s National Weather Service. “Hurricanes often bring dangerous inland flooding as we saw a year ago in the Northeast with Hurricane Irene and Tropical Storm Lee. Even people who live hundreds of miles from the coast need to remain vigilant through the remainder of the season.”

“It is never too early to prepare for a hurricane,” said Tim Manning, FEMA’s deputy administrator for protection and national preparedness. “We are in the middle of hurricane season and now is the time to get ready. There are easy steps you can take to get yourself and your family prepared. Visit www.ready.gov to learn more.”

How Computation Can Predict Group Conflict: Fighting Among Captive Pigtailed Macaques Provides Clues (Science Daily)

ScienceDaily (Aug. 13, 2012) — When conflict breaks out in social groups, individuals make strategic decisions about how to behave based on their understanding of alliances and feuds in the group.

Researchers studied fighting among captive pigtailed macaques for clues about behavior and group conflict. (Credit: iStockphoto/Natthaphong Phanthumchinda)

But it’s been challenging to quantify the underlying trends that dictate how individuals make predictions, given they may only have seen a small number of fights or have limited memory.

In a new study, scientists at the Wisconsin Institute for Discovery (WID) at UW-Madison develop a computational approach to determine whether individuals behave predictably. With data from previous fights, the team looked at how much memory individuals in the group would need to make predictions themselves. The analysis proposes a novel estimate of “cognitive burden,” or the minimal amount of information an organism needs to remember to make a prediction.

The research draws from a concept called “sparse coding,” or the brain’s tendency to use fewer visual details and a small number of neurons to stow an image or scene. Previous studies support the idea that neurons in the brain react to a few large details such as the lines, edges and orientations within images rather than many smaller details.

“So what you get is a model where you have to remember fewer things but you still get very high predictive power — that’s what we’re interested in,” says Bryan Daniels, a WID researcher who led the study. “What is the trade-off? What’s the minimum amount of ‘stuff’ an individual has to remember to make good inferences about future events?”

To find out, Daniels — along with WID co-authors Jessica Flack and David Krakauer — drew comparisons from how brains and computers encode information. The results contribute to ongoing discussions about conflict in biological systems and how cognitive organisms understand their environments.

The study, published in the Aug. 13 edition of the Proceedings of the National Academy of Sciences, examined observed bouts of natural fighting in a group of 84 captive pigtailed macaques at the Yerkes National Primate Research Center. By recording individuals’ involvement — or lack thereof — in fights, the group created models that mapped the likelihood any number of individuals would engage in conflict in hypothetical situations.

To confirm the predictive power of the models, the group plugged in other data from the monkey group that was not used to create the models. Then, researchers compared these simulations with what actually happened in the group. One model looked at conflict as combinations of pairs, while another represented fights as sparse combinations of clusters, which proved to be a better tool for predicting fights. From there, by removing information until predictions became worse, Daniels and colleagues calculated the amount of information each individual needed to remember to make the most informed decision whether to fight or flee.

“We know the monkeys are making predictions, but we don’t know how good they are,” says Daniels. “But given this data, we found that the most memory it would take to figure out the regularities is about 1,000 bits of information.”
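
As a purely illustrative back-of-the-envelope exercise (not the models used in the PNAS paper), the sketch below shows one way a “bits of memory” figure can be counted: storing every fight verbatim, one presence/absence bit per group member, versus storing fights as combinations of a few recurring coalitions plus a one-off membership table. The coalitions and fights in it are hypothetical.

```python
# Back-of-the-envelope illustration only (not the study's models): count the
# bits needed to remember every fight verbatim versus remembering fights as
# combinations of a few recurring coalitions. Coalitions and fights below
# are hypothetical.
import math

GROUP_SIZE = 84  # individuals in the study group

clusters = [{1, 2, 3}, {10, 11}, {20, 21, 22, 23}, {40, 41}]  # hypothetical coalitions

fights = [                      # hypothetical observed fights (participant sets)
    {1, 2, 3, 10, 11},
    {20, 21, 22, 23},
    {1, 2, 3, 40, 41},
    {10, 11, 20, 21, 22, 23},
]

# Verbatim code: one presence/absence bit per individual per fight.
verbatim_bits = len(fights) * GROUP_SIZE

# Sparse code: one bit per coalition per fight, plus a one-off cost of
# remembering which individuals belong to each coalition.
membership_bits = sum(len(c) for c in clusters) * math.ceil(math.log2(GROUP_SIZE))
sparse_bits = len(fights) * len(clusters) + membership_bits

print(f"verbatim encoding:       {verbatim_bits} bits")
print(f"sparse cluster encoding: {sparse_bits} bits")
```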

Sparse coding appears to be a strong candidate for explaining the mechanism at play in the monkey group, but the team points out that it is only one possible way to encode conflict.

Because the statistical modeling and computation frameworks can be applied to different natural datasets, the research has the potential to influence other fields of study, including behavioral science, cognition, computation, game theory and machine learning. Such models might also be useful in studying collective behaviors in other complex systems, ranging from neurons to bird flocks.

Future research will seek to find out how individuals’ knowledge of alliances and feuds fine-tunes their own decisions and changes the group’s collective pattern of conflict.

The research was supported by the National Science Foundation, the John Templeton Foundation through the Santa Fe Institute, and UW-Madison.

How Do They Do It? Predictions Are in for Arctic Sea Ice Low Point (Science Daily)

ScienceDaily (Aug. 14, 2012) — It’s become a sport of sorts, predicting the low point of Arctic sea ice each year. Expert scientists with decades of experience do it but so do enthusiasts, whose guesses are gamely included in a monthly predictions roundup collected by Sea Ice Outlook, an effort supported by the U.S. government.

Arctic sea ice, as seen from an ice breaker. (Credit: Bonnie Light, UW)

When averaged, the predictions have come in remarkably close to the mark in the past two years. But the low and high predictions are off by hundreds of thousands of square kilometers.

Researchers are working hard to improve their ability to more accurately predict how much Arctic sea ice will remain at the end of summer. It’s an important exercise because knowing why sea ice declines could help scientists better understand climate change and how sea ice is evolving.

This year, researchers from the University of Washington’s Polar Science Center are the first to include new NASA sea ice thickness data collected by airplane in a prediction.

They expect 4.4 million square kilometers of remaining ice (about 1.7 million square miles), just barely more than the 4.3 million square kilometers in 2007, the lowest year on record for Arctic sea ice. The median of 23 predictions collected by the Sea Ice Outlook and released on Aug. 13 is 4.3 million.

“One drawback to making predictions is historically we’ve had very little information about the thickness of the ice in the current year,” said Ron Lindsay, a climatologist at the Polar Science Center, a department in the UW’s Applied Physics Laboratory.

To make their prediction, Lindsay and Jinlun Zhang, an oceanographer in the Polar Science Center, start with a widely used model pioneered by Zhang and known as the Pan-Arctic Ice Ocean Modeling and Assimilation System. That system combines available observations with a model to track sea ice volume, which includes both ice thickness and extent.

But obtaining observations about current-year ice thickness in order to build their short-term prediction is tough. NASA is currently in the process of designing a new satellite that will replace one that used to deliver ice thickness data but has since failed. In the meantime, NASA is running a program called Operation IceBridge that uses airplanes to survey sea ice as well as Arctic ice sheets.

“This is the first year they made a concerted effort to get the data from the aircraft, process it and get it into hands of scientists in a timely manner,” Lindsay said. “In the past, we’ve gotten data from submarines, moorings or satellites but none of that data was available in a timely manner. It took months or even years.”

There’s a shortcoming to the IceBridge data, however: It’s only available through March. The radar used to measure snow depth on the surface of the ice, an important element in the observation system, has trouble accurately gauging the depth once it has melted and so the data is only collected through the early spring before the thaw.

The UW scientists have developed a method for informing their prediction that is starting to be used by others. Researchers have struggled with how best to forecast the weather in the Arctic, which affects ice melt and distribution.

“Jinlun came up with the idea of using the last seven summers. Because the climate is changing so fast, only the recent summers are probably relevant,” Lindsay said.

The result is seven different possibilities of what might happen. “The average of those is our best guess,” Lindsay said.
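
A minimal sketch of that ensemble idea follows. The real forecasts come from the PIOMAS ice-ocean model forced with each of the last seven summers’ weather; the numbers below are placeholders standing in for actual model runs.

```python
# Sketch of the seven-summer ensemble described above. The real forecasts
# come from the PIOMAS ice-ocean model; the numbers here are placeholders
# standing in for what a run forced with each recent summer's weather might
# produce (September extent, in millions of square kilometers).
hypothetical_runs = {
    2005: 4.9, 2006: 4.8, 2007: 4.0, 2008: 4.4,
    2009: 4.6, 2010: 4.3, 2011: 4.1,
}

prediction = sum(hypothetical_runs.values()) / len(hypothetical_runs)
spread = max(hypothetical_runs.values()) - min(hypothetical_runs.values())

print(f"best guess: {prediction:.2f} million km^2 (spread {spread:.1f})")
```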

Despite the progress in making predictions, the researchers say their ability to foretell the future will always be limited. Because they can’t forecast the weather very far in advance and because the ice is strongly affected by winds, they have little confidence in predictions made far in advance, beyond what the long-term trend indicates.

“The accuracy of our prediction really depends on time,” Zhang said. “Our June 1 prediction for the Sept. 15 low point has high uncertainty but as we approach the end of June or July, the uncertainty goes down and the accuracy goes up.”

In hindsight, that’s true historically for the average predictions collected by Study of Environmental Arctic Change’s Sea Ice Outlook, a project funded by the National Science Foundation and the National Oceanic and Atmospheric Administration.

While the competitive aspect of the predictions is fun, the researchers aren’t in it to win it.

“Essentially it’s not for prediction but for understanding,” Zhang said. “We do it to improve our understanding of sea ice processes, in terms of how dynamic processes affect the seasonal evolution of sea ice.”

That may not be entirely the same for the enthusiasts who contribute a prediction. One climate blog polls readers in the summer for their best estimate of the sea ice low point. It’s included among the predictions collected by the Sea Ice Outlook, with an asterisk noting it as a “public outlook.”

The National Science Foundation and NASA fund the UW research into the Arctic sea ice low point.

Need an Expert? Try the Crowd (Science Daily)

ScienceDaily (Aug. 14, 2012) — “It’s potentially a new way to do science.”

In 1714, the British government held a contest. They offered a large cash prize to anyone who could solve the vexing “longitude problem” — how to determine a ship’s east/west position on the open ocean — since none of their naval experts had been able to do so.

Lots of people gave it a try. One of them, a self-educated carpenter named John Harrison, invented the marine chronometer — a rugged and highly precise clock — that did the trick. For the first time, sailors could accurately determine their location at sea.

A centuries-old problem was solved. And, arguably, crowdsourcing was born.

Crowdsourcing is basically what it sounds like: posing a question or asking for help from a large group of people. Coined as a term in 2006, crowdsourcing has taken off in the internet era. Think of Wikipedia, and its thousands of unpaid contributors, now vastly larger than the Encyclopedia Britannica.

Crowdsourcing has allowed many problems to be solved that would be impossible for experts alone. Astronomers rely on an army of volunteers to scan for new galaxies. At climateprediction.net, citizens have linked their home computers to yield more than a hundred million hours of climate modeling; it’s the world’s largest forecasting experiment.

But what if experts didn’t simply ask the crowd to donate time or answer questions? What if the crowd was asked to decide what questions to ask in the first place?

Could the crowd itself be the expert?

That’s what a team at the University of Vermont decided to explore — and the answer seems to be yes.

Prediction from the people

Josh Bongard and Paul Hines, professors in UVM’s College of Engineering and Mathematical Sciences, and their students set out to discover whether volunteers who visited two different websites could pose, refine, and answer one another’s questions – questions that could effectively predict the volunteers’ body weight and home electricity use.

The experiment, the first of its kind, was a success: the self-directed questions and answers by visitors to the websites led to computer models that effectively predict users’ monthly electricity consumption and body mass index.

Their results, “Crowdsourcing Predictors of Behavioral Outcomes,” were published in a recent edition of IEEE Transactions on Systems, Man, and Cybernetics, a journal of the Institute of Electrical and Electronics Engineers.

“It’s proof of concept that a crowd actually can come up with good questions that lead to good hypotheses,” says Bongard, an expert on machine science.

In other words, the wisdom of the crowd can be harnessed to determine which variables to study, the UVM project shows — and at the same time provide a pool of data by responding to the questions they ask of each other.

“The result is a crowdsourced predictive model,” the Vermont scientists write.
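
A minimal sketch of what such a pipeline can look like appears below, with hypothetical questions and made-up responses rather than the UVM study’s data: answers to crowd-posed questions form the columns of a response matrix, and a simple least-squares fit shows how much weight each question carries in predicting the outcome.

```python
# Minimal sketch of a crowdsourced predictive model, with hypothetical
# questions and made-up responses (not the UVM study's data). Answers to
# crowd-posed questions form the columns of a response matrix; a simple
# least-squares fit then shows how much weight each question carries.
import numpy as np

questions = [
    "Do you think of yourself as overweight? (0/1)",
    "How many meals do you eat per day?",
    "Hours of TV per day?",
]

# Rows = respondents, columns = answers to the questions above (hypothetical).
X = np.array([
    [1, 3, 4.0],
    [0, 3, 1.0],
    [1, 4, 3.5],
    [0, 2, 1.5],
    [1, 5, 5.0],
    [0, 3, 2.0],
], dtype=float)
bmi = np.array([31.0, 22.5, 29.0, 21.0, 33.5, 23.0])  # self-reported outcome

# Fit BMI ~ intercept + answers.
A = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(A, bmi, rcond=None)

print(f"intercept: {coefs[0]:+.2f}")
for question, weight in zip(questions, coefs[1:]):
    print(f"{question:48s} weight {weight:+.2f}")
```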

Unexpected angles

Some of the questions the volunteers posed were obvious. For example, on the website dedicated to exploring body weight, visitors came up with the question: “Do you think of yourself as overweight?” And, no surprise, that proved to be the question with the most power to predict people’s body weight.

But some questions posed by the volunteers were less obvious. “We had some eye-openers,” Bongard says. “How often do you masturbate a month?” might not be the first question asked by weight-loss experts, but it proved to be the second-most-predictive question of the volunteers’ self-reported weights — more predictive than “how often do you eat during a day?”

“Sometimes the general public has intuition about stuff that experts miss — there’s a long literature on this,” Hines says.

“It’s those people who are very underweight or very overweight who might have an explanation for why they’re at these extremes — and some of those explanations might not be a simple combination of diet and exercise,” says Bongard. “There might be other things that experts missed.”

Cause and correlation

The researchers are quick to note that the variables revealed by the evolving Q&A on the experimental websites are simply correlated to outcomes — body weight and electricity use — not necessarily the cause.

“We’re not arguing that this study is actually predictive of the causes,” says Hines, “but improvements to this method may lead in that direction.”

Nor do the scientists make claim to being experts on body weight or to be providing recommendations on health or diet (though Hines is an expert on electricity, and the EnergyMinder site he and his students developed for this project has a larger aim to help citizens understand and reduce their household energy use.)

“We’re simply investigating the question: could you involve participants in the hypothesis-generation part of the scientific process?” Bongard says. “Our paper is a demonstration of this methodology.”

“Going forward, this approach may allow us to involve the public in deciding what it is that is interesting to study,” says Hines. “It’s potentially a new way to do science.”

And there are many reasons why this new approach might be helpful. In addition to forces that experts might simply not know about — “can we elicit unexpected predictors that an expert would not have come up with sitting in his office?” Hines asks — experts often have deeply held biases.

Faster discoveries

But the UVM team primarily sees their new approach as potentially helping to accelerate the process of scientific discovery. The need for expert involvement — in shaping, say, what questions to ask on a survey or what variable to change to optimize an engineering design — “can become a bottleneck to new insights,” the scientists write.

“We’re looking for an experimental platform where, instead of waiting to read a journal article every year about what’s been learned about obesity,” Bongard says, “a research site could be changing and updating new findings constantly as people add their questions and insights.”

The goal: “exponential rises,” the UVM scientists write, in the discovery of what causes behaviors and patterns — probably driven by the people who care about them the most. For example, “it might be smokers or people suffering from various diseases,” says Bongard. The team thinks this new approach to science could “mirror the exponential growth found in other online collaborative communities,” they write.

“We’re all problem-solving animals,” says Bongard, “so can we exploit that? Instead of just exploiting the cycles of your computer or your ability to say ‘yes’ or ‘no’ on a survey — can we exploit your creative brain?”

Global Warming’s Terrifying New Math (Rolling Stone)

Three simple numbers that add up to global catastrophe – and that make clear who the real enemy is

by: Bill McKibben

Illustration by Edel Rodriguez

If the pictures of those towering wildfires in Colorado haven’t convinced you, or the size of your AC bill this summer, here are some hard numbers about climate change: June broke or tied 3,215 high-temperature records across the United States. That followed the warmest May on record for the Northern Hemisphere – the 327th consecutive month in which the temperature of the entire globe exceeded the 20th-century average, the odds of which occurring by simple chance were about one in 3.7 x 10^99 – a figure considerably larger than the number of stars in the universe.

Meteorologists reported that this spring was the warmest ever recorded for our nation – in fact, it crushed the old record by so much that it represented the “largest temperature departure from average of any season on record.” The same week, Saudi authorities reported that it had rained in Mecca despite a temperature of 109 degrees, the hottest downpour in the planet’s history.

Not that our leaders seemed to notice. Last month the world’s nations, meeting in Rio for the 20th-anniversary reprise of a massive 1992 environmental summit, accomplished nothing. Unlike George H.W. Bush, who flew in for the first conclave, Barack Obama didn’t even attend. It was “a ghost of the glad, confident meeting 20 years ago,” the British journalist George Monbiot wrote; no one paid it much attention, footsteps echoing through the halls “once thronged by multitudes.” Since I wrote one of the first books for a general audience about global warming way back in 1989, and since I’ve spent the intervening decades working ineffectively to slow that warming, I can say with some confidence that we’re losing the fight, badly and quickly – losing it because, most of all, we remain in denial about the peril that human civilization is in.

When we think about global warming at all, the arguments tend to be ideological, theological and economic. But to grasp the seriousness of our predicament, you just need to do a little math. For the past year, an easy and powerful bit of arithmetical analysis first published by financial analysts in the U.K. has been making the rounds of environmental conferences and journals, but it hasn’t yet broken through to the larger public. This analysis upends most of the conventional political thinking about climate change. And it allows us to understand our precarious – our almost-but-not-quite-finally hopeless – position with three simple numbers.

The First Number: 2° Celsius

If the movie had ended in Hollywood fashion, the Copenhagen climate conference in 2009 would have marked the culmination of the global fight to slow a changing climate. The world’s nations had gathered in the December gloom of the Danish capital for what a leading climate economist, Sir Nicholas Stern of Britain, called the “most important gathering since the Second World War, given what is at stake.” As Danish energy minister Connie Hedegaard, who presided over the conference, declared at the time: “This is our chance. If we miss it, it could take years before we get a new and better one. If ever.”

In the event, of course, we missed it. Copenhagen failed spectacularly. Neither China nor the United States, which between them are responsible for 40 percent of global carbon emissions, was prepared to offer dramatic concessions, and so the conference drifted aimlessly for two weeks until world leaders jetted in for the final day. Amid considerable chaos, President Obama took the lead in drafting a face-saving “Copenhagen Accord” that fooled very few. Its purely voluntary agreements committed no one to anything, and even if countries signaled their intentions to cut carbon emissions, there was no enforcement mechanism. “Copenhagen is a crime scene tonight,” an angry Greenpeace official declared, “with the guilty men and women fleeing to the airport.” Headline writers were equally brutal: COPENHAGEN: THE MUNICH OF OUR TIMES? asked one.

The accord did contain one important number, however. In Paragraph 1, it formally recognized “the scientific view that the increase in global temperature should be below two degrees Celsius.” And in the very next paragraph, it declared that “we agree that deep cuts in global emissions are required… so as to hold the increase in global temperature below two degrees Celsius.” By insisting on two degrees – about 3.6 degrees Fahrenheit – the accord ratified positions taken earlier in 2009 by the G8, and the so-called Major Economies Forum. It was as conventional as conventional wisdom gets. The number first gained prominence, in fact, at a 1995 climate conference chaired by Angela Merkel, then the German minister of the environment and now the center-right chancellor of the nation.

Some context: So far, we’ve raised the average temperature of the planet just under 0.8 degrees Celsius, and that has caused far more damage than most scientists expected. (A third of summer sea ice in the Arctic is gone, the oceans are 30 percent more acidic, and since warm air holds more water vapor than cold, the atmosphere over the oceans is a shocking five percent wetter, loading the dice for devastating floods.) Given those impacts, in fact, many scientists have come to think that two degrees is far too lenient a target. “Any number much above one degree involves a gamble,” writes Kerry Emanuel of MIT, a leading authority on hurricanes, “and the odds become less and less favorable as the temperature goes up.” Thomas Lovejoy, once the World Bank’s chief biodiversity adviser, puts it like this: “If we’re seeing what we’re seeing today at 0.8 degrees Celsius, two degrees is simply too much.” NASA scientist James Hansen, the planet’s most prominent climatologist, is even blunter: “The target that has been talked about in international negotiations for two degrees of warming is actually a prescription for long-term disaster.” At the Copenhagen summit, a spokesman for small island nations warned that many would not survive a two-degree rise: “Some countries will flat-out disappear.” When delegates from developing nations were warned that two degrees would represent a “suicide pact” for drought-stricken Africa, many of them started chanting, “One degree, one Africa.”

Despite such well-founded misgivings, political realism bested scientific data, and the world settled on the two-degree target – indeed, it’s fair to say that it’s the only thing about climate change the world has settled on. All told, 167 countries responsible for more than 87 percent of the world’s carbon emissions have signed on to the Copenhagen Accord, endorsing the two-degree target. Only a few dozen countries have rejected it, including Kuwait, Nicaragua and Venezuela. Even the United Arab Emirates, which makes most of its money exporting oil and gas, signed on. The official position of planet Earth at the moment is that we can’t raise the temperature more than two degrees Celsius – it’s become the bottomest of bottom lines. Two degrees.

The Second Number: 565 Gigatons

Scientists estimate that humans can pour roughly 565 more gigatons of carbon dioxide into the atmosphere by midcentury and still have some reasonable hope of staying below two degrees. (“Reasonable,” in this case, means four chances in five, or somewhat worse odds than playing Russian roulette with a six-shooter.)

This idea of a global “carbon budget” emerged about a decade ago, as scientists began to calculate how much oil, coal and gas could still safely be burned. Since we’ve increased the Earth’s temperature by 0.8 degrees so far, we’re currently less than halfway to the target. But, in fact, computer models calculate that even if we stopped increasing CO2 now, the temperature would likely still rise another 0.8 degrees, as previously released carbon continues to overheat the atmosphere. That means we’re already three-quarters of the way to the two-degree target.

How good are these numbers? No one is insisting that they’re exact, but few dispute that they’re generally right. The 565-gigaton figure was derived from one of the most sophisticated computer-simulation models that have been built by climate scientists around the world over the past few decades. And the number is being further confirmed by the latest climate-simulation models currently being finalized in advance of the next report by the Intergovernmental Panel on Climate Change. “Looking at them as they come in, they hardly differ at all,” says Tom Wigley, an Australian climatologist at the National Center for Atmospheric Research. “There’s maybe 40 models in the data set now, compared with 20 before. But so far the numbers are pretty much the same. We’re just fine-tuning things. I don’t think much has changed over the last decade.” William Collins, a senior climate scientist at the Lawrence Berkeley National Laboratory, agrees. “I think the results of this round of simulations will be quite similar,” he says. “We’re not getting any free lunch from additional understanding of the climate system.”

We’re not getting any free lunch from the world’s economies, either. With only a single year’s lull in 2009 at the height of the financial crisis, we’ve continued to pour record amounts of carbon into the atmosphere, year after year. In late May, the International Energy Agency published its latest figures – CO2 emissions last year rose to 31.6 gigatons, up 3.2 percent from the year before. America had a warm winter and converted more coal-fired power plants to natural gas, so its emissions fell slightly; China kept booming, so its carbon output (which recently surpassed the U.S.) rose 9.3 percent; the Japanese shut down their fleet of nukes post-Fukushima, so their emissions edged up 2.4 percent. “There have been efforts to use more renewable energy and improve energy efficiency,” said Corinne Le Quéré, who runs England’s Tyndall Centre for Climate Change Research. “But what this shows is that so far the effects have been marginal.” In fact, study after study predicts that carbon emissions will keep growing by roughly three percent a year – and at that rate, we’ll blow through our 565-gigaton allowance in 16 years, around the time today’s preschoolers will be graduating from high school. “The new data provide further evidence that the door to a two-degree trajectory is about to close,” said Fatih Birol, the IEA’s chief economist. In fact, he continued, “When I look at this data, the trend is perfectly in line with a temperature increase of about six degrees.” That’s almost 11 degrees Fahrenheit, which would create a planet straight out of science fiction.
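
The arithmetic behind that claim is easy to check. Below is a rough, back-of-the-envelope version, assuming emissions start at the IEA’s 31.6 gigatons per year and grow three percent annually; the exact answer shifts by a year or two depending on the starting point, which is why the article says roughly 16 years.

```python
# Rough check of the arithmetic above: starting from the IEA's 31.6 gigatons
# of CO2 per year and growing 3 percent annually, how long until cumulative
# emissions pass the 565-gigaton budget? (Back-of-the-envelope; the article's
# "16 years" depends on its own starting assumptions.)
emissions = 31.6   # gigatons per year
budget = 565.0     # gigatons remaining
growth = 0.03

total, years = 0.0, 0
while total < budget:
    total += emissions
    emissions *= 1 + growth
    years += 1

print(f"budget exhausted after roughly {years} years")  # lands in the mid-teens
```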

So, new data in hand, everyone at the Rio conference renewed their ritual calls for serious international action to move us back to a two-degree trajectory. The charade will continue in November, when the next Conference of the Parties (COP) of the U.N. Framework Convention on Climate Change convenes in Qatar. This will be COP 18 – COP 1 was held in Berlin in 1995, and since then the process has accomplished essentially nothing. Even scientists, who are notoriously reluctant to speak out, are slowly overcoming their natural preference to simply provide data. “The message has been consistent for close to 30 years now,” Collins says with a wry laugh, “and we have the instrumentation and the computer power required to present the evidence in detail. If we choose to continue on our present course of action, it should be done with a full evaluation of the evidence the scientific community has presented.” He pauses, suddenly conscious of being on the record. “I should say, a fuller evaluation of the evidence.”

So far, though, such calls have had little effect. We’re in the same position we’ve been in for a quarter-century: scientific warning followed by political inaction. Among scientists speaking off the record, disgusted candor is the rule. One senior scientist told me, “You know those new cigarette packs, where governments make them put a picture of someone with a hole in their throats? Gas pumps should have something like that.”

The Third Number: 2,795 Gigatons

This number is the scariest of all – one that, for the first time, meshes the political and scientific dimensions of our dilemma. It was highlighted last summer by the Carbon Tracker Initiative, a team of London financial analysts and environmentalists who published a report in an effort to educate investors about the possible risks that climate change poses to their stock portfolios. The number describes the amount of carbon already contained in the proven coal and oil and gas reserves of the fossil-fuel companies, and the countries (think Venezuela or Kuwait) that act like fossil-fuel companies. In short, it’s the fossil fuel we’re currently planning to burn. And the key point is that this new number – 2,795 – is higher than 565. Five times higher.

The Carbon Tracker Initiative – led by James Leaton, an environmentalist who served as an adviser at the accounting giant PricewaterhouseCoopers – combed through proprietary databases to figure out how much oil, gas and coal the world’s major energy companies hold in reserve. The numbers aren’t perfect – they don’t fully reflect the recent surge in unconventional energy sources like shale gas, and they don’t accurately reflect coal reserves, which are subject to less stringent reporting requirements than oil and gas. But for the biggest companies, the figures are quite exact: If you burned everything in the inventories of Russia’s Lukoil and America’s ExxonMobil, for instance, which lead the list of oil and gas companies, each would release more than 40 gigatons of carbon dioxide into the atmosphere.

Which is exactly why this new number, 2,795 gigatons, is such a big deal. Think of two degrees Celsius as the legal drinking limit – equivalent to the 0.08 blood-alcohol level below which you might get away with driving home. The 565 gigatons is how many drinks you could have and still stay below that limit – the six beers, say, you might consume in an evening. And the 2,795 gigatons? That’s the three 12-packs the fossil-fuel industry has on the table, already opened and ready to pour.

We have five times as much oil and coal and gas on the books as climate scientists think is safe to burn. We’d have to keep 80 percent of those reserves locked away underground to avoid that fate. Before we knew those numbers, our fate had been likely. Now, barring some massive intervention, it seems certain.

Yes, this coal and gas and oil is still technically in the soil. But it’s already economically aboveground – it’s figured into share prices, companies are borrowing money against it, nations are basing their budgets on the presumed returns from their patrimony. It explains why the big fossil-fuel companies have fought so hard to prevent the regulation of carbon dioxide – those reserves are their primary asset, the holding that gives their companies their value. It’s why they’ve worked so hard these past years to figure out how to unlock the oil in Canada’s tar sands, or how to drill miles beneath the sea, or how to frack the Appalachians.

If you told Exxon or Lukoil that, in order to avoid wrecking the climate, they couldn’t pump out their reserves, the value of their companies would plummet. John Fullerton, a former managing director at JP Morgan who now runs the Capital Institute, calculates that at today’s market value, those 2,795 gigatons of carbon emissions are worth about $27 trillion. Which is to say, if you paid attention to the scientists and kept 80 percent of it underground, you’d be writing off $20 trillion in assets. The numbers aren’t exact, of course, but that carbon bubble makes the housing bubble look small by comparison. It won’t necessarily burst – we might well burn all that carbon, in which case investors will do fine. But if we do, the planet will crater. You can have a healthy fossil-fuel balance sheet, or a relatively healthy planet – but now that we know the numbers, it looks like you can’t have both. Do the math: 2,795 is five times 565. That’s how the story ends.
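
For readers who want to check the math themselves, the two ratios doing the work in that paragraph are spelled out below; the only inputs are the 2,795- and 565-gigaton figures already cited.

```python
# The two ratios doing the work in the argument, using the same numbers.
reserves = 2795.0  # gigatons of CO2 in declared fossil-fuel reserves
budget = 565.0     # gigatons that can still be emitted with a reasonable 2°C chance

print(f"reserves / budget: {reserves / budget:.1f}x")              # ~4.9x
print(f"share to leave underground: {1 - budget / reserves:.0%}")  # ~80%
```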

So far, as I said at the start, environmental efforts to tackle global warming have failed. The planet’s emissions of carbon dioxide continue to soar, especially as developing countries emulate (and supplant) the industries of the West. Even in rich countries, small reductions in emissions offer no sign of the real break with the status quo we’d need to upend the iron logic of these three numbers. Germany is one of the only big countries that has actually tried hard to change its energy mix; on one sunny Saturday in late May, that northern-latitude nation generated nearly half its power from solar panels within its borders. That’s a small miracle – and it demonstrates that we have the technology to solve our problems. But we lack the will. So far, Germany’s the exception; the rule is ever more carbon.

This record of failure means we know a lot about what strategies don’t work. Green groups, for instance, have spent a lot of time trying to change individual lifestyles: the iconic twisty light bulb has been installed by the millions, but so have a new generation of energy-sucking flatscreen TVs. Most of us are fundamentally ambivalent about going green: We like cheap flights to warm places, and we’re certainly not going to give them up if everyone else is still taking them. Since all of us are in some way the beneficiaries of cheap fossil fuel, tackling climate change has been like trying to build a movement against yourself – it’s as if the gay-rights movement had to be constructed entirely from evangelical preachers, or the abolition movement from slaveholders.

People perceive – correctly – that their individual actions will not make a decisive difference in the atmospheric concentration of CO2; by 2010, a poll found that “while recycling is widespread in America and 73 percent of those polled are paying bills online in order to save paper,” only four percent had reduced their utility use and only three percent had purchased hybrid cars. Given a hundred years, you could conceivably change lifestyles enough to matter – but time is precisely what we lack.

A more efficient method, of course, would be to work through the political system, and environmentalists have tried that, too, with the same limited success. They’ve patiently lobbied leaders, trying to convince them of our peril and assuming that politicians would heed the warnings. Sometimes it has seemed to work. Barack Obama, for instance, campaigned more aggressively about climate change than any president before him – the night he won the nomination, he told supporters that his election would mark the moment “the rise of the oceans began to slow and the planet began to heal.” And he has achieved one significant change: a steady increase in the fuel efficiency mandated for automobiles. It’s the kind of measure, adopted a quarter-century ago, that would have helped enormously. But in light of the numbers I’ve just described, it’s obviously a very small start indeed.

At this point, effective action would require actually keeping most of the carbon the fossil-fuel industry wants to burn safely in the soil, not just changing slightly the speed at which it’s burned. And there the president, apparently haunted by the still-echoing cry of “Drill, baby, drill,” has gone out of his way to frack and mine. His secretary of interior, for instance, opened up a huge swath of the Powder River Basin in Wyoming for coal extraction: The total basin contains some 67.5 gigatons worth of carbon (or more than 10 percent of the available atmospheric space). He’s doing the same thing with Arctic and offshore drilling; in fact, as he explained on the stump in March, “You have my word that we will keep drilling everywhere we can… That’s a commitment that I make.” The next day, in a yard full of oil pipe in Cushing, Oklahoma, the president promised to work on wind and solar energy but, at the same time, to speed up fossil-fuel development: “Producing more oil and gas here at home has been, and will continue to be, a critical part of an all-of-the-above energy strategy.” That is, he’s committed to finding even more stock to add to the 2,795-gigaton inventory of unburned carbon.

Sometimes the irony is almost Borat-scale obvious: In early June, Secretary of State Hillary Clinton traveled on a Norwegian research trawler to see firsthand the growing damage from climate change. “Many of the predictions about warming in the Arctic are being surpassed by the actual data,” she said, describing the sight as “sobering.” But the discussions she traveled to Scandinavia to have with other foreign ministers were mostly about how to make sure Western nations get their share of the estimated $9 trillion in oil (that’s more than 90 billion barrels, or 37 gigatons of carbon) that will become accessible as the Arctic ice melts. Last month, the Obama administration indicated that it would give Shell permission to start drilling in sections of the Arctic.

Almost every government with deposits of hydrocarbons straddles the same divide. Canada, for instance, is a liberal democracy renowned for its internationalism – no wonder, then, that it signed on to the Kyoto treaty, promising to cut its carbon emissions substantially by 2012. But the rising price of oil suddenly made the tar sands of Alberta economically attractive – and since, as NASA climatologist James Hansen pointed out in May, they contain as much as 240 gigatons of carbon (or almost half of the available space if we take the 565 limit seriously), that meant Canada’s commitment to Kyoto was nonsense. In December, the Canadian government withdrew from the treaty before it faced fines for failing to meet its commitments.

The same kind of hypocrisy applies across the ideological board: In his speech to the Copenhagen conference, Venezuela’s Hugo Chavez quoted Rosa Luxemburg, Jean-Jacques Rousseau and “Christ the Redeemer,” insisting that “climate change is undoubtedly the most devastating environmental problem of this century.” But the next spring, in the Simon Bolivar Hall of the state-run oil company, he signed an agreement with a consortium of international players to develop the vast Orinoco tar sands as “the most significant engine for a comprehensive development of the entire territory and Venezuelan population.” The Orinoco deposits are larger than Alberta’s – taken together, they’d fill up the whole available atmospheric space.

So: the paths we have tried to tackle global warming have so far produced only gradual, halting shifts. A rapid, transformative change would require building a movement, and movements require enemies. As John F. Kennedy put it, “The civil rights movement should thank God for Bull Connor. He’s helped it as much as Abraham Lincoln.” And enemies are what climate change has lacked.

But what all these climate numbers make painfully, usefully clear is that the planet does indeed have an enemy – one far more committed to action than governments or individuals. Given this hard math, we need to view the fossil-fuel industry in a new light. It has become a rogue industry, reckless like no other force on Earth. It is Public Enemy Number One to the survival of our planetary civilization. “Lots of companies do rotten things in the course of their business – pay terrible wages, make people work in sweatshops – and we pressure them to change those practices,” says veteran anti-corporate leader Naomi Klein, who is at work on a book about the climate crisis. “But these numbers make clear that with the fossil-fuel industry, wrecking the planet is their business model. It’s what they do.”

According to the Carbon Tracker report, if Exxon burns its current reserves, it would use up more than seven percent of the available atmospheric space between us and the risk of two degrees. BP is just behind, followed by the Russian firm Gazprom, then Chevron, ConocoPhillips and Shell, each of which would fill between three and four percent. Taken together, just these six firms, of the 200 listed in the Carbon Tracker report, would use up more than a quarter of the remaining two-degree budget. Severstal, the Russian mining giant, leads the list of coal companies, followed by firms like BHP Billiton and Peabody. The numbers are simply staggering – this industry, and this industry alone, holds the power to change the physics and chemistry of our planet, and they’re planning to use it.
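The arithmetic behind those percentages is easy to check. The short Python sketch below uses only the 565-gigaton budget quoted earlier in this piece; the 40-gigaton reserve figure is back-calculated from the "more than seven percent" claim rather than taken from the Carbon Tracker report itself.

    # Back-of-the-envelope version of the reserve-to-budget arithmetic.
    # 565 Gt is the two-degree carbon budget quoted earlier in the article;
    # 40 Gt is inferred from the "more than seven percent" figure for Exxon,
    # not read directly from the Carbon Tracker report.
    TWO_DEGREE_BUDGET_GT = 565

    def share_of_budget(reserves_gt):
        return reserves_gt / TWO_DEGREE_BUDGET_GT

    print(f"{share_of_budget(40):.1%}")  # prints 7.1%, roughly Exxon's share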

They’re clearly cognizant of global warming – they employ some of the world’s best scientists, after all, and they’re bidding on all those oil leases made possible by the staggering melt of Arctic ice. And yet they relentlessly search for more hydrocarbons – in early March, Exxon CEO Rex Tillerson told Wall Street analysts that the company plans to spend $37 billion a year through 2016 (about $100 million a day) searching for yet more oil and gas.

There’s not a more reckless man on the planet than Tillerson. Late last month, on the same day the Colorado fires reached their height, he told a New York audience that global warming is real, but dismissed it as an “engineering problem” that has “engineering solutions.” Such as? “Changes to weather patterns that move crop-production areas around – we’ll adapt to that.” This in a week when Kentucky farmers were reporting that corn kernels were “aborting” in record heat, threatening a spike in global food prices. “The fear factor that people want to throw out there to say, ‘We just have to stop this,’ I do not accept,” Tillerson said. Of course not – if he did accept it, he’d have to keep his reserves in the ground. Which would cost him money. It’s not an engineering problem, in other words – it’s a greed problem.

You could argue that this is simply in the nature of these companies – that having found a profitable vein, they’re compelled to keep mining it, more like efficient automatons than people with free will. But as the Supreme Court has made clear, they are people of a sort. In fact, thanks to the size of its bankroll, the fossil-fuel industry has far more free will than the rest of us. These companies don’t simply exist in a world whose hungers they fulfill – they help create the boundaries of that world.

Left to our own devices, citizens might decide to regulate carbon and stop short of the brink; according to a recent poll, nearly two-thirds of Americans would back an international agreement that cut carbon emissions 90 percent by 2050. But we aren’t left to our own devices. The Koch brothers, for instance, have a combined wealth of $50 billion, meaning they trail only Bill Gates on the list of richest Americans. They’ve made most of their money in hydrocarbons, they know any system to regulate carbon would cut those profits, and they reportedly plan to lavish as much as $200 million on this year’s elections. In 2009, for the first time, the U.S. Chamber of Commerce surpassed both the Republican and Democratic National Committees on political spending; the following year, more than 90 percent of the Chamber’s cash went to GOP candidates, many of whom deny the existence of global warming. Not long ago, the Chamber even filed a brief with the EPA urging the agency not to regulate carbon – should the world’s scientists turn out to be right and the planet heats up, the Chamber advised, “populations can acclimatize to warmer climates via a range of behavioral, physiological and technological adaptations.” As radical goes, demanding that we change our physiology seems right up there.

Environmentalists, understandably, have been loath to make the fossil-fuel industry their enemy, respecting its political power and hoping instead to convince these giants that they should turn away from coal, oil and gas and transform themselves more broadly into “energy companies.” Sometimes that strategy appeared to be working – emphasis on appeared. Around the turn of the century, for instance, BP made a brief attempt to restyle itself as “Beyond Petroleum,” adapting a logo that looked like the sun and sticking solar panels on some of its gas stations. But its investments in alternative energy were never more than a tiny fraction of its budget for hydrocarbon exploration, and after a few years, many of those were wound down as new CEOs insisted on returning to the company’s “core business.” In December, BP finally closed its solar division. Shell shut down its solar and wind efforts in 2009. The five biggest oil companies have made more than $1 trillion in profits since the millennium – there’s simply too much money to be made on oil and gas and coal to go chasing after zephyrs and sunbeams.

Much of that profit stems from a single historical accident: Alone among businesses, the fossil-fuel industry is allowed to dump its main waste, carbon dioxide, for free. Nobody else gets that break – if you own a restaurant, you have to pay someone to cart away your trash, since piling it in the street would breed rats. But the fossil-fuel industry is different, and for sound historical reasons: Until a quarter-century ago, almost no one knew that CO2 was dangerous. But now that we understand that carbon is heating the planet and acidifying the oceans, its price becomes the central issue.

If you put a price on carbon, through a direct tax or other methods, it would enlist markets in the fight against global warming. Once Exxon has to pay for the damage its carbon is doing to the atmosphere, the price of its products would rise. Consumers would get a strong signal to use less fossil fuel – every time they stopped at the pump, they’d be reminded that you don’t need a semimilitary vehicle to go to the grocery store. The economic playing field would now be a level one for nonpolluting energy sources. And you could do it all without bankrupting citizens – a so-called “fee-and-dividend” scheme would put a hefty tax on coal and gas and oil, then simply divide up the proceeds, sending everyone in the country a check each month for their share of the added costs of carbon. By switching to cleaner energy sources, most people would actually come out ahead.
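As a rough illustration of the dividend side of that scheme, the toy calculation below spreads a year of hypothetical carbon-fee revenue evenly across the population; the fee level, emissions total and population are placeholder numbers, not figures from any actual proposal.

    # Toy illustration of fee-and-dividend arithmetic; all inputs are hypothetical.
    def monthly_dividend(fee_per_ton_usd, tons_co2_per_year, population):
        annual_revenue = fee_per_ton_usd * tons_co2_per_year
        return annual_revenue / population / 12

    # A made-up $25/ton fee on 5 billion tons of CO2, split among 310 million people:
    print(round(monthly_dividend(25, 5e9, 310e6), 2))  # about $33.60 per person per month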

There’s only one problem: Putting a price on carbon would reduce the profitability of the fossil-fuel industry. After all, the answer to the question “How high should the price of carbon be?” is “High enough to keep those carbon reserves that would take us past two degrees safely in the ground.” The higher the price on carbon, the more of those reserves would be worthless. The fight, in the end, is about whether the industry will succeed in keeping its special pollution break alive past the point of climate catastrophe, or whether, in the economists’ parlance, we’ll make them internalize those externalities.

It’s not clear, of course, that the power of the fossil-fuel industry can be broken. The U.K. analysts who wrote the Carbon Tracker report and drew attention to these numbers had a relatively modest goal – they simply wanted to remind investors that climate change poses a very real risk to the stock prices of energy companies. Say something so big finally happens (a giant hurricane swamps Manhattan, a megadrought wipes out Midwest agriculture) that even the political power of the industry is inadequate to restrain legislators, who manage to regulate carbon. Suddenly those Chevron reserves would be a lot less valuable, and the stock would tank. Given that risk, the Carbon Tracker report warned investors to lessen their exposure, hedge it with some big plays in alternative energy.

“The regular process of economic evolution is that businesses are left with stranded assets all the time,” says Nick Robins, who runs HSBC’s Climate Change Centre. “Think of film cameras, or typewriters. The question is not whether this will happen. It will. Pension systems have been hit by the dot-com and credit crunch. They’ll be hit by this.” Still, it hasn’t been easy to convince investors, who have shared in the oil industry’s record profits. “The reason you get bubbles,” sighs Leaton, “is that everyone thinks they’re the best analyst – that they’ll go to the edge of the cliff and then jump back when everyone else goes over.”

So pure self-interest probably won’t spark a transformative challenge to fossil fuel. But moral outrage just might – and that’s the real meaning of this new math. It could, plausibly, give rise to a real movement.

Once, in recent corporate history, anger forced an industry to make basic changes. That was the campaign in the 1980s demanding divestment from companies doing business in South Africa. It rose first on college campuses and then spread to municipal and state governments; 155 campuses eventually divested, and by the end of the decade, more than 80 cities, 25 states and 19 counties had taken some form of binding economic action against companies connected to the apartheid regime. “The end of apartheid stands as one of the crowning accomplishments of the past century,” as Archbishop Desmond Tutu put it, “but we would not have succeeded without the help of international pressure,” especially from “the divestment movement of the 1980s.”

The fossil-fuel industry is obviously a tougher opponent, and even if you could force the hand of particular companies, you’d still have to figure out a strategy for dealing with all the sovereign nations that, in effect, act as fossil-fuel companies. But the link for college students is even more obvious in this case. If their college’s endowment portfolio has fossil-fuel stock, then their educations are being subsidized by investments that guarantee they won’t have much of a planet on which to make use of their degree. (The same logic applies to the world’s largest investors, pension funds, which are also theoretically interested in the future – that’s when their members will “enjoy their retirement.”) “Given the severity of the climate crisis, a comparable demand that our institutions dump stock from companies that are destroying the planet would not only be appropriate but effective,” says Bob Massie, a former anti-apartheid activist who helped found the Investor Network on Climate Risk. “The message is simple: We have had enough. We must sever the ties with those who profit from climate change – now.”

Movements rarely have predictable outcomes. But any campaign that weakens the fossil-fuel industry’s political standing clearly increases the chances of retiring its special breaks. Consider President Obama’s signal achievement in the climate fight, the large increase he won in mileage requirements for cars. Scientists, environmentalists and engineers had advocated such policies for decades, but until Detroit came under severe financial pressure, it was politically powerful enough to fend them off. If people come to understand the cold, mathematical truth – that the fossil-fuel industry is systematically undermining the planet’s physical systems – that understanding might weaken the industry enough to matter politically. Exxon and their ilk might drop their opposition to a fee-and-dividend solution; they might even decide to become true energy companies, this time for real.

Even if such a campaign is possible, however, we may have waited too long to start it. To make a real difference – to keep us under a temperature increase of two degrees – you’d need to change carbon pricing in Washington, and then use that victory to leverage similar shifts around the world. At this point, what happens in the U.S. is most important for how it will influence China and India, where emissions are growing fastest. (In early June, researchers concluded that China has probably under-reported its emissions by up to 20 percent.) The three numbers I’ve described are daunting – they may define an essentially impossible future. But at least they provide intellectual clarity about the greatest challenge humans have ever faced. We know how much we can burn, and we know who’s planning to burn more. Climate change operates on a geological scale and time frame, but it’s not an impersonal force of nature; the more carefully you do the math, the more thoroughly you realize that this is, at bottom, a moral issue; we have met the enemy and they is Shell.

Meanwhile the tide of numbers continues. The week after the Rio conference limped to its conclusion, Arctic sea ice hit the lowest level ever recorded for that date. Last month, on a single weekend, Tropical Storm Debby dumped more than 20 inches of rain on Florida – the earliest the season’s fourth-named cyclone has ever arrived. At the same time, the largest fire in New Mexico history burned on, and the most destructive fire in Colorado’s annals claimed 346 homes in Colorado Springs – breaking a record set the week before in Fort Collins. This month, scientists issued a new study concluding that global warming has dramatically increased the likelihood of severe heat and drought – days after a heat wave across the Plains and Midwest broke records that had stood since the Dust Bowl, threatening this year’s harvest. You want a big number? In the course of this month, a quadrillion kernels of corn need to pollinate across the grain belt, something they can’t do if temperatures remain off the charts. Just like us, our crops are adapted to the Holocene, the 11,000-year period of climatic stability we’re now leaving… in the dust.

This story is from the August 2nd, 2012 issue of Rolling Stone.

Computer program mimics human evolution (Fapesp)

Software developed at USP’s São Carlos campus creates and selects programs that generate Decision Trees, tools capable of making predictions. The research won awards in the United States at the largest event in evolutionary computation (Wikimedia)

16/08/2012

By Karina Toledo

Agência FAPESP – Decision Trees are computational tools that give machines the ability to make predictions based on the analysis of historical data. The technique can, for example, support medical diagnosis or the risk analysis of financial investments.

But the best prediction requires the best program for generating Decision Trees. To reach that goal, researchers at the Instituto de Ciências Matemáticas e de Computação (ICMC) of the Universidade de São Paulo (USP), in São Carlos, drew inspiration from Charles Darwin’s theory of evolution.

“We developed an evolutionary algorithm, that is, one that mimics the process of human evolution to generate solutions,” said Rodrigo Coelho Barros, a doctoral student at the ICMC’s Laboratório de Computação Bioinspirada (BioCom) and a FAPESP fellow.

Evolutionary computation, Barros explained, is one of several bio-inspired techniques, that is, techniques that look to nature for solutions to computational problems. “It is remarkable how nature finds solutions to extremely complicated problems. There is no doubt that we need to learn from it,” said Barros.

According to Barros, the software developed during his doctorate can automatically create programs that generate Decision Trees. To do so, it performs random crossovers between the code of existing programs, producing “children.”

“These ‘children’ may occasionally undergo mutations and evolve. After a while, the evolved Decision Tree generators are expected to get better and better, and our algorithm selects the best of them all,” Barros said.

But while natural selection in the human species takes hundreds or even thousands of years, in computing it takes only a few hours, depending on the problem to be solved. “We set one hundred generations as the limit of the evolutionary process,” Barros said.
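For readers curious about what such an evolutionary loop looks like in code, the sketch below is a minimal, illustrative version in Python; the parameter space and the fitness function are invented stand-ins, not the design actually used by the ICMC group.

    # Minimal sketch of the evolutionary loop described above (illustrative only).
    # Each "individual" stands for a candidate decision-tree-induction algorithm,
    # encoded as a few design parameters; the fitness function is a placeholder,
    # not the evaluation used by the ICMC researchers.
    import random

    PARAM_SPACE = {
        "split_criterion": ["gini", "entropy", "gain_ratio"],
        "max_depth": list(range(2, 21)),
        "min_samples_split": list(range(2, 11)),
    }

    def random_individual():
        return {k: random.choice(v) for k, v in PARAM_SPACE.items()}

    def fitness(ind):
        # Placeholder: in the real system this would mean building decision trees
        # with these settings and measuring their predictive accuracy on data.
        target = {"split_criterion": "gain_ratio", "max_depth": 8, "min_samples_split": 4}
        score = 1.0 if ind["split_criterion"] == target["split_criterion"] else 0.0
        score += 1.0 - abs(ind["max_depth"] - target["max_depth"]) / 20
        score += 1.0 - abs(ind["min_samples_split"] - target["min_samples_split"]) / 10
        return score

    def crossover(a, b):
        # "Children" mix the design choices of two parent algorithms.
        return {k: random.choice([a[k], b[k]]) for k in PARAM_SPACE}

    def mutate(ind, rate=0.1):
        return {k: (random.choice(PARAM_SPACE[k]) if random.random() < rate else v)
                for k, v in ind.items()}

    def evolve(pop_size=30, generations=100):
        # The article cites one hundred generations as the limit of the process.
        population = [random_individual() for _ in range(pop_size)]
        for _ in range(generations):
            population.sort(key=fitness, reverse=True)
            parents = population[: pop_size // 2]
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(pop_size - len(parents))]
            population = parents + children
        return max(population, key=fitness)

    print(evolve())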

Artificial intelligence

In computer science, a heuristic is the capacity of a system to innovate and develop techniques for reaching a given goal.

The software developed by Barros belongs to the field of hyper-heuristics, a recent topic in evolutionary computation whose goal is the automatic generation of heuristics tailored to a specific application or set of applications.

“It is a preliminary step toward the great goal of artificial intelligence: creating machines capable of developing solutions to problems without being explicitly programmed to do so,” Barros explained.

The work gave rise to the paper “A Hyper-Heuristic Evolutionary Algorithm for Automatically Designing Decision-Tree Algorithms”, which received awards in three categories at the Genetic and Evolutionary Computation Conference (GECCO), the world’s largest event in evolutionary computation, held in July in Philadelphia, United States.

Besides Barros, the paper’s authors are professors André Carlos Ponce de Leon Ferreira de Carvalho, who supervises the research at the ICMC, Márcio Porto Basgalupp, of the Universidade Federal de São Paulo (Unifesp), and Alex Freitas, of the University of Kent, in the United Kingdom, who took on the co-supervision.

The authors were invited to submit the paper to the journal Evolutionary Computation, published by MIT Press. “The work will still go through review, but since it was submitted by invitation, it has a strong chance of being accepted,” said Barros.

The research, which is expected to be completed only in 2013, also gave rise to an article published by invitation in the Journal of the Brazilian Computer Society, after being chosen as the best paper at the 2011 Encontro Nacional de Inteligência Artificial.

Another paper, presented at the 11th International Conference on Intelligent Systems Design and Applications, held in Spain in 2011, earned an invitation for publication in the journal Neurocomputing.

Psychopaths Get a Break from Biology: Judges Reduce Sentences If Genetics, Neurobiology Are Blamed (Science Daily)

ScienceDaily (Aug. 16, 2012) — A University of Utah survey of judges in 19 states found that if a convicted criminal is a psychopath, judges consider it an aggravating factor in sentencing, but if judges also hear biological explanations for the disorder, they reduce the sentence by about a year on average.

The new study, published in the Aug. 17, 2012, issue of the journal Science, illustrates the “double-edged sword” faced by judges when they are given a “biomechanical” explanation for a criminal’s mental disorder:

If a criminal’s behavior has a biological basis, is that reason to reduce the sentence because defective genes or brain function leave the criminal with less self-control and ability to tell right from wrong? Or is it reason for a harsher sentence because the criminal likely will reoffend?

“In a nationwide sample of judges, we found that expert testimony concerning the biological causes of psychopathy significantly reduced sentencing of the psychopath” from almost 14 years to less than 13 years, says study coauthor James Tabery, an assistant professor of philosophy at the University of Utah.

However, the hypothetical psychopath in the study got a longer sentence than the average nine-year sentence judges usually impose for the same crime — aggravated battery — and there were state-to-state differences in whether judges reduced or increased the sentence when given information on the biological causes of psychopathy.

The study was conducted by Tabery; Lisa Aspinwall, a University of Utah associate professor of psychology; and Teneille Brown, an associate professor at the university’s S.J. Quinney College of Law.

The researchers say that so far as they know, their study — funded by a University of Utah grant to promote interdisciplinary research — is the first to examine the effect of the biological causes of criminal behavior on real judges’ reasoning during sentencing.

Biological Explanation of Psychopathy Helps Defendant

The anonymous online survey — distributed with the help of 19 of 50 state court administrators who were approached — involved 181 participating judges reading a scenario, based on a real Georgia case, about a psychopath convicted of aggravated battery for savagely beating a store clerk with a gun during a robbery attempt.

The judges then answered a series of questions, including whether they consider scientific evidence of psychopathy to be an aggravating or mitigating factor that would increase or decrease the sentence, respectively, and what sentence they would impose. They were told psychopathy is incurable and treatment isn’t now an option.

While psychopathy isn’t yet a formal diagnosis in the manual used by psychiatrists, it soon may be added as a category of antisocial personality disorder, Tabery says. The study cited an expert definition of psychopathy as “a clinical diagnosis defined by impulsivity; irresponsibility; shallow emotions; lack of empathy, guilt or remorse; pathological lying; manipulation; superficial charm; and the persistent violation of social norms and expectations.”

The researchers recruited 207 state trial court judges for the study. Six dropped out. Twenty others were excluded because they incorrectly identified the defendant’s diagnosis. That left 181 judges who correctly identified the defendant as a psychopath, including 164 who gave complete data on their sentencing decisions.

The judges were randomly divided into four groups. All the judges read scientific evidence that the convicted criminal was a psychopath and what that means, but only half were given evidence about the genetic and neurobiological causes of the condition. Half the judges in each group got the scientific evidence from the defense, which argued it should mitigate or reduce the sentence, and half the judges got the evidence from the prosecution, which argued it should aggravate or increase the sentence.

Judges who were given a biological explanation for the convict’s psychopathy imposed sentences averaging 12.83 years, or about a year less than the 13.93-year average sentence imposed by judges who were told only that the defendant was a psychopath, but didn’t receive a biological explanation for the condition. In both cases, however, sentencing for the psychopath was longer than the judges’ normal nine-year average sentence for aggravated battery.

Even though the year reduction in sentence may not seem like much, “we were amazed the sentence was reduced at all given that we’re dealing with psychopaths, who are very unsympathetic,” Brown says.

Aspinwall notes: “The judges did not let the defendant off, they just reduced the sentence and showed major changes in the quality of their reasoning.”

The study found that although 87 percent of the judges listed at least one aggravating factor in explaining their decision, when the judges heard evidence about the biomechanical causes of psychopathy from the defense, the proportion of judges who also listed mitigating factors rose from about 30 percent to 66 percent.

Psychopathy was seen as an aggravating factor no matter which side presented the evidence, but it was viewed by the judges as less aggravating when presented by the defense than when presented by the prosecution.

A Disconnect between Sentencing and Criminal Responsibility

One surprising and paradoxical finding of the study was that even though the judges tended to reduce the sentence when given a biological explanation for the defendant’s psychopathy, the judges — when asked explicitly — did not rate the defendant as having less free will or as being less legally or morally responsible for the crime.

“The thought is that responsibility and punishment go hand in hand, so if we see reduced punishment, we would expect to see the judges feel the defendants are less responsible,” Tabery says. “So it is surprising that we got the former, not the latter.”

The researchers also counted explicit mentions by the judges of balancing or weighing factors that increase or reduce sentencing. When evidence of a biological cause of the defendant’s psychopathy was presented by the defense, the judges were about 2.5 times more likely to mention weighing aggravating and mitigating factors than when it was presented by the prosecution or when no biological evidence was presented.

The data show that “the introduction of expert testimony concerning a biological mechanism for psychopathy significantly increased the number of judges invoking mitigating factors in their reasoning and balancing them with aggravating factors,” the researchers conclude. “These findings suggest that the biomechanism did invoke such concepts as reduced culpability due to lack of impulse control, even if these concepts did not affect the ratings of free will and responsibility.”

Brown adds: “In the coming years, we are likely to find out about all kinds of biological causes of criminal behavior, so the question is, why does the law care if most behavior is biologically caused? That’s what is so striking about finding these results in psychopaths, because we’re likely to see an even sharper reduction in sentencing of defendants with a more sympathetic diagnosis, such as mental retardation.”

State Variations in Sentencing

While the overall results showed a reduction in sentencing when judges read biological evidence about the cause of psychopathy, the reduction was greater in some of the 19 states surveyed and nonexistent in others. That is not surprising due to variations in sentencing guidelines, rules of evidence and the extent of judges’ discretion.

There were too few responses from eight states to analyze them individually. In three states — Colorado, New York and Tennessee — biological evidence of psychopathy actually increased the sentence, although the findings weren’t statistically significant.

In eight other states — Alabama, Maryland, Missouri, Nebraska, New Mexico, Oklahoma, Utah and Washington state — biological evidence of psychopathy reduced the sentence or had no effect, and the reduction was statistically significant in two of those states: Utah and Maryland. When just those eight states were examined, the defendant received an average sentence of 10.7 years if evidence was introduced that psychopathy has a biological cause, versus 13.9 years without such evidence.

“We saw sentencing go up in a few states and down in most, and that’s just evidence that it [the double-edge sword] could cut either way,” Brown says.

Aspinwall adds: “When you look at the reasons the judges provide, what is striking to us is the vast majority found the psychopathy diagnosis to be aggravating and, with the presentation of the biological mechanism, also mitigating. So both things are happening.”

Cyborg America: inside the strange new world of basement body hackers (The Verge)

The Verge, 8 August 2012

Shawn Sarver took a deep breath and stared at the bottle of Listerine on the counter. “A minty fresh feeling for your mouth… cures bad breath,” he repeated to himself, as the scalpel sliced open his ring finger. His left arm was stretched out on the operating table, his sleeve rolled up past the elbow, revealing his first tattoo, the Air Force insignia he got at age 18, a few weeks after graduating from high school. Sarver was trying a technique he learned in the military to block out the pain, since it was illegal to administer anesthetic for his procedure.

“A minty fresh feeling… cures bad breath,” Sarver muttered through gritted teeth, his eyes staring off into a void.

Tim, the proprietor of Hot Rod Piercing in downtown Pittsburgh, put down the scalpel and picked up an instrument called an elevator, which he used to separate the flesh inside Sarver’s finger, creating a small empty pocket of space. Then, with practiced hands, he slid a tiny rare earth metal inside the open wound, the width of a pencil eraser and thinner than a dime. When he tried to remove his tool, however, the metal disc stuck to the tweezers. “Let’s try this again,” Tim said. “Almost done.”

The implant stayed put the second time. Tim quickly stitched the cut shut, and cleaned off the blood. “Want to try it out?” he asked Sarver, who nodded with excitement. Tim dangled the needle from a string of suture next to Sarver’s finger, closer and closer, until suddenly, it jumped through the air and stuck to his flesh, attracted by the magnetic pull of the mineral implant.

“I’m a cyborg!” Sarver cried, getting up to join his friends in the waiting room outside. Tim started prepping a new tray of clean surgical tools. Now it was my turn.

PART.01

With the advent of the smartphone, many Americans have grown used to the idea of having a computer on their person at all times. Wearable technologies like Google’s Project Glass are narrowing the boundary between us and our devices even further by attaching a computer to a person’s face and integrating the software directly into a user’s field of vision. The paradigm shift is reflected in the names of our dominant operating systems. Gone are Microsoft’s Windows into the digital world, replaced by a union of man and machine: the iPhone or Android.

For a small, growing community of technologists, none of this goes far enough. I first met Sarver at the home of his best friend, Tim Cannon, in Oakdale, a Pennsylvania suburb about 30 minutes from Pittsburgh where Cannon, a software developer, lives with his longtime girlfriend and their three dogs. The two-story house sits next to a beer dispensary and an abandoned motel, a reminder the city’s best days are far behind it. In the last two decades, Pittsburgh has been gutted of its population, which plummeted from a high of more than 700,000 in the 1980s to less than 350,000 today. For its future, the city has pinned much of its hopes on the biomedical and robotics research being done at local universities like Carnegie Mellon. “The city was dying and so you have this element of anti-authority freaks are welcome,” said Cannon. “When you have technology and biomedical research and a pissed-off angry population that loves tattoos, this is bound to happen. Why Pittsburgh? It’s got the right amount of fuck you.”

Cannon led me down into the basement, which he and Sarver have converted into a laboratory. A long work space was covered with Arduino motherboards, soldering irons, and electrodes. Cannon had recently captured a garter snake, which eyed us from inside a plastic jar. “Ever since I was a kid, I’ve been telling people that I want to be a robot,” said Cannon. “These days, that doesn’t seem so impossible anymore.” The pair call themselves grinders — homebrew biohackers obsessed with the idea of human enhancement — who are looking for new ways to put machines into their bodies. They are joined by hundreds of aspiring biohackers who populate the movement’s online forums and a growing number, now several dozen, who have gotten the magnetic implants in real life.

Cannon looks and moves a bit like Shaggy from Scooby Doo, a languid rubberband of a man in baggy clothes and a newsboy cap. Sarver, by contrast, stands ramrod-straight, wearing a dapper three-piece suit and waxed mustache, a dandy steampunk with a high-pitched laugh. There is a distinct division of labor between the two: Cannon is the software developer and Sarver, who learned electrical engineering as a mechanic in the Air Force, does the hardware. The moniker for their working unit is Grindhouse Wetwares. Computers are hardware. Apps are software. Humans are wetware.

Cannon, like Sarver, served in the military, but the two didn’t meet until they had both left the service, introduced by a mutual friend in the Pittsburgh area. Politics brought them together. “We were both kind of libertarians, really strong anti-authority people, but we didn’t fit into the two common strains here: idiot anarchist who’s unrealistic or right-wing crazy Christian. Nobody was incorporating technology into it. So there was no political party but just a couple like-minded individuals, who were like… techno-libertarians!”

Cannon got his own neodymium magnetic implant a year before Sarver. Putting these rare earth metals into the body was pioneered by artists on the bleeding edge of piercing culture and transhumanists interested in experimenting with a sixth sense. Steve Haworth, who specializes in the bleeding edge of body modification and considers himself a “human evolution artist,” is considered one of the originators, and helped to inspire a generation of practitioners to perform magnetic implants, including the owner of Hot Rod Piercing in Pittsburgh. (Using surgical tools like a scalpel is a grey area for piercers. Operating with these instruments, or any kind of anesthesia, could be classified as practicing medicine. Without a medical license, a piercer who does this is technically committing assault on the person getting the implant.) On its own, the implant allows a person to feel electromagnetic fields: a microwave oven in their kitchen, a subway passing beneath the ground, or high-tension power lines overhead.

While this added perception is interesting, it has little utility. But the magnet, explains Cannon, is more of a stepping stone toward bigger things. “It can be done cheaply, with minimally invasive surgery. You get used to the idea of having something alien in your body, and kinda begin to see how much more the human body could do with a little help. Sure, feeling other magnets around you is fucking cool, but the real key is, you’re giving the human body a simple, digital input.”

As an example of how that might work, Cannon showed me a small device he and Sarver created called the Bottlenose. It’s a rectangle of black metal about half the size of a pack of cigarettes that slips over your finger. Named after the echolocation used by dolphins, it sends out an electromagnetic pulse and measures the time it takes to bounce back. Cannon slips it over his finger and closes his eyes. “I can kind of sweep the room and get this picture of where things are.” He twirls around the half-empty basement, eyes closed, then stops, pointing directly at my chest. “The magnet in my finger is extremely sensitive to these waves. So the Bottlenose can tell me the shape of things around me and how far away they are.”
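A minimal sketch of the feedback loop Cannon describes might look like the following; read_range_cm() and pulse_coil() are hypothetical stand-ins for the device’s rangefinder and electromagnet driver, and none of this is Grindhouse Wetwares code.

    # Illustrative Bottlenose-style loop: read a distance, then pulse the coil
    # over the implanted magnet harder for closer objects. The sensor and coil
    # functions are placeholders, not the actual hardware interface.
    import random
    import time

    def read_range_cm():
        # Placeholder: pretend the rangefinder reports a distance in centimetres.
        return random.uniform(10, 300)

    def pulse_coil(strength):
        # Placeholder: drive the coil at a strength between 0 and 1.
        print(f"coil pulse strength: {strength:.2f}")

    def sweep(max_range_cm=300, readings=50):
        for _ in range(readings):
            distance = read_range_cm()
            # Closer objects produce stronger pulses, so the wearer "feels" proximity.
            pulse_coil(max(0.0, 1.0 - distance / max_range_cm))
            time.sleep(0.1)

    sweep()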

The way Cannon sees it, biohacking is all around us. “In a way, eyeglasses are a body hack, a piece of equipment that enhances your sense, and pretty quickly becomes like a part of your body,” says Cannon. He took a pair of electrodes off the workbench and attached them to my temples. “Your brain works through electricity, so why not help to boost that?” A sharp pinch ran across my forehead as the first volts flowed into my skull. He and Sarver laughed as my face involuntarily twitched. “You’re one of us now,” Cannon says with a laugh.

HISTORY.01

In one sense, Mary Shelley’s Frankenstein, part man, part machine, animated by electricity and with superhuman abilities, might be the first dark, early vision of what humans’ bodies would become when modern science was brought to bear. A more utopian version was put forward in 1960, a year before man first travelled into space, by the scientist and inventor Manfred Clynes. Clynes was considering the problem of how mankind would survive in our new lives as outer space dwellers, and concluded that only by augmenting our physiology with drugs and machines could we thrive in extraterrestrial environs. It was Clynes and his co-author Nathan Kline, writing on this subject, who coined the term cyborg.

At its simplest, a cyborg is a being with both biological and artificial parts: metal, electrical, mechanical, or robotic. The construct is familiar to almost everyone through popular culture, perhaps most spectacularly in the recent Iron Man films. Tony Stark is surely our greatest contemporary cyborg: a billionaire businessman who designed his own mechanical heart, a dapper bachelor who can transform into a one-man fighter jet, then shed his armour as easily as a suit of clothes.

Britain is the birthplace of 21st-century biohacking, and the movement’s two foundational figures present a similar Jekyll and Hyde duality. One is Lepht Anonym, a DIY punk who was one of the earliest, and certainly the most dramatic, to throw caution to the wind and implant metal and machines into her flesh. The other is Kevin Warwick, an academic at the University of Reading’s department of cybernetics. Warwick relies on a trained staff of medical technicians when doing his implants. Lepht has been known to say that all she requires is a potato peeler and a bottle of vodka. In an article on h+, Anonym wrote:

I’m sort of inured to pain by this point. Anesthetic is illegal for people like me, so we learn to live without it; I’ve made scalpel incisions in my hands, pushed five-millimeter diameter needles through my skin, and once used a vegetable knife to carve a cavity into the tip of my index finger. I’m an idiot, but I’m an idiot working in the name of progress: I’m Lepht Anonym, scrapheap transhumanist. I work with what I can get.

Anonym’s essay, a series of YouTube videos, and a short profile in Wired established her as the face of the budding biohacking movement. It was Anonym who proved, with herself as the guinea pig, that it was possible to implant RFID chips and powerful magnets into one’s body, without the backing of an academic institution or help from a team of doctors.

“She is an inspiration to all of us,” said a biohacker who goes by the name of Sovereign Bleak. “To anyone who was frustrated with the human condition, who felt we had been promised more from the future, she said that it was within our grasp, and our rights, to evolve our bodies however we saw fit.” Over the last decade grinders have begun to form a loose culture, connected mostly by online forums like biohack.me, where hundreds of aspiring cyborgs congregate to swap tips about the best bio-resistant coatings to prevent the body from rejecting magnetic implants and how to get illegal anesthetics shipped from Canada to the United States. There is another strain of biohacking which focuses on the possibilities for DIY genetics, but their work is far more theoretical than the hands-on experiments performed by grinders.

But while Anonym’s renegade approach to bettering her own flesh birthed a new generation of grinders, it seems to have had some serious long-term consequences for her own health. “I’m a wee bit frightened right now,” Anonym wrote on her blog early this year. “I’m hearing things that aren’t there. Sure I see things that aren’t real from time to time because of the stupid habits I had when I was a teenager and the permanent, very mild damage I did to myself experimenting like that, but I don’t usually hear anything and this is not a flashback.”

MEDICAL NEED VERSUS HUMAN ENHANCEMENT

Neil Harbisson was born with a condition that allows him to see only in black and white. He became interested in cybernetics, and eventually began wearing the Eyeborg, a head-mounted camera which translated colors into vibrations that Harbisson could hear. The addition of the Eyeborg to his passport has led some to dub him the first cyborg officially recognized by the federal government. He now plans to extend and improve this cybernetic synesthesia by having the Eyeborg permanently surgically attached to his skull.
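To give a sense of how a colour-to-sound mapping of this kind might work, here is a toy sketch; the frequency band and the linear mapping are invented for illustration and are not Harbisson’s actual Eyeborg calibration.

    # Toy sonochromatic mapping: turn a colour's hue into an audible frequency.
    import colorsys

    def hue_to_frequency(r, g, b, low_hz=120.0, high_hz=1000.0):
        hue, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        # Spread the hue circle (0 to 1) linearly across one audible band.
        return low_hz + hue * (high_hz - low_hz)

    print(round(hue_to_frequency(255, 0, 0)))  # red sits at the low end of the band
    print(round(hue_to_frequency(0, 0, 255)))  # blue maps to a higher pitch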

Getting a medical team to help him was no easy task. “Their position was that ‘doctors usually repair or fix humans’ and that my operation was not about fixing nor repairing myself but about creating a new sense: the perception of visual elements via bone-conducted sounds,” Harbisson told me by email. “The other main issue was that the operation would allow me to perceive outside the ability of human vision and human hearing (hearing via the bone allows you to hear a wider range of sounds, from infrasounds to ultrasounds, and some lenses can detect ultraviolets and infrareds). It took me over a year to convince them.”

In the end, the bio-ethical community still relies on promises of medical need to justify cybernetic enhancement. “I think I convinced them when I told them that this kind of operation could help ‘fix and repair’ blind people. If you use a different type of chip, a chip that translates words into sound, or distances into sound, for instance, the same electronic eye implant could be used to read or to detect obstacles which could mean the end of Braille and sticks. I guess hospitals and governments will soon start publishing their own laws about which kind of cybernetic implants they find are ethical/legal and which ones they find are not.”

PART.02

I had Lepht Anonym in the back of my mind as I stretched my arm out on the operating table at Hot Rod Piercing. The fingertip is an excellent place for a magnet because it is full of sensitive nerve tissue, fertile ground for your nascent sixth sense to pick up on the electro-magnetic fields all around us. It is also an exceptionally painful spot to have sliced open with a scalpel, especially when no painkillers are available. The experience ranked alongside breaking my arm and having my appendix removed, a level of pain that opens your mind to parts of your body which before you were not conscious of.

For the first few days after the surgery, it was difficult to separate out my newly implanted sense from the bits of pain and sensation created by the trauma of having the magnet jammed in my finger. Certain things were clear: microwave ovens gave off a steady field that was easy to perceive, like a pulsating wave of invisible water, or air heavy from heat coming off a fan. And other magnets, of course, were easy to identify. They lurked like landmines in everyday objects — my earbuds, my messenger bag — sending my finger ringing with a deep, sort of probing force field that shifted around in my flesh.

High-tension wires seemed to give off a sort of pulsating current, but it was often hard to tell, since my finger often began throbbing for no reason, as it healed from the trauma of surgery. Playing with strong, stand-alone magnets was a game of chicken. The party trick of making one leap across a table towards my finger was thrilling, but the awful squirming it caused inside my flesh made me regret it hours later. Grasping a colleague’s stylus too near the magnetic tip put a sort of freezing probe into my finger that I thought about for days afterwards.

Within a few weeks, the sensation began to fade. I noticed fewer and fewer instances of a sixth sense, beyond other magnets, which were quite obvious. I was glad that the implant didn’t interfere with my life, or prevent me from exercising, but I also grew a bit disenchanted, after all the hype and excitement the grinders I interviewed had shared about their newfound way of interacting with the world.

HISTORY.02

If Lepht Anonym is the cautionary tale, Prof. Kevin Warwick is the one bringing academic respectability to cybernetics. He was one of the first to experiment with implants, putting an RFID chip into his body back in 1998, and has also taken the techniques the farthest. In 2002, Prof. Warwick had cybernetic sensors implanted into the nerves of his arm. Unlike the grinders in Pittsburgh, he had the benefits of anesthesia and a full medical team, but he was still putting himself at great risk, as there was no research on the long-term effects of having these devices grafted onto his nervous system. “In a way that is what I like most about this,” he told me. “From an academic standpoint, it’s wide-open territory.”

I chatted with Warwick from his office at The University of Reading, stacked floor to ceiling with books and papers. He has light brown hair that falls over his forehead and an easy laugh. With his long sleeve shirt on, you would never know that his arm is full of complex machinery. The unit allows Warwick to manipulate a robot hand, a mirror of his own fingers and flesh. What’s more, the impulse could flow both ways. Warwick’s wife, Irena, had a simpler cybernetic implant done on herself. When someone grasped her hand, Prof. Warwick was able to experience the same sensation in his hand, from across the Atlantic. It was, Warwick writes, a sort of cybernetic telepathy, or empathy, in which his nerves were made to feel what she felt, via bits of data travelling over the internet.

The work was hailed by the mainstream media as a major step forward in helping amputees and victims of paralysis to regain a full range of abilities. But Prof. Warwick says that misses the point. “I quite like the fact that new medical therapies could potentially come out of this work, but what I am really interested in is not getting people back to normal; it’s enhancement of fully functioning humans to a higher level.”

It’s a sentiment that can take some getting used to. “A decade ago, if you talked about human enhancement, you upset quite a lot of people. Unless the end goal was helping the disabled, people really were not open to it.” With the advent of smartphones, says Prof. Warwick, all that has changed. “Normal folks really see the value of ubiquitous technology. In fact the social element has almost created the reverse. Now, you must be connected all the time.”

While he is an accomplished academic, Prof. Warwick has embraced biohackers and grinders as fellow travelers on the road to exploring our cybernetic future. “A lot of the time, when it comes to putting magnets into your body or RFID chips, there is more information on YouTube than in the peer-reviewed journals. There are artists and geeks pushing the boundaries, sharing information, a very renegade thing. My job is to take that, and apply some more rigorous scientific analysis.”

To that end, Prof. Warwick and one of his PhD students, Ian Harrison, are beginning a series of studies on biohackers with magnetic implants. “When it comes to sticking sensors into your nerve endings, so much is subjective,” says Harrison. “What one person feels, another may not. So we are trying to establish some baselines for future research.”

The end goal for Prof. Warwick, as it was for the team at Grindhouse Wetwares in Pittsburgh, is still the stuff of science fiction. “When it comes to communication, humans are still so far behind what computers are capable of,” Prof. Warwick explained. “Bringing about brain to brain communication is something I hope to achieve in my lifetime.”

For Warwick, this will advance not just the human body and the field of cybernetics, but allow for a more practical evaluation of the entire canon of Western thought. “I would like to ask the questions that the philosopher Ludwig Wittgenstein asked, but in practice, not in theory.” It would be another attempt to study the mind, from inside and out, as Wittgenstein proposed, but with access to objective data. “Perhaps he was bang on, or maybe we will rubbish his whole career, but either way, it’s something we should figure out.”

As the limits of space exploration become increasingly clear, a generation of scientists who might once have turned to the stars are seeking to expand humanity’s horizons much closer to home. “Jamming stuff into your body, merging machines with your nerves and brain, it’s brand new,” said Warwick. “It’s like this last, unexplored continent staring us in the face.”

On a hot day in mid-July, I went for a walk around Manhattan with Dann Berg, who had a magnet implanted in his pinky three years earlier. I told him I was a little disappointed how rarely I noticed anything with my implant. “Actually, your experience is pretty common,” he told me. “I didn’t feel much for the first 6 months, as the nerves were healing from surgery. It took a long time for me to gain this kind of ambient awareness.”

Berg worked for a while in the piercing and tattoo studio, which brought him into contact with the body modification community who were experimenting with implants. At the same time, he was teaching himself to code and finding work as a front-end developer building web sites. “To me, these two things, the implant and the programming, they are both about finding new ways to see and experience the world.”

Berg took me to an intersection at Broadway and Bleecker. In the middle of the crosswalk, he stopped, and began moving his hand over a metal grate. “You feel that?” he asked. “It’s a dome, right here, about a foot off the ground, that just sets my finger off. Somewhere down there, part of the subway system or the power grid is working. We’re touching something other people can’t see; they don’t know it exists. That’s amazing to me.” People passing by gave us odd stares as Berg and I stood next to each other in the street, waving our hands around inside an invisible field, like mystics groping blindly for a ghost.

CYBORGS IN SOCIETY

Last month, a Canadian professor named Steve Mann was eating at a McDonald’s with his family. Mann wears a pair of computerized glasses at all times, similar to Google’s Project Glass. One of the employees asked him to take them off. When he refused, Mann says, an employee tried to rip the glasses off, an alleged attack made more brutal because the device is permanently attached and does not come off his skull without special tools.

On biohacking websites and transhumanist forums, the event was a warning sign of the battle to come. Some dubbed it the first hate crime against cyborgs. That would imply the employees knew Mann’s device was part of him, which is still largely unclear. But it was certainly a harbinger of the friction that will emerge between people whose bodies contain powerful machines and society at large.

PART.03

After zapping my brain with a few dozen volts, the boys from Grindhouse Wetwares offered to cook me dinner. Cannon popped a tray of mashed potatoes in the microwave and showed me where he put his finger to feel the electromagnetic waves streaming off. We stepped out onto the back porch and let his three little puggles run wild. The sound of cars passing on the nearby highway and the crickets warming up for sunset relaxed everyone. I asked what they thought the potential was for biohacking to become part of the mainstream.

“That’s the thing, it’s not that much of a leap,” said Cannon. “We’ve had pacemakers since the ’70s.” Brain implants are now being used to treat Parkinson’s disease and depression. Scientists hope that brain implants might soon restore mobility to paralyzed limbs. The crucial difference is that grinders are pursuing this technology for human enhancement, without any medical need. “How is this any different than plastic surgery, which like half the fucking country gets?” asked Cannon. “Look, you know the military is already working on stuff like this, right? And it won’t be too long before the corporations start following suit.”

Sarver joined the Air Force just weeks after 9/11. “I was a dyed-in-the-wool Roman Catholic Republican. I wasn’t thinking about the military, but after 9/11, I just believed the dogma.” In place of college, he got an education in electronics repairing fighter jets and attack helicopters. He left the war a very different man. “There were no terrorists in Iraq. We were the terrorists. These were scared people, already scared of their own government.”

Yet, while he rejected the conflict in the Middle East, Sarver’s time in the military gave him a new perspective on the human body. “I’ve been in the special forces,” said Sarver. “I know what the limits of the human body are like. Once you’ve seen the capabilities of a 5000psi hydraulic system, it’s no comparison.”

The boys from Grindhouse Wetwares both sucked down Parliament menthols the whole time we talked. There was no irony for them in dreaming of the possibilities for one’s body and willfully destroying it. “For me, the end game is my brain and spinal column in a jar, and a robot body out in the world doing my bidding,” said Sarver. “I would really prefer not to have to rely on an inefficient four-valve pump that sends liquid through these fragile hoses. Fuck cheetahs. I want to punch through walls.”

Flesh and blood are easily shed in grinder circles, at least theoretically speaking. “People recoil from the idea of tampering inside the body,” said Tim. “I am lost when it comes to people’s unhealthy connections to your body. This is just a decaying lump of flesh that gets old, it’s leaking fluid all the time, it’s obscene to think this is me. I am my ideas and the sum of my experiences.” As far as the biohackers are concerned, we are the best argument against intelligent design.

Neither man has any illusions about how fringe biohacking is now. But technology marches on. “People say nobody is going to want to get surgery for this stuff,” admits Cannon. But he believes that will change. “They will or they will be left behind. They have no choice. It’s going to be weird and uncomfortable and scary. But you can do that, or you can become obsolete.”

We came back into the kitchen for dinner. As I wolfed down steak and potatoes, Cannon broke into a nervous grin. “I want to show you something. It’s not quite ready, but this is what we’re working on.” He disappeared down into the basement lab and returned with a small device the size of a cigarette lighter, a simple circuit board with a display attached. This was the HELEDD, the next step in the Grindhouse Wetwares plan to unite man and machine. “This is just a prototype, but when we get it small enough, the idea is to have this beneath my skin,” he said, holding it up against his inner forearm.

The smartphone in your pocket would act as the brain for this implant, communicating via bluetooth with the HELEDD, which would use a series of LED lights to display the time, a text message, or the user’s heart rate. “We’re looking to get sensors in there for the big three,” said Tim. “Heart rate, body temperature, and blood pressure. Because then you are looking at this incredible data. Most people don’t know the effect on a man’s heart when he finds out his wife is cheating on him.”
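A rough sketch of that data flow might look like the following: a phone-side function frames a reading for Bluetooth, and an implant-side function turns it into an LED pattern. The message format and the scaling are invented for illustration; they are not Grindhouse Wetwares’ design.

    # Phone side: frame a sensor reading as a short message for the Bluetooth link.
    def encode_reading(kind, value):
        # e.g. kind="HR" (heart rate), value=72 becomes b"HR:72" on the wire
        return f"{kind}:{value}".encode("ascii")

    # Implant side: decode the message and choose which of the LEDs to light.
    def led_pattern(payload, num_leds=8):
        _kind, value = payload.decode("ascii").split(":")
        level = min(int(value), 200) * num_leds // 200  # scale the reading to the LED count
        return [i < level for i in range(num_leds)]     # True means the LED is on

    packet = encode_reading("HR", 72)
    print(led_pattern(packet))  # a resting heart rate lights a couple of the LEDs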

Cannon hopes to have the operation in the next few months. A big part of what drives the duo to move so fast is the idea that there is no hierarchy established in this space. “We want to be doing this before the FDA gets involved and starts telling us what we can and cannot do. Someday this will be commercially feasible and Apple will design an implant which will sync with your phone, but that is not going to be for us. We like to open things up and break them.”

I point out that Steve Jobs may have died in large part because he was reluctant to get surgery, afraid that if doctors opened him up, they might not be able to put him back together good as new. “We’re grinders,” said Cannon. “I view it as kind of taking the pain for the people who are going to come after me. We’re paying now so that it will become socially acceptable later.”

3rdi, 2010-2011. Photographed by Wafaa Bilal, Copyright: Wafaa Bilal
Image of Prof. Kevin Warwick courtesy of Prof. Kevin Warwick
Portrait of Prof. Kevin Warwick originally shot for Time Magazine by Jim Naughten

Calgary hail storm: Cloud seeding credited for sparing city from worse disaster (The Calgary Herald)

‘The storm was a monster,’ says weather modification company

BY THANDI FLETCHER, CALGARY HERALD AUGUST 14, 2012

Paul Newell captured dramatic images in the Bearspaw area of northwest Calgary just before the start of the hailstorm on Sunday, Aug. 12, 2012. Photograph by: Reader photo, Paul Newell

A ferocious storm that hammered parts of Calgary with hail stones larger than golf balls late Sunday, causing millions of dollars worth of damage, could have been much worse if cloud-seeding planes hadn’t attempted to calm it down.

“The storm was a monster,” said Terry Krauss, project director of the Alberta Severe Weather Management Society, which contracts American-based company Weather Modification Inc. to seed severe weather clouds in Alberta’s skies. The society is funded by a group of insurance companies with a goal of reducing hail damage claims.

Before the storm hit, Krauss said, the company sent all four of its cloud-seeding aircraft into the thick and swirling black clouds. The planes flew for more than 12 hours, shooting silver iodide, a chemical agent that helps limit the size of hail stones, at the top and base of the clouds, until midnight.

But despite the heavy seeding, golf-ball-sized hail stones pelted parts of Calgary late Sunday night, causing widespread damage to cars and homes.

“This one was a beast. It took everything we threw at it and still was able to wreak some havoc,” said Krauss. “I believe if we hadn’t seeded, it would have even been worse.”

Northeast Calgary, where the hail was between five and six centimetres, was hit hardest by the storm, said Environment Canada meteorologist John Paul Craig. Other parts of the city saw toonie-sized hail from a second storm system, he said.

Craig said Sunday’s storm was worse than Calgary’s last major hailstorm, which saw four-centimetre hail stones, in July 2010.

“These hail stones were just a little bit bigger,” he said.

At Royal Oak Audi in the city’s northwest, broken glass from smashed windows littered the lot Monday morning. Of the 85 new and used cars on the lot, general manager Murray Dorren said not a single car was spared from the storm.

“It’s devastating — that’s probably the best word I can come up with,” he said. “It’s unbelievable that Mother Nature can do this much damage in a very short time. I think it probably took a matter of 10 minutes and there’s millions of dollars worth of damage.”

Dorren estimated the damage at about $2 million. Across the lot, the dinged-up vehicles looked like dimpled golf balls from the repetitive pounding of the sizable stones. Some windows and sunroofs were shattered, while others were pierced by the heavy hail.

“They look like bullet holes right through the windscreen,” salesman Nick Berkland said of the damage.

Insurance companies and brokers were inundated with calls all day as customers tried to file claims on their wrecked cars and homes.

Ron Biggs, claims director for Intact Insurance, said it’s too early to tell how many claims the hail event will spur, although he said they received about two to three times their normal call volume on Monday.

Biggs said the level of damage so far appears to be similar to the July 2010 hailstorm, when Intact received about 12,000 hail damage claims.

Chief operating officer Bruce Rabik of Rogers Insurance, which insures several car dealerships in Calgary, said the damage is extensive.

“It’s certainly a bad one,” he said. “We’ve had one dealership, which they estimate 600 damaged cars. A couple other dealerships with 200 damaged cars each.”

Rabik said claims adjusters are overwhelmed with the volume of claims. He urged customers to be patient, as it may take a day or two for insurance workers to make their way to each home.

Shredded leaves, twigs and broken branches blanketed pathways along the Bow and Elbow rivers as city crews worked to clear them, said Calgary parks pathway lead Duane Sutherland.

“This was the worst that I’ve seen,” said Sutherland.

Once daylight broke Monday, Royal Oak resident Satya Mudlair inspected the exterior of his home, which was riddled with damage. “Lots of holes in the siding, window damage to the two bedroom windows, and the roof a little bit,” he said.

The apple tree in his backyard had also lost about half its apples, he said. Fortunately, his car was parked inside the garage and was spared any dents.

Mudlair said his insurance company told him it would take two or three weeks before the damage would be repaired. “There’s a big pile of names ahead of me,” he said.

Mudlair’s wife, Nirmalla, had just fallen asleep when she was awoken by the sound of hail stones hitting the roof.

“It was very bad. It was like, thump, thump,” she said, describing the pelting sound. “We got scared and I kept running from room to room.”

Cloud-seeding expert Krauss said Calgary has experienced more severe weather than usual this year, although Sunday’s storm was by far the worst.

“It has been a very stormy year,” he said.

© Copyright (c) The Calgary Herald

Occupy, Anthropology, and the 2011 Global Uprisings (Cultural Anthropology)

Hot spot – Occupy, Anthropology, and the 2011 Global Uprisings

Submitted by Cultural Anthropology on Fri, 2012-07-27 10:36

Introduction: Occupy, Anthropology, and the 2011 Global Uprisings

Guest Edited by Jeffrey S. Juris (Northeastern University) and Maple Razsa (Colby College)

Occupy Wall Street burst spectacularly onto the scene last fall with the take-over of New York City’s Zuccotti Park on September 17, 2011, followed by the rapid spread of occupations to cities throughout the US and the world. The movement combined mass occupations of urban public spaces with horizontal forms of organization and large-scale, directly democratic assemblies. Making effective use of the viral flows of images and information generated by the intersections of social and mass media, the occupations mobilized tens of thousands around the globe, including many new activists who had never taken part in a mass movement before, and inspired many more beyond the physical encampments themselves. Before the wave of violent police evictions in November and December of 2011 drove activists into submerged forms of organizing through the winter, the Occupy movements had already captured the public imagination. Bequeathing to us potent new memes such as the 1% (those at the top of the wealth and income scale) and the 99% (the rest of us), Occupy provided a framework for talking about issues that have been long obscured in public life such as class and socio-economic inequality and helped to shift the dominant political-economic discourse from an obsession with budget deficits and austerity to a countervailing concern for jobs, equality, and economic fairness.

In other words, prior to Occupy, much of the populist anger stemming from the 2008 financial crisis in North America and Europe had been effectively channeled by the Right into both an attack on marginalized groups—e.g. immigrants, people of color, Gays and Lesbians—and a particularly pernicious version of the already familiar critique of unbridled spending. This was especially so in the US where the Tea Party tapped into the widespread public ire over the Wall Street bailouts to bolster a far-reaching attack on “big government” through a radical program of fiscal austerity. Of course, the debt problem was a consequence rather than a cause of the crisis, the result of deregulation, predatory lending, and the spread of highly complex financial instruments facilitated by the neoliberal agenda of the very people who were now seeking to impose budgetary discipline (see Financial Crisis Hot Spot).

However, the contributions of Occupy are not exclusively, or even primarily, to be assessed in terms of their intervention in public discourse. The Occupy movements are also a response to a fundamental crisis of representative politics embodied in an embrace of more radical, directly democratic practices and forms. In their commitment to direct democracy and action the politics put into practice in the various encampments are also innovative prefigurative attempts to model alternative forms of political organization, decision making, and sociability. This turn is crucial: while neoliberalism has been endlessly critiqued it seems to live on as the only policy response—in the form of austerity—to the crisis neoliberalism itself has produced. The need for ethnographic accounts of this prefigurative politics, and its attendant challenges and contradictions, is especially urgent given that Occupy has refused official representatives and because occupiers have extended democracy beyond formal institutions into new spheres of life through a range of practices, including the collective seizure of public space, the people’s mic, horizontal organization, hand signals, and general assemblies.

It is also important to remember that Occupy was a relative latecomer—if a symbolically important one—to the social unrest the global crisis and policies of austerity have provoked. Cracks in the veneer of conformity emerged during the 2008 rebellion in Greece, where students, union members, and other social actors, galvanized by the murder of a fifteen-year-old student, took to the streets to challenge the worsening economic conditions (See Greece Hot Spot). Students were also among the first wave of resistance elsewhere with protests against budget cuts and increased fees in California, Croatia, the UK, and Chile. In the US signs of wider social discontent finally surfaced during the Wisconsin uprising in February 2011, which included the occupation of the Wisconsin State House in opposition to Governor Scott Walker’s attack on collective bargaining for public sector unions under the guise of budgetary discipline (cf. Collins 2012). As in Wisconsin, the widespread circulation of images from the Arab Spring continued to spark the intense feelings of solidarity, political possibility, and agency that ultimately led to the occupation of Wall Street. From the pro-democracy marches in Tunisia in response to the self-immolation of Mohammed Bouazizi to the mass occupations of Cairo’s Tahrir Square in opposition to the Egyptian dictator Hosni Mubarak, the Middle East uprisings imbued protesters with the sense that dramatic political transformation was possible even as subsequent events have indicated that actual political outcomes are always ambivalent and uncertain (see Arab Spring Hot Spot).

Inspired by the uprisings in Tunisia and Egypt and responding to the working and middle class casualties of Spain and Europe’s debt crisis, hundreds of thousands of protesters took to the streets of Madrid on May 15, 2011 and occupied the Puerta del Sol square, sparking a wave of similar mobilizations and encampments around Spain that would become known as 15M or the movement of the Indignados. Indeed, the combination of mass public occupations with large-scale participatory assemblies provided a template that would be enacted in Zuccotti Park, in part via the influence of Spanish activists residing in New York. That summer a similar movement of Israeli youths sprang up in Tel Aviv, using tent cities and popular assemblies to shine a light on the rising cost of housing and other living expenses.

Finally, in response to an August 2011 call by the Canadian magazine AdBusters to occupy Wall Street in the spirit of these 2011 Global uprisings, activists occupied Zuccotti Park after being rebuffed by the police in an attempt to take Wall Street itself. The occupation initially garnered little media attention, until its second week when images of police repression started going viral, leading to a surge in public sympathy and support, and ever growing numbers streaming to the encampments themselves each time another protester was maced or a group of seemingly innocent protesters rounded up, beaten, and/or arrested. Occupations quickly spread around the US and other parts of the world, generating, for a moment, a proliferating series of encampments physically rooted in local territories, yet linked up with other occupations through interpersonal and online trans-local networks. Following the evictions in the US last fall, local assemblies and working groups have continued to meet—hosting discussions, planning actions and campaigns, producing media, and building and modifying organizational forms—even as the Occupy movements prepared for their public reemergence in the spring through mobilizations such as the May Day protests and mass direct actions against NATO in Chicago and the European Central Bank in Frankfurt.

Additionally, each of these uprisings has diffused through the widespread use of social media, reflecting the mutually constitutive nature of embodied and online protest. The use of social media, in particular, has allowed the Occupy movements, as in other recent mobilizations, to penetrate deeply into the social fabric and mobilize many newcomers who have never been active before in social movements. At the same time, these emerging “logics of aggregation” within the Occupy movements have resulted in a more individualized mode of participation and a form of movement that is more singularizing (e.g. the way the 99% frame can obscure internal differences) and more dependent on the long-term occupation of public space than other recent movements (Juris 2012). A particular set of tensions and strategic dilemmas have thus plagued the Occupy movements, including a divide between newer and more seasoned activists, the difficulty of recognizing and negotiating internal differences, a lack of common political and organizational principles beyond the General Assembly model, and the difficulty of transitioning to new tactics, strategies, visions, and structures in a post-eviction era. In short, activists are now faced with fundamental questions about how to build a movement capable of actually transforming the deep inequalities they have attempted to address.

In assembling this Hot Spot on Occupy we have invited contributions from anthropologists, ethnographers, and activists writing on the above themes: the mass occupation of public spaces, directly democratic practices and forms, the use of social media, the emotions and emerging subjectivities of protest, as well as the underlying political critiques and contradictions that have arisen in the movement. Similarly, in light of the global history we outline above, the range of other social movement responses to the current global economic crisis, as well as the ongoing links between struggles in the US, Europe, Latin America, and North Africa, we have been careful to include contributors conducting research beyond the US in countries such as Greece, Slovenia, Spain, Israel, Argentina, Egypt, and Canada. In so doing, we insist that Occupy must be understood in a global rather than a populist US-centric framework.

Our collaboration on this Hot Spot—which emerged from conversations around our articles on Occupy in the May 2012 edition of American Ethnologist (Juris 2012; Razsa and Kurnik 2012)—also reflects our scholarly and political commitments, as well as those of our contributors. First, it was our priority to invite scholars and activists who are directly involved with these movements rather than adding to the abundant armchair punditry on Occupy. These contributions also reflect recent trends in anthropology with respect to the growing practice of activist research, militant ethnography, public anthropology, and other forms of politically committed ethnographic research, which are taking increasingly institutionalized forms with Cultural Anthropology “Hot Spots” like this one, “Public Anthropology Reviews” in American Anthropologist, recent interventions in American Ethnologist on Egypt, Wisconsin, and Occupy, as well as Current Anthropology “Current Applications.”

In addition to providing an ethnographically and analytically informed view of and from various occupations and kindred mobilizations, this Hot Spot thus provides another example of how anthropologists are making themselves politically relevant and are engaging issues of broad public concern. Given these shifts, together with the progressive inclinations of many anthropologists and the ubiquity and inherent interest of Occupy, it should come as no surprise that so many anthropologists and ethnographers from related fields, including those within and outside the academy, have played key roles in the Occupy movements and their precursors in countries such as Greece and Spain. Indeed, in their post Carles Feixa and his collaborators refer to anthropologists as the “organic intellectuals” of the 15M movement. As many of the contributions to this Hot Spot attest, a similar case might be made for the role of activist anthropologists within Occupy more generally.

As the contributions below make clear, our emphasis on participatory and politically committed research does not imply a romanticization of resistance or a refusal to confront the contradictions, limits, and exclusions of social movements, especially along axes of class, race, gender, sexuality, and citizenship. Given the disproportionate, though by no means exclusively White, middle class participation in the US Occupy movements, such critical perspectives are essential. Each of the following entries thus combines thick ethnographic description on the part of anthropologists, ethnographers, and activists who have been directly involved in the Occupy movements or other instances of mobilization during the 2011 global uprisings—either through engagement with one or more encampments and/or the themes addressed by Occupy—with critical analysis of one or more of the issues outlined above.

NOTES

[1] Occupy has thus addressed many of the same themes and drawn on many of the organizational practices associated with the global justice movements of a previous era, even as it has resonated more strongly with domestic national contexts of the Global north.

[2] The people’s mic is a form of voice amplification whereby everyone in listening distance repeats a speaker’s words so that others situated further away can also hear (See Garces, this Hot Spot).

[3] For example, in the U.S. local encampments created “Inter-Occupy” groups to maintain ties with other occupations, while Twitter feeds, listservs, websites, and other digital tools were used to communicate and coordinate more broadly. See our digital resources page for additional links.

REFERENCES

Collins, Jane. 2012. “Theorizing Wisconsin’s 2011 Protests: Community-Based Unionism Confronts Accumulation by Dispossession.” American Ethnologist 39 (1):6–20.

Juris, Jeffrey. 2012. “Reflections on #Occupy Everywhere: Social Media, Public Space, and Emerging Logics of Aggregation.” American Ethnologist 39 (2):259–279.

Razsa, Maple and Andrej Kurnik. 2012. “The Occupy Movement in Žižek’s Hometown: Direct Democracy and a Politics of Becoming.” American Ethnologist 39 (2):238-258.

***ESSAYS***

Prefigurative Politics

Marianne Maeckelbergh, Horizontal Decision-Making across Time and Place

Chris Garces, People’s Mic and ‘Leaderful’ Charisma

Philip Cartelli, Trying to Occupy Harvard

Public Space

Zoltán Glück, Between Wall Street and Zuccotti: Occupy and the Scale of Politics

Carles Feixa, et al., The #spanishrevolution and Beyond

Dimitris Dalakoglou, The Movement and the “Movement” of Syntagma Square

Experience and Subjectivity

Jeffrey S. Juris, The 99% and the Production of Insurgent Subjectivity

Diane Nelson, et al., Her earliest leaf’s a flower…

Maple Razsa, The Subjective Turn: The Radicalization of Personal Experience within Occupy Slovenia

Marina Sitrin, Occupy Trust: The Role of Emotion in the New Movements

Strategy and Tactics

David Graeber, Occupy Wall Street rediscovers the radical imagination

Kate Griffiths-Dingani, May Day, Precarity, Affective Labor, and the General Strike

Angelique Haugerud, Humor and Occupy Wall Street

Karen Ho, Occupy Finance and the Paradox/Possibilities of Productivity

Social Media

Alice Mattoni, Beyond Celebration: Toward a More Nuanced Assessment of Facebook’s Role in Occupy Wall Street

John Postill, Participatory Media Research and Spain’s 15M Movement

Critical Perspectives

Yvonne Yen Liu, Decolonizing the Occupy Movement

Manissa McCleave Maharawal, Fieldnotes on Union Square, Anti-Oppression, and Occupy

Uri Gordon, Israel’s “Tent Protests:” A Domesticated Mobilization

Alex Khasnabish, Occupy Nova Scotia: The Symbolism and Politics of Space

Post Normal Science: Deadlines (Climate Etc.)

Posted on August 3, 2012

by Steven Mosher

Science has changed. More precisely, in post normal conditions the behavior of people doing science has changed.

Ravetz describes a post normal situation by the following criteria:

  1. Facts are uncertain
  2. Values are in conflict
  3. Stakes are high
  4. Immediate action is required

The difference between Kuhnian normal science, or the behavior of those doing science under normal conditions, and post normal science is best illustrated by example. We can use the recent discovery of the Higgs Boson as an example. Facts were uncertain–they always are to a degree; no values were in conflict; the stakes were not high; and, immediate action was not required. What we see in that situation is those doing science acting as we expect them to, according to our vague ideal of science. Because facts are uncertain, they listen to various conflicting theories. They try to put those theories to a test. They face a shared uncertainty and in good faith accept the questions and doubts of others interested in the same field. Their participation in politics is limited to asking for money. Because values are not in conflict no theorist takes the time to investigate his opponent’s views on evolution or smoking or taxation. Because the field of personal values is never in play, personal attacks are minimized. Personal pride may be at stake, but values rarely are. The stakes for humanity in the discovery of the Higgs are low: at least no one argues that our future depends upon the outcome. No scientist straps himself to the collider and demands that it be shut down. And finally, immediate action is not required; under no theory is the settling of the uncertainty so important as to rush the result. In normal science, according to Kuhn, we can view the behavior of those doing science as puzzle solving. The details of a paradigm are filled out slowly and deliberately.

The situation in climate science is close to the polar opposite of this. That is not meant as, and should not be construed as, a criticism of climate science or its claims. The simple point is this: in a PNS situation, the behavior of those doing science changes. To be sure much of their behavior remains the same. They formulate theories; they collect data, and they test their theories against the data. They don’t stop doing what we notionally describe as science. But, as foreshadowed above in the description of how high energy particle physicists behave, one can see how that behavior changes in a PNS situation. There is uncertainty, but the good faith that exists in normal science, the faith that other people are asking questions because they actually want the answer, is gone. Asking questions, raising doubts, asking to see proof becomes suspect in and of itself. And those doing science are faced with a question that science cannot answer: Does this person really want the answer or are they a merchant of doubt? Such a question never gets asked in normal science. Normal science doesn’t ask this question because science cannot answer it.

Because values are in conflict the behavior of those doing science changes. In normal science no one would care if Higgs was a Christian or an atheist. No one would care if he voted liberal or conservative; but because two different value systems are in conflict in climate science, the behavior of those doing science changes. They investigate each other. They question motives. They form tribes. And because the stakes are high the behavior of those doing science changes as well. They protest; they take money from lobby groups on both sides and worst of all they perform horrendous raps on YouTube. In short, they become human; while those around them canonize them or demonize them and their findings become iconized or branded as hoaxes.

This brings us to the last aspect of a PNS situation: immediate action is required. This perhaps is the most contentious aspect of PNS; in fact I would argue it is the defining characteristic. In all PNS situations it is almost always the case that one side sees the need for action, given the truth of their theory, while the doubters must of necessity see no need for immediate action. They must see no need for immediate action because their values are at risk and because the stakes are high. Another way to put this is as follows. When you are in a PNS situation, all sides must deny it. Those demanding immediate action deny it by claiming more certainty* than is present; those refusing immediate action do so by increasing demands for certainty. This leads to a centralization and valorization of the topic of uncertainty, and epistemology becomes a topic of discussion for those doing science. That is decidedly not normal science.

The demand for immediate action, however, is broader than simply a demand that society changes. In a PNS situation the behavior of those doing science changes. One of the clearest signs that you are in PNS is the change in behavior around deadlines. Normal science has no deadline. In normal science, the puzzle is solved when it is solved. In normal science there may be a deadline to shut down the collider for maintenance. Nobody rushes the report to keep the collider running longer than it should. And if a good result is found, the schedules can be changed to accommodate the science. Broadly speaking, science drives the schedule; the schedule doesn’t drive the science.

The climategate mails are instructive here. As one reads through the mails it’s clear that the behavior of those doing science is not what one would call disinterested patient puzzle solving. Human beings acting in a situation where values are in conflict and stakes are high will engage in behavior that they might not otherwise. Those changes are most evident in situations surrounding deadlines. The point here is not to rehash The Crutape Letters but rather to relook at one incident (there are others, notably around congressional hearings) where deadlines came into play. The deadline in question was the deadline for submitting papers for consideration. As covered in The Crutape Letters and in The Hockeystick Illusion, the actions taken by those doing science around the “Jesus Paper” are instructive. In fact, were I to rewrite The Crutape Letters I would do it from the perspective of PNS, focusing on how the behavior of those doing science deviated from the ideals of openness, transparency and letting truth come on its own good time.

Climategate is about FOIA. There were two critical paths for FOIA: one sought data, the other sought the emails of scientists. Not quite normal. Not normal in that data is usually shared; not normal in that we normally respect the privacy of those doing science. But this is PNS, and all bets are off. Values and practices from other fields, such as business and government, are imported into the culture of science: Data hoarding is defended using IP and confidentiality agreements. Demanding private mail is defended using values imported from performing business for the public. In short, one sign that a science is post normal is the attempt to import values and procedures from related disciplines. Put another way, PNS poses the question of governance: Who runs science, and how should they run it?

The “Jesus paper” in a nutshell can be explained as follows. McIntyre and McKittrick had a paper published in the beginning of 2005. That paper needed to be rebutted in order to make Briffa’s job of writing chapter 6 easier. However, there was a deadline in play. Papers had to be accepted by a date certain. At one point Steven Schneider suggested the creation of a new category, a novelty – provisionally accepted – so that the “Jesus paper” could make the deadline. McIntyre covers the issue here. One need not re-adjudicate whether or not the IPCC rules were broken. And further, these rules have nothing whatsoever to do with the truth of the claims in that paper. This is not about the truth of the science. What is important is the importation of the concept of a deadline into the search for truth. What is important is that the behavior of those doing science changes. Truth suddenly cares about a date. Immediate action is required. In this case immediate action is taken to see to it that the paper makes it into the chapter. Normal science takes no notice of deadlines. In PNS, deadlines matter.

Last week we saw another example of deadlines and high stakes changing the behavior of those doing science. The backstory here explains. It appears to me that the behavior of those involved changed from what I have known it to be. It changed because they perceived that immediate action was required. A deadline had to be met. Again, as with the Jesus paper, the facts surrounding the release do not go to the truth of the claims. In normal science, a rushed claim might very well get the same treatment as an unrushed claim: It will be evaluated on its merits. In PNS, either the rush to meet an IPCC deadline – as in the case of the Jesus paper – or the rush to be ready for Congress – as in the Watts case – is enough for some to doubt the science. What has been testified to in Congress by Christy, a co-author, may very well be true. But in this high stakes arena, where facts are uncertain and values are in conflict, the behavior of those doing science can and does change. Not all their behavior changes. They still observe and test and report. But the manner in which they do that changes. Results are rushed and data is held in secret. Deadlines change everything. Normal science doesn’t operate this way; if it does, quality can suffer. And yet, the demand for more certainty than is needed, the bad faith game of delaying action by asking questions, precludes a naïve return to science without deadlines.

The solution that Ravetz suggests is extended peer review and a recognition of the importance of quality. In truth, the way out of a PNS situation is not that simple. The first step out of a PNS situation is the recognition that one is in the situation to begin with. Today, few people embroiled in this debate would admit that the situation has changed how they would normally behave. An admission that this isn’t working is a cultural crisis for science. No one has the standing to describe how one should conduct science in a PNS situation. No one has the standing to chart the path out of a PNS situation. The best we can do is describe what we see. Today, I observe that deadlines change the behavior of those doing science. We see that in climategate; we see that in the events of the past week. That doesn’t entail anything about the truth of science performed under pressure. But it should make us pause and consider if truth will be found any faster by rushing the results and hiding the data.

*I circulated a copy of this to Michael Tobis to get his reaction. MT took issue with this characterization. MT, I believe, originated the argument that our uncertainty is a reason for action. It is true that while the certainty about the science has been the dominant piece of the rhetoric, there has been a second thread of rhetoric that bases action in the uncertainty about sensitivity. I would call this certainty shifting. While the uncertainty about the facts of sensitivity is accepted in this line of argument, the certainty is shifted to certainty about values and certainty about impacts. In short, the argument becomes that while we are uncertain about sensitivity, the certainty we have about large impacts and trans-generational obligations necessitates action.

Irony Seen Through the Eye of MRI (Science Daily)

ScienceDaily (Aug. 3, 2012) — In the cognitive sciences, the capacity to interpret the intentions of others is called “Theory of Mind” (ToM). This faculty is involved in the understanding of language, in particular by bridging the gap between the meaning of the words that make up a statement and the meaning of the statement as a whole.

In recent years, researchers have identified the neural network dedicated to ToM, but no one had yet demonstrated that this set of neurons is specifically activated by the process of understanding of an utterance. This has now been accomplished: a team from L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) has shown that the activation of the ToM neural network increases when an individual is reacting to ironic statements.

Published in Neuroimage, these findings represent an important breakthrough in the study of Theory of Mind and linguistics, shedding light on the mechanisms involved in interpersonal communication.

In our communications with others, we are constantly thinking beyond the basic meaning of words. For example, if asked, “Do you have the time?” one would not simply reply, “Yes.” The gap between what is said and what it means is the focus of a branch of linguistics called pragmatics. In this science, “Theory of Mind” (ToM) gives listeners the capacity to fill this gap. In order to decipher the meaning and intentions hidden behind what is said, even in the most casual conversation, ToM relies on a variety of verbal and non-verbal elements: the words used, their context, intonation, “body language,” etc.

Within the past 10 years, researchers in cognitive neuroscience have identified a neural network dedicated to ToM that includes specific areas of the brain: the right and left temporal parietal junctions, the medial prefrontal cortex and the precuneus. To identify this network, the researchers relied primarily on non-verbal tasks based on the observation of others’ behavior[1]. Today, researchers at L2C2 (Laboratoire sur le Langage, le Cerveau et la Cognition, Laboratory on Language, the Brain and Cognition, CNRS / Université Claude Bernard-Lyon 1) have established, for the first time, the link between this neural network and the processing of implicit meanings.

To identify this link, the team focused their attention on irony. An ironic statement usually means the opposite of what is said. In order to detect irony in a statement, the mechanisms of ToM must be brought into play. In their experiment, the researchers prepared 20 short narratives in two versions, one literal and one ironic. Each story contained a key sentence that, depending on the version, yielded an ironic or literal meaning. For example, in one of the stories an opera singer exclaims after a premiere, “Tonight we gave a superb performance.” Depending on whether the performance was in fact very bad or very good, the statement is or is not ironic.

The team then carried out functional magnetic resonance imaging (fMRI) analyses on 20 participants who were asked to read 18 of the stories, chosen at random, in either their ironic or literal version. The participants were not aware that the test concerned the perception of irony. The researchers had predicted that the participants’ ToM neural networks would show increased activity in reaction to the ironic sentences, and that was precisely what they observed: as each key sentence was read, the network activity was greater when the statement was ironic. This shows that this network is directly involved in the processes of understanding irony, and, more generally, in the comprehension of language.
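The analysis implied by this design is a within-subject contrast: each participant contributes an average ToM-network activation for the ironic and for the literal key sentences, and the two conditions are compared. A minimal sketch of that comparison, using synthetic placeholder numbers rather than the L2C2 data, might look like this:

```python
# Paired comparison of ToM-network activation for ironic vs literal key sentences.
# All numbers are synthetic placeholders for illustration, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 20

# Hypothetical per-participant mean activation (arbitrary units) in the ToM network.
literal = rng.normal(loc=1.0, scale=0.3, size=n_participants)
ironic = literal + rng.normal(loc=0.25, scale=0.2, size=n_participants)  # assumed boost

t_stat, p_value = stats.ttest_rel(ironic, literal)
print(f"paired t({n_participants - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```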

Next, the L2C2 researchers hope to expand their research on the ToM network in order to determine, for example, whether test participants would be able to perceive irony if this network were artificially inactivated.

Note:

[1] For example, Grèzes, Frith & Passingham (J. Neuroscience, 2004) showed a series of short (3.5 second) films in which actors came into a room and lifted boxes. Some of the actors were instructed to act as though the boxes were heavier (or lighter) than they actually were. Having thus set up deceptive situations, the experimenters asked the participants to determine if they had or had not been deceived by the actors in the films. The films containing feigned actions elicited increased activity in the rTPJ (right temporal parietal junction) compared with those containing unfeigned actions.

Journal Reference:

Nicola Spotorno, Eric Koun, Jérôme Prado, Jean-Baptiste Van Der Henst, Ira A. Noveck. Neural evidence that utterance-processing entails mentalizing: The case of irony. NeuroImage, 2012; 63 (1): 25. DOI: 10.1016/j.neuroimage.2012.06.046

Brain Imaging Can Predict How Intelligent You Are: ‘Global Brain Connectivity’ Explains 10 Percent of Variance in Individual Intelligence (Science Daily)

ScienceDaily (Aug. 1, 2012) — When it comes to intelligence, what factors distinguish the brains of exceptionally smart humans from those of average humans?

New research suggests as much as 10 percent of individual variances in human intelligence can be predicted based on the strength of neural connections between the lateral prefrontal cortex and other regions of the brain. (Credit: WUSTL Image / Michael Cole)

As science has long suspected, overall brain size matters somewhat, accounting for about 6.7 percent of individual variation in intelligence. More recent research has pinpointed the brain’s lateral prefrontal cortex, a region just behind the temple, as a critical hub for high-level mental processing, with activity levels there predicting another 5 percent of variation in individual intelligence.

Now, new research from Washington University in St. Louis suggests that another 10 percent of individual differences in intelligence can be explained by the strength of neural pathways connecting the left lateral prefrontal cortex to the rest of the brain.

Published in the Journal of Neuroscience, the findings establish “global brain connectivity” as a new approach for understanding human intelligence.

“Our research shows that connectivity with a particular part of the prefrontal cortex can predict how intelligent someone is,” suggests lead author Michael W. Cole, PhD, a postdoctoral research fellow in cognitive neuroscience at Washington University.

The study is the first to provide compelling evidence that neural connections between the lateral prefrontal cortex and the rest of the brain make a unique and powerful contribution to the cognitive processing underlying human intelligence, says Cole, whose research focuses on discovering the cognitive and neural mechanisms that make human behavior uniquely flexible and intelligent.

“This study suggests that part of what it means to be intelligent is having a lateral prefrontal cortex that does its job well; and part of what that means is that it can effectively communicate with the rest of the brain,” says study co-author Todd Braver, PhD, professor of psychology in Arts & Sciences and of neuroscience and radiology in the School of Medicine. Braver is a co-director of the Cognitive Control and Psychopathology Lab at Washington University, in which the research was conducted.

One possible explanation of the findings, the research team suggests, is that the lateral prefrontal region is a “flexible hub” that uses its extensive brain-wide connectivity to monitor and influence other brain regions in a goal-directed manner.

“There is evidence that the lateral prefrontal cortex is the brain region that ‘remembers’ (maintains) the goals and instructions that help you keep doing what is needed when you’re working on a task,” Cole says. “So it makes sense that having this region communicating effectively with other regions (the ‘perceivers’ and ‘doers’ of the brain) would help you to accomplish tasks intelligently.”

While other regions of the brain make their own special contribution to cognitive processing, it is the lateral prefrontal cortex that helps coordinate these processes and maintain focus on the task at hand, in much the same way that the conductor of a symphony monitors and tweaks the real-time performance of an orchestra.

“We’re suggesting that the lateral prefrontal cortex functions like a feedback control system that is used often in engineering, that it helps implement cognitive control (which supports fluid intelligence), and that it doesn’t do this alone,” Cole says.

The findings are based on an analysis of functional magnetic resonance brain images captured as study participants rested passively and also when they were engaged in a series of mentally challenging tasks associated with fluid intelligence, such as indicating whether a currently displayed image was the same as one displayed three images ago.

Previous findings relating lateral prefrontal cortex activity to challenging task performance were supported. Connectivity was then assessed while participants rested, and their performance on additional tests of fluid intelligence and cognitive control collected outside the brain scanner was associated with the estimated connectivity.

Results indicate that levels of global brain connectivity with a part of the left lateral prefrontal cortex serve as a strong predictor of both fluid intelligence and cognitive control abilities.
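As described, the measure is the strength of a seed region’s resting-state connectivity with the rest of the brain, related across subjects to intelligence scores; roughly 10 percent of variance corresponds to a correlation of about r ≈ 0.32. A minimal sketch of that kind of analysis, using synthetic placeholder data rather than the study’s scans, might look like this:

```python
# "Global brain connectivity" sketch: the seed region's average resting-state
# correlation with every other region, computed per subject and then related to
# intelligence scores. All data are synthetic placeholders for illustration.
import numpy as np

rng = np.random.default_rng(42)
n_subjects, n_regions, n_timepoints = 30, 100, 200
SEED = 0  # index standing in for the left lateral prefrontal cortex

def global_connectivity(timeseries, seed_idx):
    """Mean Fisher-z correlation between the seed region and all other regions."""
    corr = np.corrcoef(timeseries)            # regions x regions correlation matrix
    others = np.delete(corr[seed_idx], seed_idx)
    return np.arctanh(others).mean()          # Fisher z-transform, then average

gbc = np.array([
    global_connectivity(rng.normal(size=(n_regions, n_timepoints)), SEED)
    for _ in range(n_subjects)
])
iq = 100 + 15 * rng.normal(size=n_subjects)   # hypothetical fluid-intelligence scores

r = np.corrcoef(gbc, iq)[0, 1]
print(f"r = {r:.2f}, variance explained = {r ** 2:.1%}")  # ~10% would mean r of about 0.32
```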

Although much remains to be learned about how these neural connections contribute to fluid intelligence, new models of brain function suggested by this research could have important implications for the future understanding — and perhaps augmentation — of human intelligence.

The findings also may offer new avenues for understanding how breakdowns in global brain connectivity contribute to the profound cognitive control deficits seen in schizophrenia and other mental illnesses, Cole suggests.

Other co-authors include Tal Yarkoni, PhD, a postdoctoral fellow in the Department of Psychology and Neuroscience at the University of Colorado at Boulder; Grega Repovs, PhD, professor of psychology at the University of Ljubljana, Slovenia; and Alan Anticevic, an associate research scientist in psychiatry at Yale University School of Medicine.

Funding from the National Institute of Mental Health supported the study (National Institutes of Health grants MH66088, NR012081, MH66078, MH66078-06A1W1, and 1K99MH096801).

Modern culture emerged in Africa 20,000 years earlier than thought (L.A.Times)

By Thomas H. Maugh II

July 30, 2012, 1:54 p.m.

Border Cave artifacts: Objects found in the archaeological site called Border Cave include a) a wooden digging stick; b) a wooden poison applicator; c) a bone arrow point decorated with a spiral incision filled with red pigment; d) a bone object with four sets of notches; e) a lump of beeswax; and f) ostrich eggshell beads and marine shell beads used as personal ornaments. (Francesco d’Errico and Lucinda Backwell / July 30, 2012)
Modern culture emerged in southern Africa at least 44,000 years ago, more than 20,000 years earlier than anthropologists had previously believed, researchers reported Monday.

That blossoming of technology and art occurred at roughly the same time that modern humans were migrating from Africa to Europe, where they soon displaced Neanderthals. Many of the characteristics of the ancient culture identified by anthropologists are still present in hunter-gatherer cultures of Africa today, such as the San culture of southern Africa, the researchers said.

The new evidence was provided by an international team of researchers excavating at an archaeological site called Border Cave in the foothills of the Lebombo Mountains on the border of KwaZulu-Natal in South Africa and Swaziland. The cave shows evidence of occupation by human ancestors going back more than 200,000 years, but the team reported in two papers in the Proceedings of the National Academy of Sciences that they were able to accurately date their discoveries to 42,000 to 44,000 years ago, a period known as the Later Stone Age or the Upper Paleolithic Period in Europe.

Among the organic — and thus datable — artifacts the team found in the cave were ostrich eggshell beads, thin bone arrowhead points, wooden digging sticks, a gummy substance called pitch that was used to attach bone and stone blades to wooden shafts, a lump of beeswax likely used for the same purpose, worked pig tusks that were probably used for planing wood, and notched bones used for counting.

“They adorned themselves with ostrich egg and marine shell beads, and notched bones for notational purposes,” said paleoanthropologist Lucinda Blackwell of the University of Witwatersrand in South Africa, a member of the team. “They fashioned fine bone points for use as awls and poisoned arrowheads. One point is decorated with a spiral groove filled with red ochre, which closely parallels similar marks that San make to identify their arrowheads when hunting.”

The very thin bone points are “very good evidence” for the use of bows and arrows, said co-author Paola Villa, a curator at the University of Colorado Museum of Natural History. Some of the bone points were apparently coated with ricinoleic acid, a poison made from the castor bean. “Such bone points could have penetrated thick hides, but the lack of ‘knock-down’ power means the use of poison probably was a requirement for successful kills,” she said.

The discovery also represents the first time pitch-making has been documented in South Africa, Villa said. The process requires burning peeled bark in the absence of air. The Stone Age residents probably dug holes in the ground, inserted the bark, lit it on fire, and covered the holes with stones, she said.

The Conversion of a Climate-Change Skeptic (N.Y.Times)

OP-ED CONTRIBUTOR

By RICHARD A. MULLER

Published: July 28, 2012

Berkeley, Calif.

CALL me a converted skeptic. Three years ago I identified problems in previous climate studies that, in my mind, threw doubt on the very existence of global warming. Last year, following an intensive research effort involving a dozen scientists, I concluded that global warming was real and that the prior estimates of the rate of warming were correct. I’m now going a step further: Humans are almost entirely the cause.

My total turnaround, in such a short time, is the result of careful and objective analysis by the Berkeley Earth Surface Temperature project, which I founded with my daughter Elizabeth. Our results show that the average temperature of the earth’s land has risen by two and a half degrees Fahrenheit over the past 250 years, including an increase of one and a half degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases.

These findings are stronger than those of the Intergovernmental Panel on Climate Change, the United Nations group that defines the scientific and diplomatic consensus on global warming. In its 2007 report, the I.P.C.C. concluded only that most of the warming of the prior 50 years could be attributed to humans. It was possible, according to the I.P.C.C. consensus statement, that the warming before 1956 could be because of changes in solar activity, and that even a substantial part of the more recent warming could be natural.

Our Berkeley Earth approach used sophisticated statistical methods developed largely by our lead scientist, Robert Rohde, which allowed us to determine earth land temperature much further back in time. We carefully studied issues raised by skeptics: biases from urban heating (we duplicated our results using rural data alone), from data selection (prior groups selected fewer than 20 percent of the available temperature stations; we used virtually 100 percent), from poor station quality (we separately analyzed good stations and poor ones) and from human intervention and data adjustment (our work is completely automated and hands-off). In our papers we demonstrate that none of these potentially troublesome effects unduly biased our conclusions.

The historic temperature pattern we observed has abrupt dips that match the emissions of known explosive volcanic eruptions; the particulates from such events reflect sunlight, make for beautiful sunsets and cool the earth’s surface for a few years. There are small, rapid variations attributable to El Niño and other ocean currents such as the Gulf Stream; because of such oscillations, the “flattening” of the recent temperature rise that some people claim is not, in our view, statistically significant. What has caused the gradual but systematic rise of two and a half degrees? We tried fitting the shape to simple math functions (exponentials, polynomials), to solar activity and even to rising functions like world population. By far the best match was to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice.
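The fitting exercise Muller describes can be illustrated with a simple regression: regress the land-temperature series on each candidate driver and compare the share of variance each one captures. The series below are synthetic placeholders with an assumed log-CO2 shape; this is a minimal sketch of the idea, not the Berkeley Earth data or code.

```python
# Attribution-by-curve-fitting sketch: compare how well log CO2 and a solar proxy
# match a temperature series. All series are synthetic placeholders; this is not
# the Berkeley Earth data or methodology, only an illustration of the idea.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1753, 2012)

co2 = 280.0 * np.exp(5.2e-6 * (years - 1753) ** 2)   # ppm, illustrative rise to ~395
solar = np.sin(2 * np.pi * (years - 1753) / 11.0)    # crude 11-year solar-cycle proxy

# Synthetic "observed" anomaly: follows log CO2 (as the op-ed reports) plus noise.
temp = 4.0 * np.log(co2 / co2[0]) + rng.normal(scale=0.15, size=years.size)

def r_squared(predictor, target):
    """Fraction of variance in `target` captured by a linear fit to `predictor`."""
    design = np.column_stack([predictor, np.ones_like(predictor)])
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    residual = target - design @ coef
    return 1.0 - residual.var() / target.var()

print(f"log CO2 fit:     R^2 = {r_squared(np.log(co2 / co2[0]), temp):.2f}")
print(f"solar proxy fit: R^2 = {r_squared(solar, temp):.2f}")
```

In the actual analysis the comparison was against measured forcings rather than synthetic ones, but the logic is the same: pick the driver whose shape best explains the variance in the record.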

Just as important, our record is long enough that we could search for the fingerprint of solar variability, based on the historical record of sunspots. That fingerprint is absent. Although the I.P.C.C. allowed for the possibility that variations in sunlight could have ended the “Little Ice Age,” a period of cooling from the 14th century to about 1850, our data argues strongly that the temperature rise of the past 250 years cannot be attributed to solar changes. This conclusion is, in retrospect, not too surprising; we’ve learned from satellite measurements that solar activity changes the brightness of the sun very little.

How definite is the attribution to humans? The carbon dioxide curve gives a better match than anything else we’ve tried. Its magnitude is consistent with the calculated greenhouse effect — extra warming from trapped heat radiation. These facts don’t prove causality and they shouldn’t end skepticism, but they raise the bar: to be considered seriously, an alternative explanation must match the data at least as well as carbon dioxide does. Adding methane, a second greenhouse gas, to our analysis doesn’t change the results. Moreover, our analysis does not depend on large, complex global climate models, the huge computer programs that are notorious for their hidden assumptions and adjustable parameters. Our result is based simply on the close agreement between the shape of the observed temperature rise and the known greenhouse gas increase.

It’s a scientist’s duty to be properly skeptical. I still find that much, if not most, of what is attributed to climate change is speculative, exaggerated or just plain wrong. I’ve analyzed some of the most alarmist claims, and my skepticism about them hasn’t changed.

Hurricane Katrina cannot be attributed to global warming. The number of hurricanes hitting the United States has been going down, not up; likewise for intense tornadoes. Polar bears aren’t dying from receding ice, and the Himalayan glaciers aren’t going to melt by 2035. And it’s possible that we are currently no warmer than we were a thousand years ago, during the “Medieval Warm Period” or “Medieval Optimum,” an interval of warm conditions known from historical records and indirect evidence like tree rings. And the recent warm spell in the United States happens to be more than offset by cooling elsewhere in the world, so its link to “global” warming is weaker than tenuous.

The careful analysis by our team is laid out in five scientific papers now online at BerkeleyEarth.org. That site also shows our chart of temperature from 1753 to the present, with its clear fingerprint of volcanoes and carbon dioxide, but containing no component that matches solar activity. Four of our papers have undergone extensive scrutiny by the scientific community, and the newest, a paper with the analysis of the human component, is now posted, along with the data and computer programs used. Such transparency is the heart of the scientific method; if you find our conclusions implausible, tell us of any errors of data or analysis.

What about the future? As carbon dioxide emissions increase, the temperature should continue to rise. I expect the rate of warming to proceed at a steady pace, about one and a half degrees over land in the next 50 years, less if the oceans are included. But if China continues its rapid economic growth (it has averaged 10 percent per year over the last 20 years) and its vast use of coal (it typically adds one new gigawatt per month), then that same warming could take place in less than 20 years.

Science is that narrow realm of knowledge that, in principle, is universally accepted. I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.

Richard A. Muller, a professor of physics at the University of California, Berkeley, and a former MacArthur Foundation fellow, is the author, most recently, of “Energy for Future Presidents: The Science Behind the Headlines.”

*   *   *

Climate change study forces sceptical scientists to change minds (The Guardian)

Earth’s land shown to have warmed by 1.5C over past 250 years, with humans being almost entirely responsible

Leo Hickman
guardian.co.uk, Sunday 29 July 2012 14.03 BST

Prof Richard Muller considers himself a converted sceptic following the study’s surprise results. Photograph: Dan Tuffs for the Guardian

The Earth’s land has warmed by 1.5C over the past 250 years and “humans are almost entirely the cause”, according to a scientific study set up to address climate change sceptics’ concerns about whether human-induced global warming is occurring.

Prof Richard Muller, a physicist and climate change sceptic who founded the Berkeley Earth Surface Temperature (Best) project, said he was surprised by the findings. “We were not expecting this, but as scientists, it is our duty to let the evidence change our minds.” He added that he now considers himself a “converted sceptic” and his views had undergone a “total turnaround” in a short space of time.

“Our results show that the average temperature of the Earth’s land has risen by 2.5F over the past 250 years, including an increase of 1.5 degrees over the most recent 50 years. Moreover, it appears likely that essentially all of this increase results from the human emission of greenhouse gases,” Muller wrote in an opinion piece for the New York Times.


The team of scientists based at the University of California, Berkeley, gathered and merged a collection of 14.4m land temperature observations from 44,455 sites across the world dating back to 1753. Previous data sets created by Nasa, the US National Oceanic and Atmospheric Administration, and the Met Office and the University of East Anglia’s climate research unit only went back to the mid-1800s and used a fifth as many weather station records.

The funding for the project included $150,000 from the Charles G Koch Charitable Foundation, set up by the billionaire US coal magnate and key backer of the climate-sceptic Heartland Institute thinktank. The research also received $100,000 from the Fund for Innovative Climate and Energy Research, which was created by Bill Gates.

Unlike previous efforts, the temperature data from various sources was not homogenised by hand – a key criticism by climate sceptics. Instead, the statistical analysis was “completely automated to reduce human bias”. The Best team concluded that, despite their deeper analysis, their own findings closely matched the previous temperature reconstructions, “but with reduced uncertainty”.

Last October, the Best team published results that showed the average global land temperature has risen by about 1C since the mid-1950s. But the team did not look for possible fingerprints to explain this warming. The latest data analysis reached much further back in time but, crucially, also searched for the most likely cause of the rise by plotting the upward temperature curve against suspected “forcings”. It analysed the warming impact of solar activity – a popular theory among climate sceptics – but found that, over the past 250 years, the contribution of the sun has been “consistent with zero”. Volcanic eruptions were found to have caused short dips in the temperature rise in the period 1750–1850, but “only weak analogues” in the 20th century.

“Much to my surprise, by far the best match came to the record of atmospheric carbon dioxide, measured from atmospheric samples and air trapped in polar ice,” said Muller. “While this doesn’t prove that global warming is caused by human greenhouse gases, it is currently the best explanation we have found, and sets the bar for alternative explanations.”
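The “curve fitting” Muller refers to can be pictured with a minimal sketch: regress a land-temperature series on a CO2 term and a volcanic term by least squares and see which contribution dominates. This is illustrative only, not the Best team’s code; the series below are synthetic stand-ins and the log-CO2 functional form is an assumption made for the example.

```python
# Minimal attribution-by-curve-fitting sketch (illustrative, synthetic data).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1753, 2012)

co2 = 280.0 + 115.0 * ((years - 1753) / (2011 - 1753)) ** 3        # ppm, slow-then-fast rise
volcanic = np.zeros(years.size)
volcanic[np.isin(years, [1815, 1883, 1963, 1991])] = 1.0            # major eruption years
temp = 2.2 * np.log(co2 / 280.0) - 0.3 * volcanic + rng.normal(0, 0.15, years.size)

# Regress temperature on an intercept, a CO2 term and a volcanic term.
X = np.column_stack([np.ones(years.size), np.log(co2 / 280.0), volcanic])
coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
print("intercept, CO2 coefficient, volcanic coefficient:", np.round(coef, 2))
```

With real station data in place of the synthetic series, the size and quality of the CO2 fit relative to the alternatives is the kind of comparison the article describes.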

Muller said his team’s findings went further and were stronger than the latest report published by the Intergovernmental Panel on Climate Change.

In an unconventional move aimed at appeasing climate sceptics by allowing “full transparency”, the results have been publicly released before being peer reviewed by the Journal of Geophysical Research. All the data and analysis are now available to be freely scrutinised at the Best website. This follows the pattern of previous Best results, none of which have yet been published in peer-reviewed journals.

When the Best project was announced last year, the prominent climate sceptic blogger Anthony Watts was consulted on the methodology. He stated at the time: “I’m prepared to accept whatever result they produce, even if it proves my premise wrong.” However, tensions have since arisen between Watts and Muller.

Early indications suggest that climate sceptics are unlikely to fully accept Best’s latest results. Prof Judith Curry, a climatologist at the Georgia Institute of Technology who runs a blog popular with climate sceptics and who is a consulting member of the Best team, told the Guardian that the method used to attribute the warming to human emissions was “way over-simplistic and not at all convincing in my opinion”. She added: “I don’t think this question can be answered by the simple curve fitting used in this paper, and I don’t see that their paper adds anything to our understanding of the causes of the recent warming.”

Prof Michael Mann, the Penn State palaeoclimatologist who has faced hostility from climate sceptics for his famous “hockey stick” graph showing a rapid rise in temperatures during the 20th century, said he welcomed the Best results as they “demonstrated once again what scientists have known with some degree of certainty for nearly two decades”. He added: “I applaud Muller and his colleagues for acting as any good scientists would, following where their analyses led them, without regard for the possible political repercussions. They are certain to be attacked by the professional climate change denial crowd for their findings.”

Muller said his team’s analysis suggested there would be 1.5 degrees of warming over land in the next 50 years, but if China continues its rapid economic growth and its vast use of coal then that same warming could take place in less than 20 years.

“Science is that narrow realm of knowledge that, in principle, is universally accepted,” wrote Muller. “I embarked on this analysis to answer questions that, to my mind, had not been answered. I hope that the Berkeley Earth analysis will help settle the scientific debate regarding global warming and its human causes. Then comes the difficult part: agreeing across the political and diplomatic spectrum about what can and should be done.”

Computers Can Predict Effects of HIV Policies, Study Suggests (Science Daily)

ScienceDaily (July 27, 2012) — Policymakers in the fight against HIV/AIDS may have to wait years, even decades, to know whether strategic choices among possible interventions are effective. How can they make informed choices in an age of limited funding? A reliable, well-calibrated, predictive computer simulation would be a great help.

A visualization generated by an agent-based model of New York City’s HIV epidemic shows the risky interactions of unprotected sex or needle sharing among injection drug users (red), non-injection drug users (blue) and non-users (green). (Credit: Brandon Marshall/Brown University)

Policymakers struggling to stop the spread of HIV grapple with “what if” questions on the scale of millions of people and decades of time. They need a way to predict the impact of many potential interventions, alone or in combination. In two papers to be presented at the 2012 International AIDS Society Conference in Washington, D.C., Brandon Marshall, assistant professor of epidemiology at Brown University, will unveil a computer program calibrated to model accurately the spread of HIV in New York City over a decade and to make specific predictions about the future of the epidemic under various intervention scenarios.

“It reflects what’s seen in the real world,” said Marshall. “What we’re trying to do is identify the ideal combination of interventions to reduce HIV most dramatically in injection drug users.”

In an analysis that he’ll present on July 27, Marshall projects that with no change in New York City’s current programs, the infection rate among injection drug users will be 2.1 per 1,000 in 2040. Expanding HIV testing would drop the rate only 12 percent to 1.9 per 1,000; increasing drug treatment would reduce the rate 26 percent to 1.6 per 1,000; providing earlier delivery of antiretroviral therapy and better adherence would drop the rate 45 percent to 1.2 per 1,000; and expanding needle exchange programs would reduce the rate 34 percent to 1.4 per 1,000. Most importantly, doing all four of those things would cut the rate by more than 60 percent, to 0.8 per 1,000.

Virtual reality, real choices

The model is unique in that it creates a virtual reality of 150,000 “agents,” a programming term for simulated individuals who, in the case of the model, engage in drug use and sexual activity like real people.

Like characters in an all-too-serious video game, the agents behave in a world governed by biological rules, such as how often the virus can be transmitted through encounters such as unprotected gay sex or needle sharing.

With each run of the model, agents accumulate a detailed life history. For example, in one run, agent 89,425, who is male and has sex with men, could end up injecting drugs. He participates in needle exchanges, but according to the built-in probabilities, in year three he shares needles multiple times with another injection drug user with whom he is also having unprotected sex. In the last of those encounters, agent 89,425 becomes infected with HIV. In year four he starts participating in drug treatment and in year five he gets tested for HIV, starts antiretroviral treatment, and reduces the frequency with which he has unprotected sex. Because he always takes his HIV medications, he never transmits the virus further.

That level of individual detail allows for a detailed examination of transmission networks and how interventions affect them.
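A toy version of such an agent-based scheme makes the mechanics concrete. The sketch below is illustrative only, not Marshall’s model: the population size, attributes, contact rates and transmission probabilities are invented for the example.

```python
# Toy agent-based sketch of HIV spread (illustrative parameters only).
import random

random.seed(1)

class Agent:
    def __init__(self, idu, infected=False):
        self.idu = idu          # injection drug user?
        self.infected = infected
        self.on_art = False     # on antiretroviral therapy?

def yearly_step(agents, p_transmit=0.03, art_reduction=0.95):
    for source in [a for a in agents if a.infected]:
        contacts = 6 if source.idu else 2       # more risky encounters for drug injectors
        for _ in range(contacts):
            partner = random.choice(agents)
            if partner is source or partner.infected:
                continue
            p = p_transmit * ((1 - art_reduction) if source.on_art else 1.0)
            if random.random() < p:             # unprotected sex or needle sharing
                partner.infected = True

agents = [Agent(idu=(i < 1500), infected=(i < 30)) for i in range(15000)]
for year in range(10):
    yearly_step(agents)
print("simulated prevalence after 10 years:", sum(a.infected for a in agents) / len(agents))
```

An intervention such as expanded needle exchange or earlier antiretroviral therapy would be represented by changing the corresponding attributes or probabilities and re-running the simulation many times.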

“With this model you can really look at the microconnections between people,” said Marshall, who began working on the model as a postdoctoral fellow at Columbia University and has continued to develop it since coming to Brown in January. “That’s something that we’re really excited about.”

To calibrate the model, Marshall and his colleagues found the best New York City data they could about how many people use drugs, what percentage of people were gay or lesbian, the probabilities of engaging in unprotected sex and needle sharing, viral transmission, access to treatment, treatment effectiveness, participation in drug treatment, progression from HIV infection to AIDS, and many more behavioral, social and medical factors. They also continuously calibrated it until the model could faithfully reproduce the infection rates among injection drug users that were known to occur in New York between 1992 and 2002.

And they don’t just run the simulation once. They run it thousands of times on a supercomputer at Brown to be sure the results they see are reliable.
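The calibration step can likewise be pictured as a parameter sweep: run the stochastic model repeatedly for each candidate value and keep the value whose average output best matches the historical series. Again, this is a sketch under assumed, illustrative numbers, not the study’s actual procedure.

```python
# Rough calibration sketch: sweep one parameter against a historical series.
import random
import statistics

random.seed(0)

historical = [3.4, 3.1, 2.9, 2.8, 2.6]   # illustrative yearly rates per 1,000

def simulate_rates(p_transmit, runs=50):
    # stand-in for the agent-based model above; one noisy rate per year per run
    return [[p_transmit * 60 * (0.97 ** yr) + random.gauss(0, 0.1)
             for yr in range(len(historical))] for _ in range(runs)]

def error(p_transmit):
    per_year = list(zip(*simulate_rates(p_transmit)))
    means = [statistics.mean(vals) for vals in per_year]
    return sum((m - h) ** 2 for m, h in zip(means, historical))

candidates = [i * 0.005 for i in range(1, 21)]          # 0.005 ... 0.10
best = min(candidates, key=error)
print("best-fitting transmission probability:", round(best, 3))
```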

Future applications

At Brown, Marshall is continuing to work on other aspects of the model, including an analysis of the cost effectiveness of each intervention and their combinations. Cost is, after all, another fact of life that policymakers and public health officials must weigh.

And then there’s the frustrating insight that the infection rate, even with four strengthened interventions underway, didn’t reduce the projected epidemic by much more than half.

“I actually expected something larger,” Marshall said. “That speaks to how hard we have to work to make sure that drug users can access and benefit from proven interventions to reduce the spread of HIV.”

Marshall’s collaborators on the model include Magdalena Paczkowski, Lars Seemann, Barbara Tempalski, Enrique Pouget, Sandro Galea, and Samuel Friedman.

The National Institutes of Health and the Lifespan/Tufts/Brown Center for AIDS Research provide financial support for the model’s continued development.

A Century Of Weather Control (POP SCI)

Posted 7.19.12 at 6:20 pm – http://www.popsci.com

 

Keeping Pilots Updated, November 1930

It’s 1930 and, for obvious reasons, pilots want regular reports on the weather. What to do? Congress’s solution was to give the U.S. Weather Bureau cash to send them what they needed. It was a lot of cash, too: $1.4 million, or “more than one third the sum it spends annually for all of its work.”

About 13,000 miles of airway were monitored for activity, and reports were regularly sent via the now quaintly named “teletype”–an early teleprinter, basically, that reproduced a typed message at the receiving end. Pilots were then radioed with the information.

From the article “Weather Man Makes the Air Safe.”

 

Battling Hail, July 1947

We weren’t shy about laying on the drama in this piece on hail–it was causing millions in damage across the country and we were sick of it. Our writer says, “The war against hail has been declared.” (Remember: this was only two years after World War II, which was a little more serious. Maybe our patriotism just wouldn’t wane.)

The idea was to scatter silver iodide as a form of “cloud seeding”–turning the moisture to snow before it hails. It’s a process that’s still toyed with today.

From the article “The War Against Hail.”

 

Hunting for a Tornado “Cure,” March 1958

1957 was a record-breaking year for tornadoes, and PopSci was forecasting even rougher skies for 1958. As described by an official tornado watcher: “They’re coming so fast and thick … that we’ve lost count.”

To find a way to stop them, researchers wanted to learn more. Meteorologists asked Congress for $5 million more a year to study the tornadoes whirling through the Midwest’s Tornado Alley, then, hopefully, learn what they needed to do to stop them.

From the article “What We’re Learning About Tornadoes.”

 

Spotting Clouds With Nimbus, November 1963

Weather satellites were a boon to both forecasters and anyone affected by extreme weather. The powerful Hurricane Esther was spotted from orbit two days before any other method detected it, leaving space engineers “justifiably proud.” The next satellite in line was the Nimbus, which Popular Science devoted multiple pages to covering, highlighting its ability to photograph cloud cover 24 hours a day and give us better insight into extreme weather.

Spoiler: the results really did turn out great, with Nimbus satellites paving the way for modern GPS devices.

From the article “The Weather Eye That Never Blinks.”

 

Saving Money Globally With Forecasts, November 1970

Optimism for weather satellites seemed to be reaching a high by the ’70s, with Popular Science recounting all the disasters predicted–how they “saved countless lives through early hurricane warnings”–and now even saying they’d save your vacation.

What they were hoping for then was an accurate five-day forecast for the world, which they predicted would save billions and make early warnings even better.

From the article “How New Weather Satellites Will Give You More Reliable Forecasts.”

 

Extreme Weather Alerts on the Radio, July 1979

Those weather alerts that come on your television during a storm–or at least one radio version of those–were documented by Popular Science in 1979. But rather than being something that anyone could tune in to, they were specialized radios you had to purchase, which seems like a less-than-great solution to the problem. But at this point the government had plans to set up weather monitoring stations near 90 percent of the country’s population, opening the door for people to find out fast what the weather situation was.

From the article “Weather-Alert Radios–They Could Save Your Life.”

 

Stopping “Bolts From the Blue,” May 1990

Here Popular Science let loose a whopper for anyone with a fear of extreme weather: lightning kills a lot more people every year than you think, and sometimes a lightning bolt will come and hit you even when there’s not a storm. So-called “bolts from the blue” were a part of the story on better predicting lightning, a phenomenon more manic than most types of weather. Improved sensors played a major part in better preparing people before a storm.

From the article “Predicting Deadly Lightning.”

 

Infrared Views of Weather, August 1983

Early access to computers let weather scientists get a 3-D, radar-based view of weather across the country. The system culled information from multiple sources and placed it in one viewable display. (The man pictured looks slightly bored for how revolutionary it is.) The system was an attempt to take global information and make it into “real-time local predictions.”

From the article “Nowcasting: New Weather Computers Pinpoint Deadly Storms.”

 

Modernizing the National Weather Service, August 1997

A year’s worth of weather detection for every American was coming at the price of “a Big Mac, fries, and a Coke,” the deputy director of the National Weather Service said in 1997. The computer age better tied together the individual parts of weather forecasting for the NWS, leaving a unified whole that could grab complicated meteorological information and interpret it in just a few seconds.

From the article “Weather’s New Outlook.”

 

Modeling Weather With Computers, September 2001

Computer simulations, we wrote, would help us predict future storms more accurately. But it took (at the time) the largest supercomputer around to give us the kinds of models we wanted. Judging by the image, we might’ve already made significant progress on the weather modeling front.

Researchers Produce First Complete Computer Model of an Organism (Science Daily)

ScienceDaily (July 21, 2012) — In a breakthrough effort for computational biology, the world’s first complete computer model of an organism has been built, Stanford researchers reported last week in the journal Cell.

The Covert Lab incorporated more than 1,900 experimentally observed parameters into their model of the tiny parasite Mycoplasma genitalium. (Credit: Illustration by Erik Jacobsen / Covert Lab)

A team led by Markus Covert, assistant professor of bioengineering, used data from more than 900 scientific papers to account for every molecular interaction that takes place in the life cycle of Mycoplasma genitalium, the world’s smallest free-living bacterium.

By encompassing the entirety of an organism in silico, the paper fulfills a longstanding goal for the field. Not only does the model allow researchers to address questions that aren’t practical to examine otherwise, it represents a stepping-stone toward the use of computer-aided design in bioengineering and medicine.

“This achievement demonstrates a transforming approach to answering questions about fundamental biological processes,” said James M. Anderson, director of the National Institutes of Health Division of Program Coordination, Planning and Strategic Initiatives. “Comprehensive computer models of entire cells have the potential to advance our understanding of cellular function and, ultimately, to inform new approaches for the diagnosis and treatment of disease.”

The research was partially funded by an NIH Director’s Pioneer Award from the National Institutes of Health Common Fund.

From information to understanding

Biology over the past two decades has been marked by the rise of high-throughput studies producing enormous troves of cellular information. A lack of experimental data is no longer the primary limiting factor for researchers. Instead, it’s how to make sense of what they already know.

Most biological experiments, however, still take a reductionist approach to this vast array of data: knocking out a single gene and seeing what happens.

“Many of the issues we’re interested in aren’t single-gene problems,” said Covert. “They’re the complex result of hundreds or thousands of genes interacting.”

This situation has resulted in a yawning gap between information and understanding that can only be addressed by “bringing all of that data into one place and seeing how it fits together,” according to Stanford bioengineering graduate student and co-first author Jayodita Sanghvi.

Integrative computational models clarify data sets whose sheer size would otherwise place them outside human ken.

“You don’t really understand how something works until you can reproduce it yourself,” Sanghvi said.

Small is beautiful

Mycoplasma genitalium is a humble parasitic bacterium known mainly for showing up uninvited in human urogenital and respiratory tracts. But the pathogen also has the distinction of containing the smallest genome of any free-living organism — only 525 genes, as opposed to the 4,288 of E. coli, a more traditional laboratory bacterium.

Despite the difficulty of working with this sexually transmitted parasite, the minimalism of its genome has made it the focus of several recent bioengineering efforts. Notably, these include the J. Craig Venter Institute’s 2008 synthesis of the first artificial chromosome.

“The goal hasn’t only been to understand M. genitalium better,” said co-first author and Stanford biophysics graduate student Jonathan Karr. “It’s to understand biology generally.”

Even at this small scale, the quantity of data that the Stanford researchers incorporated into the virtual cell’s code was enormous. The final model made use of more than 1,900 experimentally determined parameters.

To integrate these disparate data points into a unified machine, the researchers modeled individual biological processes as 28 separate “modules,” each governed by its own algorithm. These modules then communicated with each other after every time step, making for a unified whole that closely matched M. genitalium’s real-world behavior.
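A stripped-down sketch of that modular structure can help picture how it works. This is an assumed arrangement for illustration only; the actual model’s 28 modules, algorithms and units are far richer than the two toy processes below.

```python
# Illustrative module-and-shared-state scheme (not the Covert Lab code).
class Module:
    """One sub-model; each runs its own algorithm for a single time step."""
    def step(self, state, dt):
        raise NotImplementedError

class Metabolism(Module):
    def step(self, state, dt):
        state["nucleotides"] = state.get("nucleotides", 0.0) + 3.0 * dt   # toy production

class DnaReplication(Module):
    def step(self, state, dt):
        # copying DNA consumes whatever nucleotides the shared state makes available
        use = min(state.get("nucleotides", 0.0), 5.0 * dt)
        state["nucleotides"] -= use
        state["dna_copied"] = state.get("dna_copied", 0.0) + use

def simulate(modules, state, steps, dt=1.0):
    for _ in range(steps):
        for m in modules:          # every module advances once per time step ...
            m.step(state, dt)      # ... and communicates through the shared cell state
    return state

print(simulate([Metabolism(), DnaReplication()], {"nucleotides": 0.0}, steps=20))
```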

Probing the silicon cell

The purely computational cell opens up procedures that would be difficult to perform in an actual organism, as well as opportunities to reexamine experimental data.

In the paper, the model is used to demonstrate a number of these approaches, including detailed investigations of DNA-binding protein dynamics and the identification of new gene functions.

The program also allowed the researchers to address aspects of cell behavior that emerge from vast numbers of interacting factors.

The researchers had noticed, for instance, that the length of individual stages in the cell cycle varied from cell to cell, while the length of the overall cycle was much more consistent. Consulting the model, the researchers hypothesized that the overall cell cycle’s lack of variation was the result of a built-in negative feedback mechanism.

Cells that took longer to begin DNA replication had time to amass a large pool of free nucleotides. The actual replication step, which uses these nucleotides to form new DNA strands, then passed relatively quickly. Cells that went through the initial step quicker, on the other hand, had no nucleotide surplus. Replication ended up slowing to the rate of nucleotide production.
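A few toy numbers show why this acts as negative feedback. In the sketch below (illustrative values, not figures from the paper), nucleotides are produced at a steady rate throughout the cycle and replication can only run as fast as the pool allows, so late starters begin with a larger pool and get through the replication stage faster.

```python
# Illustrative feedback between initiation timing and replication speed.
production_rate = 10.0    # nucleotides made per unit time
genome_cost = 120.0       # nucleotides needed to copy the genome
max_speed = 30.0          # cap on how fast replication can consume nucleotides

for wait in (2.0, 6.0, 10.0):                  # early, average and late initiation
    pool, copied, t, dt = production_rate * wait, 0.0, wait, 0.01
    while copied < genome_cost:
        pool += production_rate * dt
        use = min(pool, max_speed * dt, genome_cost - copied)
        pool -= use
        copied += use
        t += dt
    print(f"initiation at t={wait:4.1f} -> cycle ends at t={t:5.2f}")
```

Without the feedback (a fixed replication time regardless of the pool), the spread in initiation times would pass straight through to the total cycle length.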

These kinds of findings remain hypotheses until they’re confirmed by real-world experiments, but they promise to accelerate the process of scientific inquiry.

“If you use a model to guide your experiments, you’re going to discover things faster. We’ve shown that time and time again,” said Covert.

Bio-CAD

Much of the model’s future promise lies in more applied fields.

CAD — computer-aided design — has revolutionized fields from aeronautics to civil engineering by drastically reducing the trial-and-error involved in design. But our incomplete understanding of even the simplest biological systems has meant that CAD hasn’t yet found a place in bioengineering.

Computational models like that of M. genitalium could bring rational design to biology — allowing not only for computer-guided experimental regimes, but also for the wholesale creation of new microorganisms.

Once similar models have been devised for more experimentally tractable organisms, Karr envisions bacteria or yeast specifically designed to mass-produce pharmaceuticals.

Bio-CAD could also lead to enticing medical advances — especially in the field of personalized medicine. But these applications are a long way off, the researchers said.

“This is potentially the new Human Genome Project,” Karr said. “It’s going to take a really large community effort to get close to a human model.”

Stanford’s Department of Bioengineering is jointly operated by the School of Engineering and the School of Medicine.

Anarchists attack science (Nature)

Armed extremists are targeting nuclear and nanotechnology workers.

Leigh Phillips
28 May 2012

Investigations of the shooting of nuclear-engineering head Roberto Adinolfi have confirmed the involvement of an eco-anarchist group. P. RATTINI/AFP/GETTY

A loose coalition of eco-anarchist groups is increasingly launching violent attacks on scientists.

A group calling itself the Olga Cell of the Informal Anarchist Federation International Revolutionary Front has claimed responsibility for the non-fatal shooting of a nuclear-engineering executive on 7 May in Genoa, Italy. The same group sent a letter bomb to a Swiss pro-nuclear lobby group in 2011; attempted to bomb IBM’s nanotechnology laboratory in Switzerland in 2010; and has ties with a group responsible for at least four bomb attacks on nanotechnology facilities in Mexico. Security authorities say that such eco-anarchist groups are forging stronger links.

On 11 May, the cell sent a four-page letter to the Italian newspaper Corriere della Sera claiming responsibility for the shooting of Roberto Adinolfi, the chief executive of Ansaldo Nucleare, the nuclear-engineering subsidiary of aerospace and defence giant Finmeccanica. Believed by authorities to be genuine, the letter is riddled with anti-science rhetoric. The group targeted Adinolfi because he is a “sorcerer of the atom”, it wrote. “Adinolfi knows well that it is only a matter of time before a European Fukushima kills on our continent.”

“Science in centuries past promised us a golden age, but it is pushing us towards self-destruction and total slavery,” the letter continues. “With this action of ours, we return to you a tiny part of the suffering that you, man of science, are pouring into this world.” The group also threatened to carry out further attacks.

The Italian Ministry of the Interior has subsequently beefed up security at thousands of potential political, industrial and scientific targets. The measures include assigning bodyguards to 550 individuals.

The Olga Cell, named after an imprisoned Greek anarchist, is part of the Informal Anarchist Federation, which, in April 2011, claimed responsibility for sending a parcel bomb that exploded at the offices of the Swiss nuclear lobby group, Swissnuclear, in Olten. A letter found in the remains of the bomb demanded the release of three individuals who had been detained for plotting an attack on IBM’s flagship nanotechnology facility in Zurich earlier that year. In a situation report published this month, the Swiss Federal Intelligence Service explicitly linked the federation to the IBM attack.

The Informal Anarchist Federation argues that technology, and indeed civilization, is responsible for the world’s ills, and that scientists are the handmaidens of capitalism. “Finmeccanica means bio- and nanotechnology. Finmeccanica means death and suffering, new frontiers of Italian capitalism,” the letter reads.

Gathering momentum

The cell says that it is uniting with eco-anarchist groups in other countries, including Mexico, Chile, Greece and the United Kingdom. Mexico has already seen similar attacks: in August 2011, a group called Individuals Tending Towards Savagery sent a parcel bomb that wounded two nanotechnology researchers at the Monterrey Institute of Technology. One received burns to his legs and a perforated eardrum and the other had his lung pierced by shrapnel (G. Herrera Corral Nature 476, 373; 2011). The package contained enough explosive to collapse part of the building, according to police, but failed to detonate properly.

Earlier that year, the same group sent two bombs to the nanotechnology facility at the Polytechnic University of the Valley of Mexico. One was intercepted before anyone could be harmed, but the second detonated, injuring a security guard. It is not clear how closely the group is tied to the Informal Anarchist Federation, but in online forums the two bodies offer “direct support” for each other’s activities and talk of a “blossoming” of a more organized eco-anarchist movement.

In the wake of the Mexican bombings, the Monterrey Institute installed metal detectors, began to use police sniffer dogs and started random inspections of vehicles and packages. After a letter bomb addressed to a nanotechnology researcher at the Polytechnic University of Pachuca in Hidalgo exploded in December last year, the institute installed a perimeter fence and scanners, and campuses across the state heightened security measures.

Italian police investigating the shooting say that they are concerned about the rise in violent action by anarchist groups amid Europe’s economic crisis. On 23 May, for example, members of the Informal Anarchist Federation attacked railway signals in Bristol, UK, causing severe transport delays. An online message from the group said that the targets had been chosen to disrupt employees of the Ministry of Defence and defence-technology businesses in the area, including Raytheon and QinetiQ.

The Swiss report also noted signs of “an increasing degree of international networking between perpetrators”. The level of risk to scientists depends on their field of work, says Simon Johner, a spokesman for the Swiss Federal Intelligence Service. “We are not able to tell them what to do. We can only make them aware of the dangers. It’s up to institutions to take preventative actions.” The agency is working with police forces, businesses and research communities to assess and tackle the threat.

“These people do not represent mainstream opinion. But I am still pretty frightened by this violence,” says Michael Hagmann, a biochemist and head of corporate communications for the Swiss Federal Laboratories for Materials Science and Technology near Zurich, a public-sector partner of the IBM facility that also does nanotechnology research.

“Just a few weeks after the attempted bombing, we were due to have a large conference on nanotechnology and we were really quite nervous” about going ahead with it, Hagmann says. “But we concluded that the public discussion was more important and didn’t want to scare people by having 20 police guarding us. It would have sent the wrong message.”

Nature 485, 561 (31 May 2012) doi:10.1038/485561a

*   *   *

Published online 22 August 2011 | Nature 476, 373 (2011) | doi:10.1038/476373a

Column: World View

Stand up against the anti-technology terrorists

Home-made bombs are being sent to physicists in Mexico. Colleagues around the world should ensure their own security, urges Gerardo Herrera Corral.

Gerardo Herrera Corral

My elder brother, Armando Herrera Corral, was this month sent a tube of dynamite by terrorists who oppose his scientific research. The home-made bomb, which was in a shoe-box-sized package labelled as an award for his personal attention, exploded when he pulled at the adhesive tape wrapped around it. My brother, director of the technology park at the Monterrey Institute of Technology in Mexico, was standing at the time, and suffered burns to his legs and a perforated eardrum. More severely injured by the blast was his friend and colleague Alejandro Aceves López, whom my brother had gone to see in his office to share a cup of coffee and open the award. Aceves López was sitting down when my brother opened the package; he took the brunt of the explosion in his chest, and shrapnel pierced one of his lungs.

Both scientists are now recovering from their injuries, but they were extremely fortunate to survive. The bomb failed to go off properly, and only a fraction of the 20-centimetre-long cylinder of dynamite ignited. The police estimate that the package contained enough explosive to take down part of the building, had it worked as intended.

The next day, I, too, was sent a suspicious package. I have been advised by the police not to offer details of why the package was judged of concern, but it arrived by an unusual procedure, and on a Sunday. It tested positive for explosives, and was taken away by the bomb squad, which declared a false alarm after finding that the parcel contained only books. My first reaction was to leave the country. Now, I am confused as to how I should respond.

As an academic scientist, why was my brother singled out in this way? He does not work in a field that is usually considered high-risk for terrorist activity, such as medical research on animals. He works on computer science, and Aceves López is an expert in robotics. I am a high-energy physicist and coordinate the Mexican contribution to research using the Large Hadron Collider at CERN, Europe’s particle-physics laboratory; I have worked in the field for 15 years.

An extremist anarchist group known as Individuals Tending to Savagery (ITS) has claimed responsibility for the attack on my brother. This is confirmed by a partially burned note found by the authorities at the bomb site, signed by the ITS and with a message along the lines of: “If this does not get to the newspapers we will produce more explosions. Wounding or killing teachers and students does not matter to us.”

In statements posted on the Internet, the ITS expresses particular hostility towards nanotechnology and computer scientists. It claims that nanotechnology will lead to the downfall of mankind, and predicts that the world will become dominated by self-aware artificial-intelligence technology. Scientists who work to advance such technology, it says, are seeking to advance control over people by ‘the system’. The group praises Theodore Kaczynski, the Unabomber, whose anti-technology crusade in the United States in 1978–95 killed three people and injured many others.

The group’s rhetoric is absurd, but I urge colleagues around the world to take the threat that it poses to researchers seriously. Information gathered by Mexican federal authorities and Interpol link it to actions in countries including Spain, France and Chile. In April this year, the ITS sent a bomb — similar to the one posted to my brother — to the head of the Nanotechnology Engineering Division at the Polytechnic University of Mexico Valley in Tultitlan, although that device did not explode. In May, the university received a second parcel bomb, with a message reading: “This is not a joke: last month we targeted Oscar Camacho, today the institution, tomorrow who knows? Open fire on nanotechnology and those who support it!”

“I believe that terror should not succeed in establishing fear and imposing conduct.”

The scientific community must be made aware of such organizations, and of their capacity for destruction. Nanotechnology-research institutes and departments, companies and professional associations must beef up their security procedures, particularly on how they receive and accept parcels and letters.

I would like to stand up and speak in this way because I believe that terror should not succeed in establishing fear and imposing conduct that takes us far from the freedom we enjoy. I would like the police to take these events seriously; they are becoming a real threat to society. I would also like to express my solidarity with the Monterrey Institute of Technology — the institution that gave me both financial support to pursue my undergraduate studies and high-level academic training.

To oppose technology is not an unacceptable way to think. We may well debate the desirability of further technical development in our society. Yet radical groups such as the ITS overlook a crucial detail: it is not technology that is the problem, but how we use it. After Alfred Nobel invented dynamite he became a rich man, because it found use in mining, quarrying, construction and demolition. But people can also decide to put dynamite into a parcel and address it to somebody with the intention of killing them.

Gerardo Herrera Corral is a physicist at the Research and Advanced Studies Centre of the National Polytechnic Institute of Mexico in Mexico City.

Scientists Read Monkeys’ Inner Thoughts: Brain Activity Decoded While Monkeys Avoid Obstacle to Touch Target (Science Daily)

ScienceDaily (July 19, 2012) — By decoding brain activity, scientists were able to “see” that two monkeys were planning to approach the same reaching task differently — even before they moved a muscle.

The obstacle-avoidance task is a variation on the center-out reaching task in which an obstacle sometimes prevents the monkey from moving directly to the target. The monkey must first place a cursor (yellow) on the central target (purple). This was the starting position. After the first hold, a second target appeared (green). After the second hold an obstacle appeared (red box). After the third hold, the center target disappeared, indicating a “go” for the monkey, which then moved the cursor out and around the obstacle to the target. (Credit: Moran/Pearce)

Anyone who has looked at the jagged recording of the electrical activity of a single neuron in the brain must have wondered how any useful information could be extracted from such a frazzled signal.

But over the past 30 years, researchers have discovered that clear information can be obtained by decoding the activity of large populations of neurons.

Now, scientists at Washington University in St. Louis, who were decoding brain activity while monkeys reached around an obstacle to touch a target, have come up with two remarkable results.

Their first result was one they had designed their experiment to achieve: they demonstrated that multiple parameters can be embedded in the firing rate of a single neuron and that certain types of parameters are encoded only if they are needed to solve the task at hand.

Their second result, however, was a complete surprise. They discovered that the population vectors could reveal different planning strategies, allowing the scientists, in effect, to read the monkeys’ minds.

By chance, the two monkeys chosen for the study had completely different cognitive styles. One, the scientists said, was a hyperactive type, who kept jumping the gun, and the other was a smooth operator, who waited for the entire setup to be revealed before planning his next move. The difference is clearly visible in their decoded brain activity.

The study was published in the July 19th advance online edition of the journal Science.

All in the task

The standard task for studying voluntary motor control is the “center-out task,” in which a monkey or other subject must move its hand from a central location to targets placed on a circle surrounding the starting position.

To plan the movement, says Daniel Moran, PhD, associate professor of biomedical engineering in the School of Engineering & Applied Science and of neurobiology in the School of Medicine at Washington University in St. Louis, the monkey needs three pieces of information: current hand and target position and the velocity vector that the hand will follow.

In other words, the monkey needs to know where his hand is, what direction it is headed and where he eventually wants it to go.

A variation of the center-out task with multiple starting positions allows the neural coding for position to be separated from the neural coding for velocity.

By themselves, however, the straight-path, unimpeded reaches in this task don’t allow the neural coding for velocity to be distinguished from the neural coding for target position, because these two parameters are always correlated. The initial velocity of the hand and the target are always in the same direction.

To solve this problem and isolate target position from movement direction, doctoral student Thomas Pearce designed a novel obstacle-avoidance task to be done in addition to the center-out task.

Crucially, in one-third of the obstacle-avoidance trials, either no obstacle appeared or the obstacle didn’t block the monkey’s path. In either case, the monkey could move directly to the target once he got the “go” cue.

The population vector corresponding to target position showed up during the third hold of the novel task, but only if there was an obstacle. If an obstacle appeared and the monkey had to move its hand in a curved trajectory to reach the target, the population vector lengthened and pointed at the target. If no obstacle appeared and the monkey could move directly to the target, the population vector was insignificant.

In other words, the monkeys were encoding the position of the target only when it did not lie along a direct path from the starting position and they had to keep its position “in mind” as they initially moved in the “wrong” direction.
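The “population vector” behind these readouts is, in its classic form, a weighted vote: each neuron’s preferred direction is scaled by how far its firing rate sits above baseline, and the scaled directions are summed. The sketch below shows that textbook scheme with cosine-tuned model neurons; it is illustrative only, and the study’s actual decoding of position and velocity parameters is more involved.

```python
# Classic population-vector decoding with simulated cosine-tuned neurons.
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 200
preferred = rng.uniform(0, 2 * np.pi, n_neurons)    # preferred direction per neuron
baseline = rng.uniform(5, 15, n_neurons)             # baseline firing rate (Hz)
gain = rng.uniform(2, 8, n_neurons)                   # depth of directional tuning

def firing_rates(movement_angle):
    # cosine tuning: a neuron fires most when movement matches its preferred direction
    return baseline + gain * np.cos(movement_angle - preferred)

def population_vector(rates):
    w = rates - baseline                              # modulation relative to baseline
    x, y = np.sum(w * np.cos(preferred)), np.sum(w * np.sin(preferred))
    return np.arctan2(y, x), np.hypot(x, y)           # decoded direction and vector length

true_angle = np.deg2rad(135)
angle, length = population_vector(firing_rates(true_angle))
print(f"decoded direction {np.rad2deg(angle):.1f} deg, vector length {length:.1f}")
```

A short vector, as in the no-obstacle trials, means the weighted votes largely cancel out; a long vector pointing at the target means the population is carrying that direction strongly.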

“It’s all,” Moran says, “in the design of the task.”

And then some magic happens

Pearce’s initial approach to analyzing the data from the experiment was the standard one of combining the data from the two monkeys to get a cleaner picture.

“It wasn’t working,” Pearce says, “and I was frustrated because I couldn’t figure out why the data looked so inconsistent. So I separated the data by monkey, and then I could see, wow, they’re very different. They’re approaching this task differently and that’s kind of cool.”

The difference between the monkeys’ styles showed up during the second hold. At this point in the task, the target was visible, but the obstacle had not yet appeared.

The hyperactive monkey, called monkey H, couldn’t wait. His population vector during that hold showed that he was poised for a direct reach to the target. When the obstacle was then revealed, the population vector shortened and rotated to the direction he would need to move to avoid the obstacle.

The smooth operator, monkey G, in the meantime, idled through the second hold, waiting patiently for the obstacle to appear. Only when it was revealed did he begin to plan the direction he would move to avoid the obstacle.

Because he didn’t have to correct course, monkey G’s strategy was faster, so what did monkey H gain by jumping the gun? In the minority of trials where no obstacle appeared, monkey H approached the target more accurately than monkey G. Maybe monkey H is just cognitively adapted to a Whac-A-Mole world. And monkey G, when caught without a plan, was at a disadvantage.

Working with the monkeys, the scientists had been aware that they had very different personalities, but they had no idea this difference would show up in their neural recordings.

“That’s what makes this really interesting,” Moran says.