Tag archive: Uncertainty

The Political Effects of Existential Fear (Science Daily)

ScienceDaily (Oct. 20, 2011) — Why did the approval ratings of President George W. Bush — who was perceived as indecisive before September 11, 2001 — soar to over 90 percent after the terrorist attacks? Because Americans were acutely aware of their own deaths. That is one lesson from the psychological literature on “mortality salience” reviewed in a new article called “The Politics of Mortal Terror.”

The paper, by psychologists Florette Cohen of the City University of New York’s College of Staten Island and Sheldon Solomon of Skidmore College, appears in October’s Current Directions in Psychological Science, a journal published by the Association for Psychological Science.

The fear people felt after 9/11 was real, but it also made them ripe for psychological manipulation, experts say. “We all know that fear tactics have been used by politicians for years to sway votes,” says Cohen. Now psychological research offers insight into the chillingly named “terror management.”

The authors cite studies showing that awareness of mortality tends to make people feel more positive toward heroic, charismatic figures and more punitive toward wrongdoers. In one study, Cohen and her colleagues asked participants to think of death and then gave them statements from three fictional political figures. One was charismatic: he appealed to his listeners’ specialness and to that of the group they belonged to. One was a technocrat, offering practical solutions to problems. The third stressed the value of participation in democracy. After thinking about death, support for the charismatic leader shot up eightfold.

Even subliminal suggestions of mortality have similar effects. Subjects who saw the numbers 911 or the letters WTC had higher opinions of a Bush statement about the necessity of invading Iraq. This was true of both liberals and conservatives.

Awareness of danger and death can bias even peaceful people toward war or aggression. Iranian students in a control condition preferred the statement of a person preaching understanding and the value of human life over a jihadist call to suicide bombing. But primed to think about death, they grew more positive toward the bomber. Some even said that they might consider becoming a martyr.

As time goes by and the memory of danger and death grows fainter, however, “mortality salience” tends to polarize people politically, leading them to cling to their own beliefs and demonize others who hold opposing beliefs — seeing in them the cause of their own endangerment.

The psychological research should make voters wary of emotional political appeals and even of their own emotions in response, Cohen says. “We encourage all citizens to vote with their heads rather than their hearts. Become an educated voter. Look at the candidate’s positions and platforms. Look at who you are voting for and what they stand for.”

Radiation leaks at nuclear plant in Rio state (Correio Braziliense)

JC e-mail 4367, October 19, 2011.

Three leaks occurred inside the Fábrica de Combustível Nuclear, a federally owned fuel plant in Resende (RJ). Two involved chemical substances; the third, highly radioactive enriched uranium. The company admits to “failures” but rules out harm to workers or the environment.


Work-safety engineers and technicians detected three leaks inside the Fábrica de Combustível Nuclear (FCN) in Resende (RJ), two of them involving chemical substances and one of enriched uranium (UO2), a highly radioactive material. The engineers and technicians reported the leaks to their superiors in internal e-mails, copies of which were obtained by Correio.

The uranium powder leaked from a piece of equipment called a homogenizer and fell onto the floor of the room. The episode was recorded on July 14, 2009. In January 2010, the plant’s warning alarm was triggered by a leak of the liquefied gas used in the furnace that burns off excess gases from uranium-pellet production. And in July of this year, an engineer suspected an ammonia leak and reported it to managers.

The three incidents posed no risk to workers, the environment or the plant’s operation, according to both the plant’s management — which answers to the federal government — and the president of the Comissão Nacional de Energia Nuclear (Cnen), the agency responsible for overseeing radioactive activities in Brazil. “The uranium stayed in a confined, hermetically sealed room; it did not reach the environment,” says Samuel Fayad Filho, the FCN’s director of nuclear fuel production. He acknowledges equipment “failures” and says that “all procedures were followed” in response to the problems detected. “There is no leak of radioactive material in Resende,” he insists.

Correio consulted specialists to find out what the information circulating internally at the FCN means. For nuclear engineer Aquilino Senra, “it is evident that there was a failure.” “UO2 powder was not supposed to come out of that press,” says the engineer, who is deputy director of the Instituto Alberto Luiz Coimbra de Pós-Graduação e Pesquisa de Engenharia (Coppe) at the Universidade Federal do Rio de Janeiro. “The leak of UO2 from the press and the presence of the substance on the floor are a clear abnormality.”

As for the liquefied-gas leak, Aquilino says that “leaked gas is never a good thing.” “That is what detectors are for, but the question is why the gas leaked.” Correio also heard from a technician linked to the Presidency of the Republic, speaking on condition of anonymity: “It does not look like a serious problem to me, because the Presidency was not notified,” he says.

Functions – The FCN is a complex of plants responsible for assembling fuel elements, for producing uranium powder and pellets, and for a small share of uranium enrichment. The ore is mined in Caetité (BA). Enrichment is carried out almost entirely abroad, but part of it already takes place at the FCN. Besides that small slice of enrichment, the plant produces the pellets used to generate nuclear power at the Angra 1 and Angra 2 plants in Angra dos Reis (RJ).

Today the FCN enriches 10% of the uranium needed for Angra 1 and 5% of that needed for Angra 2, according to Samuel Fayad. The FCN is part of the state-owned Indústrias Nucleares do Brasil (INB), which reports to the Ministério de Ciência, Tecnologia e Inovação (MCT).

The uranium-powder leak was reported by a work-safety technician to senior coordinators. Cnen confirmed the alert to Correio. “The incident is irrelevant in safety terms. The powder in question was identified in a controlled area, inside an environment with containment for radioactive material, and did not affect workers at the unit or the environment,” the agency maintains, through its press office.

Crisis – Brazil’s nuclear power sector is living through a conflict and a crisis within the federal government. The president of the Comissão Nacional de Energia Nuclear (Cnen), Angelo Padilha, took office on July 7, after the minister of science and technology, Aloizio Mercadante, dismissed Odair Dias Gonçalves. Odair lost his job after revelations that the Angra 2 plant had operated for 10 years without a definitive license and that Brazil had begun importing uranium because licensing was stalled. So far, the planned Agência Reguladora de Energia Nuclear remains only a proposal, owing to conflicts within the sector. The agency would take over from Cnen — the main shareholder of Indústrias Nucleares do Brasil — its regulatory and oversight functions.

Jaguars help preserve the Caatinga (Valor Econômico)

JC e-mail 4366, October 18, 2011.

Mapping how many jaguars live in the Caatinga, how they live and where they roam will make it possible to gauge the effect of the diversion of the São Francisco River on the region.

It is right there, where the jaguar drinks, that the snare is set. In September, at the height of the Caatinga dry season, ten traps were laid in the region of Sento Sé, a municipality in northern Bahia on the shores of the Sobradinho reservoir. Five researchers, 30 days, rationed water, no electricity, computers or telephones, and an investment of R$ 22,000. By the end of the expedition, not a single jaguar had received a GPS collar. But the scientists’ frustration quickly gives way to planning the next campaign. At this pace they are pursuing the creation of a sort of “jaguar sustainability index,” tied to one of the largest, and most controversial, projects of the Programa de Aceleração do Crescimento (PAC): the diversion of the São Francisco River.

So much interest in this outsized cat — the largest feline in the Americas, the third largest in the world after the tiger and the lion, and owner of the most powerful bite among its relatives — goes beyond biology. Jaguars live only where there is water, and they are important predators that regulate ecosystems: they keep populations of capybaras, deer and rodents from exploding. At the top of the food chain, the jaguar is an umbrella species. “By protecting the jaguar, you protect all the others,” says veterinarian Ronaldo Gonçalves Morato, 44, one of the few jaguar specialists in this country of jaguars.

At the heart of the study is a proposal to create a wildlife corridor in the middle of the Semiárido. The aim is to build a protected area that takes the region’s economic potential into account. One element is the proposed Parque Nacional Boqueirão da Onça, under study for ten years. It would cover 800,000 hectares and would be the largest conservation unit outside the Amazon. But while the government decides whether or not to create the park, land prices and land-grabbing keep rising. The Ministério das Minas e Energia is also interested, seeing good wind-power potential in the region.

The government is seeking consensus to guarantee some protection for the third-largest and hardest-hit of Brazil’s biomes. Less than 2% of the Caatinga is protected. More than 45% of its vegetation has been cleared, and the region is undergoing desertification. “Our view is that the Caatinga is poor, end of story. But there are fantastic landscapes and poorly exploited natural resources,” Morato says. “Developing the Caatinga with a good tourism program would be very interesting.”

Biological diversity is rich despite the scarcity of water. There are hundreds of species of birds, reptiles and amphibians. The landscapes are beautiful and varied; there are cave paintings and fruits that yield exotic sweets. In the dry season the vegetation sheds its leaves to spend less energy. “People call this scenery the white forest. All it takes is rain and, three days later, everything is green. It’s marvelous,” Morato enthuses.

The park, which remains on paper, would take up 45% of the municipality of Sento Sé, a region sparsely populated by people and perhaps well populated by jaguars. The jaguar, which ranged across the Caatinga in the days of Lampião, is now restricted to 25% of the biome. Researchers believe there are five large jaguar populations in the Semiárido, with one or two animals per 100 km² — in Cáceres, in the Pantanal, the density is much higher, averaging seven jaguars per 100 km². Estimates put 300 to 400 animals in the Caatinga.
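As a back-of-the-envelope check (a sketch built only from the figures reported above, not a calculation from the article), applying the Semiárido density to the proposed 800,000-hectare park shows how few animals even a very large reserve might hold:

```python
# Back-of-the-envelope: how many jaguars might the proposed park support?
park_hectares = 800_000            # proposed Parque Nacional Boqueirão da Onça
park_km2 = park_hectares / 100     # 1 km² = 100 hectares -> 8,000 km²

density_low, density_high = 1, 2   # jaguars per 100 km² (Semiárido estimate)
low = park_km2 / 100 * density_low
high = park_km2 / 100 * density_high
print(f"{low:.0f}-{high:.0f} jaguars")   # 80-160 jaguars

# For comparison, at the Pantanal density reported near Cáceres:
pantanal = park_km2 / 100 * 7
print(f"{pantanal:.0f} jaguars at Pantanal density")
```

Even at the optimistic end, the park would hold well under half of the 300 to 400 animals estimated for the whole Caatinga, which is why the researchers talk of a corridor rather than a single reserve.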

Mapping, with some precision, how many jaguars live in the northeastern sertão, how they live and where they roam gives researchers an environmental indicator for measuring, later, how much the São Francisco diversion affected the region. If the jaguars are still there after the project, that is a good sign. The project is part of the Programa de Revitalização da Bacia do São Francisco, coordinated by the Ministério do Meio Ambiente in partnership with the Ministério da Integração Regional. It has also raised funds through the BM&F Bovespa. Big cats, especially jaguars, are glamorous animals.

The majesty of this emblem of Brazilian fauna, printed on the R$ 50 bill, is inversely proportional to what is known about the animal. “We don’t even know how long a jaguar lives,” says Morato. “Every question we answer raises a new one.” He began his career as an intern at the Sorocaba zoo, in São Paulo — long enough to realize that what he really wanted was to study animals in the wild.

Morato has worked with jaguars for 20 years and for the past six has coordinated the Centro Nacional de Pesquisa e Conservação de Mamíferos Carnívoros (Cenap), an institute that studies a list of 26 species, from maned wolves to giant otters. Cats alone account for eight species, including jaguars, pumas, ocelots and small wild cats. A giant panel of a jaguar is visible from the road on the glass façade of Cenap’s headquarters in Atibaia. The center was created 17 years ago and is a branch of the Instituto Chico Mendes de Conservação da Biodiversidade (ICMBio).

Investment in the project Ecologia e Conservação da Onça-Pintada no Médio São Francisco — or simply Onças da Caatinga — is R$ 800,000 over four years. The first capture campaign to fit collars was in 2010, and it too was unsuccessful. Catching an animal like this is not easy. Steel snares are set near the areas the cats tend to frequent. “We identify the points where the jaguars pass and leave the snares there. But sometimes they walk right past the snare, and all we see the next day are the tracks. It’s hard.”

Hard is an understatement. For a month of camping in September, they carried in 300 liters of water per person. There are no roads, cars cannot get in, and the rocks shred tires. Equipment and water are carried in on foot. Baths are taken with a cup.

When luck strikes and a jaguar is caught in the snare, the researchers fire an anesthetic dart and begin measuring the animal: weight, body length, paw size, dental examination. Then comes the telemetry collar, which weighs 800 grams and carries a GPS unit in a small box at the front. Each animal has its own frequency. The researchers program how often they will receive fixes on the jaguar’s whereabouts — every two hours, for example. Once a week, the data are e-mailed to the scientist by the company that operates the satellite. The collar can be programmed to drop off the animal’s neck after a set period and be retrieved. “It stays, say, 400 days on the jaguar, and then it falls off,” Morato explains.

“Technology has helped our work enormously,” he says. Progress has its price: none of this is cheap. Cenap uses collars from the Swedish firm Televilt, at US$ 3,800 each. The annual satellite contract runs another US$ 1,200 per collar. There are currently 40 such devices on jaguars in Brazil. By gathering detailed information on an animal’s behavior — how and where it moves, which habitats it seeks out, how it feeds — the scientists map out the size of the jaguar’s “home range.” “I gradually get a picture of the habitat I can propose for preservation,” Morato explains.
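Putting the article’s price figures together (a rough sketch; the article itself does not total them) shows why telemetry dominates a budget like Onças da Caatinga’s R$ 800,000:

```python
# Rough cost sketch for the collar fleet described above, using the
# article's figures (US$).
collar_unit_cost = 3_800       # per Televilt collar
satellite_per_year = 1_200     # annual satellite contract, per collar
collars_deployed = 40          # collars currently on jaguars in Brazil

hardware_total = collars_deployed * collar_unit_cost
airtime_per_year = collars_deployed * satellite_per_year
print(f"hardware: US$ {hardware_total:,}")        # US$ 152,000
print(f"satellite/year: US$ {airtime_per_year:,}")  # US$ 48,000
```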

The “Onças na Caatinga” project raised funds through BVS&A, a Bovespa portal that lists social and environmental projects. “Anyone who is interested can go there, choose what they find worthwhile, and donate,” says Sonia Favaretto, the exchange’s director of sustainability. The initiative brought in R$ 150,000 over two years. Cenap worked in partnership with the NGO Pró-Carnívoros, which helps make the research projects viable.

Regulating ecosystems is not the jaguar’s only role. “With the loss of species, we lose habitats and become more exposed to catastrophes,” Morato points out. Fewer predators means more prey and more pressure on vegetation. “In the long run, that reduces carbon stocks,” he notes. Morato argues that it is time to think about the economic value of jaguars and their appeal as a tourist draw.

Economics has met the enemy, and it is economics (Globe and Mail)

Adam Smith is considered the founding father of modern economics.

Ira Basen

From Saturday’s Globe and Mail
Published Saturday, Oct. 15, 2011 6:00AM EDT
Last updated Tuesday, Oct. 18, 2011 8:41AM EDT

After Thomas Sargent learned on Monday morning that he and colleague Christopher Sims had been awarded the Nobel Prize in Economics for 2011, the 68-year-old New York University professor struck an aw-shucks tone with an interviewer from the official Nobel website: “We’re just bookish types that look at numbers and try to figure out what’s going on.”

But no one who’d followed Prof. Sargent’s long, distinguished career would have been fooled by his attempt at modesty. He’d won for his part in developing one of economists’ main models of cause and effect: How can we expect people to respond to changes in prices, for example, or interest rates? According to the laureates’ theories, they’ll do whatever’s most beneficial to them, and they’ll do it every time. They don’t need governments to instruct them; they figure it out for themselves. Economists call this the “rational expectations” model. And it’s not just an abstraction: Bankers and policy-makers apply these formulae in the real world, so bad models lead to bad policy.

Which is perhaps why, by the end of that interview on Monday, Prof. Sargent was adopting a more realistic tone: “We experiment with our models,” he explained, “before we wreck the world.”

Rational-expectations theory and its corollary, the efficient-market hypothesis, have been central to mainstream economics for more than 40 years. And while they may not have “wrecked the world,” some critics argue these models have blinded economists to reality: Certain the universe was unfolding as it should, they failed both to anticipate the financial crisis of 2008 and to chart an effective path to recovery.

The economic crisis has produced a crisis in the study of economics – a growing realization that if the field is going to offer meaningful solutions, greater attention must be paid to what is happening in university lecture halls and seminar rooms.

While the protesters occupying Wall Street are not carrying signs denouncing rational-expectations and efficient-market modelling, perhaps they should be.

They wouldn’t be the first young dissenters to call economics to account. In June of 2000, a small group of elite graduate students at some of France’s most prestigious universities declared war on the economic establishment. This was an unlikely group of student radicals, whose degrees could be expected to lead them to lucrative careers in finance, business or government if they didn’t rock the boat. Instead, they protested – not about tuition or workloads, but that too much of what they studied bore no relation to what was happening outside the classroom walls.

They launched an online petition demanding greater realism in economics teaching, less reliance on mathematics “as an end in itself” and more space for approaches beyond the dominant neoclassical model, including input from other disciplines, such as psychology, history and sociology. Their conclusion was that economics had become an “autistic science,” lost in “imaginary worlds.” They called their movement Autisme-economie.

The students’ timing is notable: It was the spring of 2000, when the world was still basking in the glow of “the Great Moderation,” when for most of a decade Western economies had been enjoying a prolonged period of moderate but fairly steady growth.

Some economists were daring to think the unthinkable – that their understanding of how advanced capitalist economies worked had become so sophisticated that they might finally have succeeded in smoothing out the destructive gyrations of capitalism’s boom-and-bust cycle. (“The central problem of depression prevention has been solved,” declared another Nobel laureate, Robert Lucas of the University of Chicago, in 2003 – five years before the greatest economic collapse in more than half a century.)

The students’ petition sparked a lively debate. The French minister of education established a committee on economic education. Economics students across Europe and North America began meeting and circulating petitions of their own, even as defenders of the status quo denounced the movement as a Trotskyite conspiracy. By September, the first issue of the Post-Autistic Economic Newsletter was published in Britain.

As The Independent summarized the students’ message: “If there is a daily prayer for the global economy, it should be, ‘Deliver us from abstraction.’”

It seems that entreaty went unheard through most of the discipline before the economic crisis, not to mention in the offices of hedge funds and the Stockholm Nobel selection committee. But is it ringing louder now? And how did economics become so abstract in the first place?

The great classical economists of the late 18th and early 19th centuries had no problem connecting to the real world – the Industrial Revolution had unleashed profound social and economic changes, and they were trying to make sense of what they were seeing. Yet Adam Smith, who is considered the founding father of modern economics, would have had trouble understanding the meaning of the word “economist.”

What is today known as economics arose out of two larger intellectual traditions that have since been largely abandoned. One is political economy, which is based on the simple idea that economic outcomes are often determined largely by political factors (as well as vice versa). But when political-economy courses first started appearing in Canadian universities in the 1870s, it was still viewed as a small offshoot of a far more important topic: moral philosophy.

In The Wealth of Nations (1776), Adam Smith famously argued that the pursuit of enlightened self-interest by individuals and companies could benefit society as a whole. His notion of the market’s “invisible hand” laid the groundwork for much of modern neoclassical and neo-liberal, laissez-faire economics. But unlike today’s free marketers, Smith didn’t believe that the morality of the market was appropriate for society at large. Honesty, discipline, thrift and co-operation, not consumption and unbridled self-interest, were the keys to happiness and social cohesion. Smith’s vision was a capitalist economy in a society governed by non-capitalist morality.

But by the end of the 19th century, the new field of economics no longer concerned itself with moral philosophy, and less and less with political economy. What was coming to dominate was a conviction that markets could be trusted to produce the most efficient allocation of scarce resources, that individuals would always seek to maximize their utility in an economically rational way, and that all of this would ultimately lead to some kind of overall equilibrium of prices, wages, supply and demand.

Political economy was less vital because government intervention disrupted the path to equilibrium and should therefore be avoided except in exceptional circumstances. And as for morality, economics would concern itself with the behaviour of rational, self-interested, utility-maximizing Homo economicus. What he did outside the confines of the marketplace would be someone else’s field of study.

As those notions took hold, a new idea emerged that would have surprised and probably horrified Adam Smith – that economics, divorced from the study of morality and politics, could be considered a science. By the beginning of the 20th century, economists were looking for theorems and models that could help to explain the universe. One historian described them as suffering from “physics envy.” Although they were dealing with the behaviour of humans, not atoms and particles, they came to believe they could accurately predict the trajectory of human decision-making in the marketplace.

In their desire to have their field be recognized as a science, economists increasingly decided to speak the language of science. From Smith’s innovations through John Maynard Keynes’s work in the 1930s, economics was argued in words. Now, it would go by the numbers.

The turning point came in 1947, when Paul Samuelson’s classic book Foundations of Economic Analysis for the first time presented economics as a branch of applied mathematics. Without “the invigorating kiss of mathematical method,” Samuelson maintained, economists had been practising “mental gymnastics of a particularly depraved type,” like “highly trained athletes who never run a race.” After Samuelson, no economist could ever afford to make that mistake.

And that may have been the greatest mistake of all: In a post-crisis, 2009 essay in The New York Times Magazine, Princeton economist and Nobel laureate Paul Krugman wrote, “The central cause of the profession’s failure was the desire for an all-encompassing, intellectually elegant approach that gave economists a chance to show off their mathematical prowess.”

Of course, nothing says science like a Nobel Prize. Prizes in chemistry, physics and medicine were first awarded in 1901, long before anyone would have thought that economics could or should be included. But by the late 1960s, the central bank of Sweden was determined to change that, and when the Nobel family objected, the bank agreed to put up the money itself, making it the only one of the prizes to be funded by taxpayers.

Officially, then, it is known as the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel – but that title is rarely used. On Monday morning, Prof. Sargent and Princeton University Prof. Sims were widely reported to have won the Nobel Prize in Economics.

The confusion is understandable, and deliberate, according to Philip Mirowski, an economic historian at the University of Notre Dame. “It’s part of the PR trick,” Prof. Mirowski argues. Awarding the economics prize immediately after the prizes for physics, chemistry and medicine helps to place economics on the same level as those other natural sciences.

The prize also has helped to transform one particular ideology into economic orthodoxy. Prof. Mirowski, who is co-writing a book on the history of the economics prize, notes that throughout the 1970s and 1980s, economists whose work supported neoclassical, pro-market, laissez-faire ideas won a disproportionate number of those honours, as well as support from the increasing numbers of well-funded think tanks and foundations that cleaved to the same lines. People who rejected those ideas, or were skeptical of the natural sciences model, were quickly marginalized, and their road to academic advancement often blocked.

The result was a homogenization of economic thought that Prof. Mirowski believes “has been pretty deleterious for economics on the whole.”

The road to hell is paved with good intentions, rational expectations and efficient markets

Many critics of neo-classical economics argue that it has a powerful pro-market bias that’s provided an intellectual justification for politicians ideologically disposed to reduce government involvement in the economy.

The rational-expectations model, for example, assumes that consumers and producers all inform themselves with all available data, understand how the world around them operates and will therefore respond to the same stimulus in essentially the same way. That allows economists to mathematically forecast how these “representative” consumers and producers would behave.

During a recession, say, a well-meaning government might want to enhance benefits for the unemployed. Prof. Sargent, for one, would caution against that, because a “rational” unemployed worker might then calculate that it’s better to reject a lower-paying job. He’s blamed much of the chronically high unemployment in some European countries on the presence of an army of voluntarily unemployed workers, and spoken out against the Obama administration’s recent efforts to extend unemployment benefits.
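The decision rule that argument attributes to the “rational” unemployed worker reduces to a one-line comparison. This is a deliberately stylized sketch, not Prof. Sargent’s actual model, which also weighs search time, benefit duration and expected future offers:

```python
def accepts_job(weekly_wage: float, weekly_benefit: float) -> bool:
    """Stylized 'rational' worker: take the job only if it beats the benefit."""
    return weekly_wage > weekly_benefit

# With generous benefits, a low-paying offer is rationally rejected;
# raise the wage (or cut the benefit) and the decision flips.
print(accepts_job(weekly_wage=350, weekly_benefit=400))  # False
print(accepts_job(weekly_wage=500, weekly_benefit=400))  # True
```

The policy implication in the text follows directly: on this logic, enhancing benefits raises the wage a worker will hold out for, and so can lengthen unemployment spells.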

Indeed, under the rational-expectations model, most market interventions by governments and central banks wind up looking counterproductive.

Meanwhile, the efficient-markets hypothesis, developed by University of Chicago economist Eugene Fama in the 1970s, has dominated thinking about financial markets. It posits that the prices of stocks and other financial assets are always “efficient” because they accurately reflect all the available information about economic fundamentals.

By this reasoning, there can be no speculative price bubbles or busts in the stock or housing markets, and speculators with evil intentions cannot successfully manipulate markets. Conveniently, since markets are self-stabilizing, there’s no need for government regulation of them.
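The hypothesis has a simple computational counterpart (a textbook illustration, not anything from the article): if today’s price already reflects everything known, only genuinely new, unpredictable information can move it, so a simulated price path is a random walk in which yesterday’s move tells you nothing about today’s.

```python
import random

# Toy illustration of the efficient-markets idea: each day's price change
# is driven only by unpredictable "news", so the path is a random walk.
random.seed(42)

price = 100.0
path = [price]
for _ in range(250):              # roughly one trading year of daily shocks
    news = random.gauss(0, 1)     # information arriving today, mean zero
    price += news                 # price adjusts immediately and fully
    path.append(price)

# Under the hypothesis, the best forecast of tomorrow's price is simply
# today's price; no pattern in the history helps.
print(round(path[-1], 2))
```

Critics’ point, in these terms, is that real markets sometimes move far from fundamentals for long stretches, which a pure random-walk model rules out by construction.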

Critics point out that both these theories tend to ignore what John Maynard Keynes called the “animal spirits” – playing down human irrationality, inefficiency, venality and ignorance. Those are qualities that are hard to plug into a mathematical equation that purports to model human behaviour.

These models also have failed to take into account the profound changes wrought by globalization, and the growing importance of banks, hedge funds and other financial institutions. Yet they have successfully provided a “scientific” cover for an anti-regulatory political agenda that is popular on Wall Street and in some Washington political circles.

Inside jobs: Pay no attention to that banker behind the curtain

The Great Depression of the 1930s led many economists of the day to question some of their discipline’s most fundamental assumptions and produced a decades-long heyday for Keynesian economics. So far, the Great Recession has led to less of a fundamental shift.

Notre Dame’s Prof. Mirowski believes that more rethinking is necessary. “Everyone thought the banks would have to change their behaviour, but they got bailed out and nothing changed. The economics profession has also been bailed out because it is so highly interlinked with the financial profession, so of course they don’t change. Why would they change?”

Indeed, economics may be the dismal science, but there is nothing dismal about the payoffs for those at the top of the heap serving as advisers and consultants and sitting on various boards. Unlike some disciplines, economics has no guidelines governing conflict of interest and disclosure.

In 2010, the Academy Award-winning documentary Inside Job exposed several disturbing examples of academic economists calling for deregulation while working for financial-services companies. And in a study of 19 prominent financial economists, published last year by the Political Economy Research Institute at the University of Massachusetts Amherst, 13 were found to own stock or sit on the boards of private financial institutions, but in only four cases were those affiliations revealed when they testified or wrote op-eds concerning financial regulation.

This year, the American Economic Association agreed to set up a committee to investigate whether economists should develop ethical guidelines similar to those already in place for sociologists, psychologists, statisticians and anthropologists.

But there appears to be little enthusiasm for the idea among mainstream economists. Prof. Lucas of the University of Chicago, in an interview with The New York Times, objected: “What disciplines economics, like any science, is whether your work can be replicated. It either stands up or it doesn’t. Your motivations and whatnot are secondary.”

Several billion pennies for their thoughts

The critics, however, are more numerous and considerably better financed than the French students a decade ago. In October, 2009, billionaire financier George Soros said that “the current paradigm has failed.” He resolved to help save economics from itself. He pledged $50-million toward the establishment of the New York-based Institute for New Economic Thinking (INET), with a mandate to promote changes in economic theory and practice through conferences, grants and campaigns for graduate and undergraduate education reforms.

Perry Mehrling, a professor of economics at New York’s Columbia University, is the chair of the curriculum task force at INET. He says his graduate students at Columbia are growing increasingly frustrated by the tendency to define the discipline by its tools instead of its subject matter – like the students in Paris a decade ago, they find little relationship between the mathematical models in class and the world outside the door.

Prof. Mehrling believes that economics education has become far too insular. Never mind cross-disciplinary study – even courses in economic history and the history of economic thought have all but disappeared, so students spend almost no time reading Smith, Keynes or other past masters.

“It’s not just that we’re not listening to sociologists,” Prof. Mehrling laments. “We’re not even listening to economists.”

He says he has no problem with teaching efficient-markets and rational-expectations theories, but as hypotheses, not catechism. “I object to the idea that these are articles of faith and if you don’t accept them, you are not a member of the tribe. These things need to be questioned and we need a broader conversation.”

The challenge, as Columbia University economist Joseph Stiglitz said at the opening conference of INET, is that “we need better theories of persistent deviations from rationality.”

Some of those theories are coming from the rapidly growing field of behavioural economics, which borrows insights about human motivation from cognitive psychology: A paper titled The Hubris Hypothesis of Corporate Takeovers, for example, examines how the egos of ambitious chief executive officers can lead them to pursue takeovers, even when all available evidence suggests that the move could be a disaster.

It is not yet clear how such new approaches can evolve into workable models, but they hint at what a post-autistic economics might look like.

Prof. Mehrling is cautiously optimistic. “There’s a recognition that things we thought were true aren’t necessarily true,” he argues, “and the world is more complicated and interesting than we thought – so all bets are off, and that’s exciting intellectually.”

Change comes slowly in academia. The few jobs that are available don’t generally go to people who challenge orthodoxy. But over the next decade, as the post-crash crop of economics students makes its impact felt in government, business and schools, the lessons learned may well seep into the mainstream.

Theories based on assumptions of rationality, efficiency and equilibrium in the marketplace are likely to be treated with a great deal more skepticism. Homo economicus is a lot more anxious, irrational, unpredictable and complex than most economists believed. And, as Adam Smith recognized, he has a moral and ethical dimension that should not be ignored.

Today, the Post-Autistic Economic Network continues to publish its newsletter, now known as the Real-World Economic Review. It remains a thorn in the side of mainstream economics. In an editorial in January, 2010, the editors called for major economics organizations to censure those economists who “through their teachings, pronouncements and policy recommendations facilitated the global financial collapse” and pointed to the “continuing moral crisis within the economics profession.”

It is unlikely that Prof. Sargent will acknowledge any of this when he travels to Stockholm to accept his (sort of) Nobel Prize in December. Nor is he likely to speak about what role, if any, his models really might have played in “wrecking the world.”

But he did make one concession in his interview with the Nobel website this week: “Many of the practical problems are ahead of where the models are,” he admitted. “That’s life.”

Ira Basen is a radio producer, journalist and educator based in Toronto.

ANTHROPOLOGIES OF FORECASTING AS ANTHROPOLOGIES OF THE FUTURE

2011 American Anthropological Association Meeting, Montreal

4-0230 ANTHROPOLOGIES OF FORECASTING AS ANTHROPOLOGIES OF THE FUTURE
Friday, November 18, 2011: 08:00-11:45

Organizers: Renzo Taddei (Federal University of Rio de Janeiro) and Karen E Pennesi (University of Western Ontario)
Chair: Karen E Pennesi (University of Western Ontario)
Discussants: Ben Orlove (Columbia University) and Renzo Taddei (Federal University of Rio de Janeiro)

08:00
Future Forecasting and the End of Relativism: A Challenge for Anthropology
Will Rollason (Brunel University)

08:15
Forecasting Credit: Living Proleptically In the Brazilian Amazon
Jeremy M Campbell (Roger Williams University)

08:30
The Secret Life of Forecasts: Examining the Production and Use of Tornado Warnings As Social Processes
Heather Lazrus (National Center for Atmospheric Research), Amy Nichols (University of Oklahoma) and Stephanie Hoekstra (University of Oklahoma)

08:45
Functions and Interpretations of Ambiguous Language In Predictions
Karen E Pennesi (University of Western Ontario)

09:00
On the Simulation of Deforestation Scenarios In Making REDD Carbon Market
Shaozeng Zhang (University of California, Irvine)

09:15
Discussant
Ben Orlove (Columbia University)

09:30
Discussion

09:45
Break

10:00
Forecasting As History: Japan’s Modern Earthquakes
Kerry Smith (Brown University)

10:15
Articulating Future Health Effects and Climate Change: Collaborative Modeling Systems and Future Epistemologies
Brandon J Costelloe-Kuehn (Rensselaer Polytechnic Institute)

10:30
Looking After Today: Resource Depletion and Hydrocarbon Culture In Industrial Trinidad
Jacob Campbell (University of Arizona)

10:45
Social Impact Assessment and the Anthropology of the Future In Canada’s Oilsands
Clinton N Westman (University of Saskatchewan)

11:00
Clouds In the Forecast: The Future, Climate Science, and Humanitarian Aid
Soo-Young Kim (Columbia University)

11:15
Discussant
Renzo Taddei (Federal University of Rio de Janeiro)

11:30
Discussion

Rick Perry officials spark revolt after doctoring environment report (The Guardian)

Scientists ask for names to be removed after mentions of climate change and sea-level rise taken out by Texas officials

Suzanne Goldenberg, US environment correspondent
guardian.co.uk, Friday 14 October 2011 13.05 BST

Republican presidential hopeful Texas Gov. Rick Perry

Rick Perry’s administration deleted references to climate change and sea-level rise from the report. Photograph: Evan Vucci/AP

Officials in Rick Perry’s home state of Texas have set off a scientists’ revolt after purging mentions of climate change and sea-level rise from what was supposed to be a landmark environmental report. The scientists said they were disowning the report on the state of Galveston Bay because of political interference and censorship from Perry appointees at the state’s environmental agency.

By academic standards, the protest amounts to the beginnings of a rebellion: every single scientist associated with the 200-page report has demanded their names be struck from the document. “None of us can be party to scientific censorship so we would all have our names removed,” said Jim Lester, a co-author of the report and vice-president of the Houston Advanced Research Centre.

“To me it is simply a question of maintaining scientific credibility. This is simply antithetical to what a scientist does,” Lester said. “We can’t be censored.” Scientists see Texas as at high risk because of climate change, from the increased exposure to hurricanes and extreme weather on its long coastline to this summer’s season of wildfires and drought.

However, Perry, in his run for the Republican nomination, has elevated denial of science, from climate change to evolution, to an art form. He opposes any regulation of industry, and has repeatedly challenged the authority of the Environmental Protection Agency.

Texas is the only state to refuse to sign on to the federal government’s new regulations on greenhouse gas emissions. “I like to tell people we live in a state of denial in the state of Texas,” said John Anderson, an oceanographer at Rice University and author of the chapter targeted by the government censors.

That state of denial percolated down to the leadership of the Texas Commission on Environmental Quality. The agency chief, who was appointed by Perry, is known to doubt the science of climate change. “The current chair of the commission, Bryan Shaw, commonly talks about how human-induced climate change is a hoax,” said Anderson.

But scientists said they still hoped to avoid a clash by simply avoiding direct reference to human causes of climate change and by sticking to materials from peer-reviewed journals. However, that plan began to unravel when officials from the agency made numerous unauthorised changes to Anderson’s chapter, deleting references to climate change, sea-level rise and wetlands destruction.

“It is basically saying that the state of Texas doesn’t accept science results published in Science magazine,” Anderson said. “That’s going pretty far.”

Officials even deleted a reference to the sea level at Galveston Bay rising six times faster than the long-term average – 3mm a year compared to 0.5mm a year – which Anderson noted was a scientific fact. “They just simply went through and summarily struck out any reference to climate change, any reference to sea level rise, any reference to human influence – it was edited or eliminated,” said Anderson. “That’s not scientific review, that’s just straightforward censorship.”

Mother Jones has tracked the changes. The agency has defended its actions. “It would be irresponsible to take whatever is sent to us and publish it,” Andrea Morrow, a spokeswoman said in an emailed statement. “Information was included in a report that we disagree with.”

She said Anderson’s report had been “inconsistent with current agency policy”, and that he had refused to change it. She refused to answer any questions. Campaigners said the censorship by the Texas state authorities was a throwback to the George Bush era when White House officials also interfered with scientific reports on climate change.

In the last few years, however, such politicisation of science has spread to the states. In the most notorious case, Virginia’s attorney general Ken Cuccinelli, a professed doubter of climate science, has spent a year investigating grants made to the prominent climate scientist Michael Mann when he was at a state university in Virginia.

Several courts have rejected Cuccinelli’s demands for a subpoena for the emails. In Utah, meanwhile, Mike Noel, a Republican member of the Utah state legislature called on the state university to sack a physicist who had criticised climate science doubters.

The university rejected Noel’s demand, but the physicist, Robert Davies, said such actions had had a chilling effect on the state of climate science. “We do have very accomplished scientists in this state who are quite fearful of retribution from lawmakers, and who consequently refuse to speak up on this very important topic. And the loser is the public,” Davies said in an email.

“By employing these intimidation tactics, these policymakers are, in fact, successful in censoring the message coming from the very institutions whose expertise we need.”

Seeing Value in Ignorance, College Expects Its Physicists to Teach Poetry (N.Y. Times)

By ALAN SCHWARZ

ANNAPOLIS, Md. — Sarah Benson last encountered college mathematics 20 years ago in an undergraduate algebra class. Her sole experience teaching math came in the second grade, when the first graders needed help with their minuses.

Sarah Benson has a Ph.D. in art history and a master’s in comparative literature, but this year she is teaching geometry. Shannon Jensen for The New York Times
And yet Ms. Benson, with a Ph.D. in art history and a master’s degree in comparative literature, stood at the chalkboard drawing parallelograms, constructing angles and otherwise dismembering Euclid’s Proposition 32 the way a biology professor might treat a water frog. Her students cared little about her inexperience. As for her employers, they did not mind, either: they had asked her to teach formal geometry expressly because it was a subject about which she knew very little.

It was just another day here at St. John’s College, whose distinctiveness goes far beyond its curriculum of great works: Aeschylus and Aristotle, Bacon and Bach. As much of academia fractures into ever more specific disciplines, this tiny college still expects — in fact, requires — its professors to teach almost every subject, leveraging ignorance as much as expertise.

“There’s a little bit of impostor syndrome,” said Ms. Benson, who will teach Lavoisier’s “Elements of Chemistry” next semester. “But here, it’s O.K. that I don’t know something. I can figure it out, and my job is to help the students do the same thing. It’s very collaborative.”

Students in Ms. Benson’s class discussing Euclid. Shannon Jensen for The New York Times

Or as St. John’s president, Chris Nelson (class of 1970), put it with a smile only slightly sadistic: “Every member of the faculty who comes here gets thrown in the deep end. I think the faculty members, if they were cubbyholed into a specialization, they’d think that they know more than they do. That usually is an impediment to learning. Learning is born of ignorance.”

Students who attend St. John’s — it has a sister campus in Santa Fe, N.M., with the same curriculum and philosophies — know that their college experience will be like no other. There are no majors; every student takes the same 16 yearlong courses, which generally feature about 15 students discussing Sophocles or Homer, and the professor acting more as catalyst than connoisseur.

What they may not know is that their professor — or tutor in the St. John’s vernacular — might have no background in the subject. This is often the case for the courses that freshmen take. For example, Hannah Hintze, who has degrees in philosophy and woodwind performance, and whose dissertation concerned Plato’s “Republic,” is currently leading classes on observational biology and Greek.

“Some might not find that acceptable, but we explore things together,” said Ryan Fleming, a freshman in Ms. Benson’s Euclid class. “We don’t have someone saying, ‘I have all the answers.’ They’re open-minded and go along with us to see what answers there can be.”

Like all new tutors, Ms. Benson, 42, went through a one-week orientation in August to reacquaint herself with Euclid, and to learn the St. John’s way of teaching. She attends weekly conferences with more seasoned tutors.

Her plywood-floor classroom in McDowell Hall is almost as dim and sparse as the ones Francis Scott Key (valedictorian of the class of 1796) studied in before the college’s original building burned down in 1909. Eight underpowered ceiling lights barely illuminated three walls of chalkboards. While even kindergarten classrooms now feature interactive whiteboards and Wi-Fi-connected iPads, not one laptop or cellphone was visible; the only evidence of contemporary life was the occasional plastic foam coffee cup.

The discussion centered not on examples and exercises, but on the disciplined narrative of Euclid’s assertions, the aesthetic economy of mathematical argument. When talk turned to Proposition 34 of Book One, which states that a parallelogram’s diagonal divides it into equal areas, not one digit was used or even mentioned. Instead, the students debated whether Propositions 4 and 26 were necessary for Euclid’s proof.

When a student punctuated a blackboard analysis with, “The self-evident truth that these triangles will be equal,” the subliminal reference to the Declaration of Independence hinted at the eventual braiding of the disciplines by both students and tutors here. So, too, did a subsequent discussion of how “halves of equals are equals themselves,” evoking the United States Supreme Court’s logic in endorsing segregation 2,200 years after Euclid died.

Earlier in the day, in a junior-level class taught by a longtime tutor about a portion of Newton’s seminal physics text “Principia,” science and philosophy became as intertwined as a candy cane’s swirls. Students discussed Newton’s shrinking parabolic areas as if they were voting districts, and the limits of curves as social ideals.

One student remarked, “In Euclid before, he talked a lot about what is equal and what isn’t. It seems here that equality is more of a continuum — we can get as close as we want, but never actually get there.” A harmony of Tocqueville was being laid over Newton’s melody.

The tutor, Michael Dink, graduated from St. John’s in 1975 and earned his master’s degree and Ph.D. in philosophy from the Catholic University of America. Like most professors here, he long ago traded the traditional three-course academic career — writing journal articles, attending conferences and teaching a specific subject — for the intellectual buffet at St. John’s. His first year included teaching Ptolemy’s “Almagest,” a treatise on planetary movements, and atomic theory. He since has taught 15 of the school’s 16 courses, the exception being sophomore music.

“You have to not try to control things,” Mr. Dink said, “and not think that what’s learned has to come from you.”

This ancient teaching method could be making a comeback well beyond St. John’s two campuses. Some education reformers assert that teachers as early as elementary school should lecture less at the blackboard while students silently take notes — the sage-on-the-stage model, as some call it — and foster more discussion and collaboration among smaller groups. It is a strategy that is particularly popular among schools that use technology to allow students to learn at their own pace.

Still, not even the most rabid reformer has suggested that biology be taught by social theorists, or Marx by mathematicians. That philosophy will continue to belong to a school whose president has joyfully declared, “We don’t have departmental politics — we don’t have departments!”

Anthony T. Grafton, a professor of history at Princeton and president of the American Historical Association, said he appreciated the approach.

“There’s no question that people are becoming more specialized — it’s natural for scholars to cover a narrow field in great depth rather than many at the same time,” he said. “I admire how St. John’s does it. It sounds both fun and scary.”

Risk is a serious matter (JC)

JC e-mail 4364, October 14, 2011.

Article by Francisco G. Nóbrega, submitted to JC Email by the author.

Modern society is awash in communication. Since “good news is no news,” the human psychological lens always registers a scenario worse than reality. The usual perception is that risks of every kind grow day by day. The global decline of violence, for example, is the subject of a recent book by Harvard University psychologist Steven Pinker (http://www.samharris.org/blog/item/qa-with-steven-pinker). Against common sense, he demonstrates objectively that we are making progress on this front.

But our minds never rest in their acute capacity to detect other sources of risk. We have a few crowd favourites: nuclear power for electricity, genetically modified foods, and catastrophic anthropogenic global warming. The potential harm of these three threats has, objectively, not materialized at all, although the third, according to its proponents, is due to arrive in the future. People are enchanted by the automobile and its ever more attractive accessories. No one thinks of banning it, even though it causes some 40,000 deaths and countless disabling injuries each year in Brazil alone. David Ropeik, of the Harvard Center for Risk Analysis, explains how easily the real danger of a situation can be distorted. The further a threat lies from common experience (as with radiation and genetically modified plants), the more easily it is manipulated, out of ignorance or vested interests, terrifying the ordinary citizen. Ropeik explains how this senseless fear becomes a stress factor and an objective health risk in its own right, and should therefore be avoided.

Within this universe, the concerns of Dr. Ferraz (“O feijão nosso de cada dia,” Jornal da Ciência, 6/10/2011) are understandable. He is a member of CTNBio, serves on the plant/environmental panel, and his area of concentration is agroecology, which explains, at least in part, his doubts. Those concerns, however, lack the substance the author suggests, and the CTNBio analysis that led to the approval of this bean is sound.

The commission is always guided by the directives of the legislation, which are broad so as to cover every possibility of risk to consumers and the environment. The technical staff exists precisely to act selectively and deliberately, examining each case on its merits. The tests are scrutinized with the rigour that the modification introduced into the plant demands for full safety. If the modifications are judged to pose no significant risk, the tests are evaluated in that light.

Tests with large numbers of animals, statistically highly reliable, would be required by the commission in the event that a transgenic plant produced, for example, a non-protein pesticide molecule that was in every way comparable to a drug made by the pharmaceutical industry. This may well happen at some point, since plants are capable of producing the most varied natural pesticides to defend themselves in the wild. Such a substance would be absorbed in the intestine and spread through organs and tissues, possibly exerting systemic and localized effects that demand evaluation. This has in fact happened, inadvertently, with a potato produced by conventional breeding in the United States. Its consumption caused illness and it was hastily recalled: it carried high levels of glycoalkaloids toxic to humans, which explained its excellent resistance to crop pests.

In the case of the Embrapa bean, no new non-protein molecule is produced, and the small RNA that interferes with viral replication, should anyone ingest the leaves and stems, is one among the hundreds or thousands of RNAs we ingest daily with any plant product. The introduced RNA was, in any case, not detected in the cooked bean grain, even using extremely powerful techniques.

The variations detected, where statistically significant (in the concentration of vitamin B2 or cysteine, for example), pose no risk whatsoever. The classic technique of tissue culture, used to generate high-quality varieties in horticulture and tree propagation, is known to produce natural variations that introduce some desirable modifications and some undesirable ones, which the breeder then selects among. This is somaclonal variation, which also affects genetically modified clones during their selection phase.

It is therefore naive, at best, to say that the Embrapa 5.1 bean “should be identical” to the parent variety, since the manipulations required to generate the transgenic line result in certain alterations that, if irrelevant, are ignored and, if deleterious, are discarded by the scientists. If we ran the same analyses, whose results worry some, on the many conventional varieties consumed in this country, the differences would be striking and irrelevant to the question of safety.

As has been noted before, there is no factual basis (biochemical or genetic) for imagining that the Embrapa bean poses a greater risk than an ordinary bean, or one improved by chemical or physical mutagenesis, which, incidentally, is not monitored nutritionally or molecularly before going to market. Without a biological rationale, the tests become superfluous formalities, and experimental noise, especially with small samples, will almost inevitably generate results that are irrelevant unless the number of animals is greatly increased (for both control and transgenic samples); it would also be prudent to include animals fed other conventional beans, to obtain a realistic sense of what the detected variations mean. Imagine the cost of this ghost hunt, triggered simply by an ill-informed application of the precautionary principle. The baseless concerns raised at every turn by those who fear the technology would apply with greater logic to conventional products.
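The point about experimental noise and small samples can be illustrated with a minimal simulation (a sketch; the group size of 10 and the 5% significance threshold are illustrative assumptions, not taken from any actual feeding study). Even when the two groups are drawn from the very same distribution, a t-test still flags a “significant” difference at roughly the nominal rate, so a battery of such comparisons on small samples will almost always turn up spurious hits:

```python
import random
import statistics

def t_statistic(a, b):
    """Two-sample t statistic with pooled variance."""
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5

def false_positive_rate(n_trials=2000, n_per_group=10, seed=42):
    """Repeatedly compare two groups drawn from the SAME distribution
    and count how often |t| exceeds the alpha = 0.05 critical value."""
    rng = random.Random(seed)
    crit = 2.101  # two-sided 5% critical value for t with 18 degrees of freedom
    hits = 0
    for _ in range(n_trials):
        a = [rng.gauss(0, 1) for _ in range(n_per_group)]
        b = [rng.gauss(0, 1) for _ in range(n_per_group)]
        if abs(t_statistic(a, b)) > crit:
            hits += 1
    return hits / n_trials

rate = false_positive_rate()
print(f"False positive rate with no real difference: {rate:.3f}")  # close to the nominal 0.05
```

With, say, 20 such endpoint comparisons per study, the chance of at least one spurious “significant” difference is about 1 − 0.95²⁰ ≈ 64%, which is the “ghost hunt” the article describes.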

Were that to happen, the planet’s agricultural production would become unviable overnight. Why not run studies of Rhizobium and nodulation on every bean on the market? Why not conduct long-term nutritional studies of conventional foods derived from mutagenesis? What logical reason exempts conventional plants from these concerns? Or is the reason metaphysical? Is the introduced alteration “against nature,” something like original sin, which, in many interpretations, consisted merely of eating the fruit of the “tree of knowledge”? Recently 41 Swedish plant scientists issued a manifesto against the over-regulation of modern genetics in Europe (reproduced on the GenPeace blog: genpeace.blogspot.com). The authors observe that, drawing a parallel with the requirements imposed on pharmaceutical products, the “logic of the current legislation suggests that only drugs produced by genetic engineering should be evaluated for undesirable effects.”

Instilling fear on the basis of suppositions does not help protect the public or the environment. Marie Curie is said to have remarked: “Nothing in life is to be feared. It is only to be understood.” I consider it irresponsible to use the “precautionary principle” as some do. Even the WHO has fallen into this trap, classifying mobile phones in group 2B for cancer risk. The radiation from these devices is roughly a million times below the energy that can produce free radicals and damage DNA. Class 2B also covers the cancer risk associated with coffee, residues from burning fossil fuels and the wearing of dentures… What the WHO has kept alive, irresponsibly, is the pretext for doubt, which will legitimize expensive and irrelevant studies whose results will be inconclusive, like the earlier mega-study. Incomparably more dangerous is using the phone while driving.
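The “million times” figure can be checked with a back-of-the-envelope photon-energy calculation (a sketch; the 1.8 GHz carrier frequency and the 4 eV bond energy are illustrative round numbers, not values from the article): a single microwave photon carries far too little energy to break a chemical bond, no matter how many photons arrive.

```python
# Compare the energy of one mobile-phone photon with a typical chemical bond.
PLANCK_H = 6.626e-34      # Planck constant, J*s
EV_IN_JOULES = 1.602e-19  # one electron-volt, in joules

carrier_hz = 1.8e9        # illustrative GSM-band carrier frequency, Hz
photon_ev = PLANCK_H * carrier_hz / EV_IN_JOULES  # E = h * f, in eV

bond_ev = 4.0             # illustrative covalent bond energy (typically ~3-5 eV)
ratio = bond_ev / photon_ev

print(f"Photon energy: {photon_ev:.2e} eV")        # on the order of 1e-05 eV
print(f"Bond / photon energy ratio: {ratio:.2e}")  # a few times 1e+05
```

The ratio comes out around half a million, consistent with the order-of-magnitude claim in the text.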

Francisco G. da Nóbrega is a professor at the Universidade de São Paulo (USP).

Secitece holds the 1st “Ceará Faz Ciência” Forum (Funcap)

BY ADMIN, 13/10/2011

With the press office of Secitece

The event will be held on October 17 and 18, in the auditorium of the Planetarium at the Dragão do Mar Centre of Art and Culture.

On October 17 and 18, the Secretariat of Science, Technology and Higher Education (Secitece) will hold the “I Fórum Ceará Faz Ciência” (“1st Ceará Does Science Forum”), on the theme “Climate change, natural disasters and risk prevention.” The initiative is part of the state programme for the National Science and Technology Week.

The Secretary of Science and Technology, René Barreira, will open the event on the 17th, at 5 p.m., in the auditorium of the Rubens de Azevedo Planetarium. On the occasion, tribute will be paid to the Ceará researcher Expedito Parente, known as the father of biodiesel, who died in September.

On the 18th, from 9 a.m., activities resume with the following talks: “A giant wave on the Brazilian coast: is it possible?”, by Prof. Francisco Brandão, head of the Seismology Laboratory of the State Civil Defence Coordination, and “The four seasons of the year in Ceará: perceiving their effects on physiology and the environment,” given by Dermeval Carneiro, professor of physics and astronomy, president of the Sociedade Brasileira dos Amigos da Astronomia and director of the Rubens de Azevedo Planetarium – Dragão do Mar.

In the afternoon, from 2:30 p.m., comes the talk “Natural disasters: how to prevent and act in risk situations,” by Lieutenant Colonel Leandro Silva Nogueira, executive secretary of the State Civil Defence Coordination. Closing the forum, at 4:30 p.m., Sonia Barreto Perdigão, agronomist at the Department of Water Resources and Environment of the Fundação Cearense de Meteorologia e Recursos Hídricos (Funceme), will speak on “Climate change and desertification in Ceará.”

Those interested in taking part in the forum, to be held on October 17 and 18 at the Dragão do Mar in Fortaleza, should pre-register. The form is available on the Secitece website. Participation is free.

Details
I Fórum Ceará Faz Ciência
Date: October 17 and 18, 2011
Venue: Auditorium of the Rubens de Azevedo Planetarium
Information: (85) 3101-6466
Registration is free.

A measure of (quantum) discord (Fapesp)

Brazilian researchers make the first direct measurement of a property that may prove very important for the development of quantum computing (montage: Ag.FAPESP)

14/10/2011

By Elton Alisson

Agência FAPESP – The fragility of quantum properties, which vanish through interaction with the environment, at finite temperature or in macroscopic bodies, is one of the biggest obstacles to the development of the long-sought quantum computers: ultrafast machines that would be able to carry out simultaneously, in a matter of seconds, operations that conventional computers would take billions of years to perform.

A group of Brazilian physicists has, for the first time, directly measured a property that may be useful for the development of quantum computing.

Derived from the project “Quantum information and decoherence,” supported by FAPESP through its Young Investigators in Emerging Centres programme, the experimental results were published on September 30 in Physical Review Letters.

On August 9, the group had published in the same journal an article describing how they had managed to measure the so-called quantum discord at room temperature.

Introduced in 2001, the concept of quantum discord denotes a non-classical correlation between two entities, such as nuclei, electrons, spins or photons, which gives rise to features that cannot be observed in classical systems.
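For reference, discord has a standard information-theoretic definition (due to Ollivier and Zurek, and independently to Henderson and Vedral); a sketch, with S denoting the von Neumann entropy:

```latex
% Quantum mutual information between subsystems A and B
I(A\!:\!B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB})

% Classical correlation: the best any local measurement \{\Pi_j\} on B can extract
J(A\!:\!B) = S(\rho_A) - \min_{\{\Pi_j\}} \sum_j p_j \, S(\rho_{A|\Pi_j})

% Quantum discord: the correlation that no local measurement can capture
\delta(A\!:\!B) = I(A\!:\!B) - J(A\!:\!B) \;\geq\; 0
```

Entangled states always carry nonzero discord, but so do many separable (non-entangled) states, which is why discord can survive in noisy, room-temperature settings where entanglement is absent.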

Até então se acreditava que essa grandeza quântica só poderia ser medida em sistemas muito bem controlados ou a baixíssimas temperaturas e isolados do meio ambiente, uma vez que qualquer interferência seria capaz de destruir a ligação entre os objetos quânticos, que era atribuída unicamente a um fenômeno físico chamado emaranhamento – o que dificultaria a concepção de um computador quântico.

“Entretanto, medimos experimentalmente essa correlação (discórdia) quântica e demonstramos que ela está presente onde não se esperava e que esse fenômeno pode ser explorado mesmo à temperatura ambiente, em situações em que há muito ruído térmico”, disse Roberto Menezes Serra, professor da Universidade Federal do ABC (UFABC) e coordenador do projeto, à Agência FAPESP.

Para medir a discórdia quântica, os pesquisadores trabalharam com uma molécula de clorofórmio, que possui um átomo de carbono, um de hidrogênio, e três de cloro.

Utilizando técnicas de ressonância magnética nuclear, eles codificaram um bit quântico no spin do núcleo do hidrogênio e outro no de carbono, em um cenário em que eles não estavam emaranhados, e demonstraram que é possível medir as correlações quânticas entre os dois spins nucleares.

Por intermédio do experimento, desenvolveram um método prático para medir correlações quânticas (a discórdia quântica) através de uma grandeza física, denominada “testemunha ocular”, que permite a observação direta do caráter quântico da correlação de um sistema. “Isso demonstrou de forma inequívoca a natureza quântica dos testes de princípios realizados em ressonância magnética nuclear à temperatura ambiente. Esses resultados podem abrir caminho para outras aplicações em informação quântica à temperatura embiente”, disse Serra.

No trabalho publicado no novo artigo, os pesquisadores brasileiros mediram outro fenômeno que haviam previsto, denominado mudança súbita de comportamento da discórdia quântica.

The effect describes how quantum discord changes when the physical system carrying it comes into contact with the environment, which causes the system to lose coherence (a phenomenon known as decoherence). In this situation, quantum discord can remain constant, insensitive to thermal noise, for a certain time, and only then begin to decay.

“Knowing the subtleties of this system’s dynamical behavior is important because, if we use quantum discord to gain an advantage in some process, such as metrology or information processing, we need to know how robust this quantum feature is against loss of coherence, so as to know how long the device can work well and which errors must be corrected,” Serra explained.

A world reference

Until a few years ago, scientists thought entanglement was an essential property for extracting gains from a quantum system, such as a greater capacity for exchanging information between quantum objects. Recently, it was discovered that this property is not necessarily fundamental for a quantum advantage in information processing, because there are protocols in which the quantum advantage is obtained in non-entangled systems. It is therefore conjectured that quantum discord may be what underlies the advantages of a quantum system.
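The claim that quantum correlations can exist without entanglement can be illustrated numerically. The sketch below is not the group’s NMR witness protocol but a generic textbook-style calculation in plain NumPy (the particular state and the grid resolution are illustrative choices): it builds a separable “classical-quantum” two-qubit state, confirms via the Peres-Horodecki partial-transpose criterion that it carries no entanglement, and still finds nonzero quantum discord by minimizing the post-measurement conditional entropy over projective measurements on qubit B.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def proj(v):
    """Projector |v><v| onto a (possibly complex) unit vector."""
    return np.outer(v, v.conj())

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)   # |+>

# Separable "classical-quantum" state: an equal mixture of |0><0| (x) |0><0|
# and |1><1| (x) |+><+|.  No entanglement, but nonzero discord, because the
# conditional states of B (|0> and |+>) are non-orthogonal.
rho = 0.5 * (np.kron(proj(ket0), proj(ket0)) + np.kron(proj(ket1), proj(ketp)))

r4 = rho.reshape(2, 2, 2, 2)               # indices [a, b, a', b']
rho_A = np.trace(r4, axis1=1, axis2=3)     # trace out B
rho_B = np.trace(r4, axis1=0, axis2=2)     # trace out A

# Quantum mutual information I(A:B) = S(A) + S(B) - S(AB)
I_AB = entropy(rho_A) + entropy(rho_B) - entropy(rho)

# Minimize the conditional entropy of A over projective measurements on B,
# parametrized by Bloch angles (theta, phi) -- brute-force grid search.
best = np.inf
I2 = np.eye(2)
for theta in np.linspace(0, np.pi, 121):
    for phi in np.linspace(0, np.pi, 121):
        n = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])
        cond = 0.0
        for P in (proj(n), I2 - proj(n)):
            M = np.kron(I2, P) @ rho @ np.kron(I2, P)
            p = np.trace(M).real
            if p > 1e-12:
                rA = np.trace(M.reshape(2, 2, 2, 2), axis1=1, axis2=3) / p
                cond += p * entropy(rA)
        best = min(best, cond)

J = entropy(rho_A) - best          # classical correlations
discord = I_AB - J                 # quantum discord D(A|B)

# Peres-Horodecki criterion: for two qubits, a non-negative partial
# transpose means the state is separable (not entangled).
pt = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
min_pt_eig = np.linalg.eigvalsh(pt).min()

print(f"mutual information I(A:B) = {I_AB:.4f} bits")
print(f"quantum discord  D(A|B)  = {discord:.4f} bits")
print(f"min partial-transpose eigenvalue = {min_pt_eig:.4f}  (>= 0: separable)")
```

For this state the search returns a discord of roughly 0.2 bits even though the partial transpose is positive, i.e., quantum correlations survive in the complete absence of entanglement.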

As a result, both discord and entanglement have come to be recognized as useful for carrying out tasks in a quantum computer. Non-entangled systems endowed with discord, however, would have the advantage of being more robust to the action of the external environment, since entanglement can disappear abruptly, in a phenomenon called “sudden death.”

“Our main interest at the moment is to advance the understanding of where the advantage of quantum computers comes from. If we know that, we will be able to build more efficient devices that consume fewer resources to control their coherence,” said Serra.

According to the researcher, the group of Brazilian physicists was the first to use nuclear magnetic resonance techniques to measure quantum discord directly, and it has become a world reference in the field.

To carry out the measurements, the UFABC researchers initially teamed up with the group led by Professor Tito José Bonagamba, of the Instituto de Física at the Universidade de São Paulo (USP) in São Carlos, who coordinated the first experiments through the project “Manipulação de spins nucleares através de técnicas de ressonância magnética e quadrupolar nuclear” (manipulation of nuclear spins via magnetic and nuclear quadrupole resonance techniques), also carried out with FAPESP support.

The most recent experiments were conducted through a collaboration between the UFABC and USP São Carlos researchers and a research group at the Centro Brasileiro de Pesquisas Físicas (CBPF), in Rio de Janeiro, led by Professor Ivan Oliveira. The researchers also had the support of the Instituto Nacional de Ciência e Tecnologia de Informação Quântica (INCT-IQ).

“Methods are now being developed at the CBPF to handle systems of three and four quantum bits which, combined with the techniques we developed to measure quantum discord and other properties, will let us test more complex protocols in quantum information science, for example in metrology and quantum thermal machines,” Serra said.

The papers “Experimentally Witnessing the Quantumness of Correlations” and “Environment-Induced Sudden Transition in Quantum Discord Dynamics,” by Serra and colleagues (doi: 10.1103/PhysRevLett.107.070501 and 10.1103/PhysRevLett.107.140403), published in Physical Review Letters, can be read at link.aps.org/doi/10.1103/PhysRevLett.107.070501 and link.aps.org/doi/10.1103/PhysRevLett.107.140403.

Tim Ingold: Designing environments for life – an outline (Blog Noquetange)

Designing environments for life – an outline*
By Maycon Lopes
10/10/2011

Committed to thinking an anthropology of becoming, that is, one that is not about things but moves with them, Ingold sketched, in what the organizers had from the start billed as the “grand lecture” of his conference series at UFMG, criticisms and proposals for how we might tread the future. Treading is not following a pre-defined route; it is leaving footprints as one goes, marking a trail, tracing. The trace is like a drawing, a design, and the act of making it already displaces us from the condition of “mere users” of design. For Ingold, designs have to fail so that the future can appropriate and destroy them. They could be thought of as predictions, and every prediction is wrong. Or, following a Deleuzian line of analysis, design could be understood as an attempt to control becoming.

Tim Ingold proposes that design be conceived, within a vital process whose essence is openness and improvisation, as an aspect: less a predetermined goal than the continuity of a movement. In this sense, design would be the production of futures, not their definition. This idea, however, contrasts (and this, I think, is the point both of Ingold and of this post) with how nature has predominantly been understood in technoscientific discourse: with precise objectives, the environment would be nothing more than a means, a manipulable thing, life sequestered with a view to attaining certain ends. The nature of scientists and policy-makers is known through calculations, graphs and images independent of those of the world we know (the phenomenal world), with which we are familiar through dwelling itself. This artificial dissociation, which appears to us in the figure of the “globe,” a space to which we do not feel we belong, as opposed to the earth we actually inhabit, is a wholly inadequate way of addressing the constant threats suffered by nature. The same dissociation opens a gap between the everyday world and the world projected by the instruments of knowledge I mentioned earlier, opposing the inhabitant’s knowledge to scientific knowledge, as if scientists did not inhabit the world.

A much-touted expression such as “sustainable development,” generally used by politicians and large corporations alike with an eye to protecting profit, is underpinned by accounting records, or by what Ingold calls the perspective of the ex-inhabitant. We inhabitants have no access to this accounting language and are thus robbed of the responsibility of caring for the environment, being (vertically) expelled from it, instead of making it a common project, by way of what Ingold called “designing environments for life.” This ontological bond would rest on the unity of life, a unity that neither the taxonomic catalogue of “biodiversity” nor the Kantian conception of surface, the stage of our skills, can capture. In the name of a social life always indivisible from ecological life (if it is even possible to polarize them like this, Ingold notes), he argues for a genealogy of the unity of life, a historical partition between society and nature, the latter generally conceived as facticity, the brute stuff of the world.

For Ingold, concepts are inherently political, and so it is in some people’s interest to distinguish humans from non-humans: although both stand in a single world, only the former, through “human agency,” are capable of building. Humans would thus be “less natural,” yet mutually involved throughout the organic world. What are we to make of the wind, the sun, the trees and their roots (where would their walking reside?)? To avoid rather than aggravate this unhappy dichotomy, he proposes conceiving the environment as a zone of mutual involvement, in which the relations among beings occur precisely through bundles of lines: as light, as air, as paths. Against coercive attempts to suppress the environment by covering it with hard, impermeable surfaces, Ingold offers rolling along the world rather than through it. Rolling along, he says, means our involvement with the environment, our own experience, which differs from the global view of technoscience. This is where design sits: not the design that innovates, but the design that improvises. Innovation stems from a reading “backwards,” while improvisation is a reading forwards, along which the world unfolds. For the anthropologist, all improvisation consists of creativity, and creativity already implies growth. Design does not predict; design anticipates.

His idea, then, is to walk with the world, to “grow together,” not in a pre-ordained world but in an incipient one. Design is not a pre-figure but a trace, a drawing, a line for a walk, yet always liable to escape the plot, like characters in a novel with lives of their own. Ingold thus defends designing as an intransitive verb, responsible for conferring life, contrary to the painter Paul Klee’s judgment of form as death. Finally, Timothy Ingold’s proposal would require greater flexibility from the world’s inhabitants, so that tension could be converted into conversation, into dialogue, into project.

http://noquetange.wordpress.com/2011/10/10/timingold/

The time of meteorology (Tome Ciência)

Meteorology is much more than a glance at the forecast when planning a weekend trip. At a moment when global warming is a threat and major climate catastrophes grow ever more frequent, the importance and responsibility of meteorologists stand out. Growing knowledge and technological innovation in the field now make it possible to predict weather phenomena with some lead time and accuracy, and quickly removing people from risk areas can save many lives. The topic of this debate was suggested by the Sociedade Brasileira de Meteorologia, an institution linked to the Sociedade Brasileira para o Progresso da Ciência (SBPC).

Participants:

Carlos Afonso Nobre, Secretary of Research and Development Policies and Programs at the Ministry of Science and Technology (MCT), headed the Centro de Previsão de Tempo e Estudos Climáticos of the Instituto Nacional de Pesquisas Espaciais (INPE) for more than ten years and is taking part in the creation, in 2011, of the Centro Nacional de Monitoramento e Alerta de Desastres Naturais.

Maria Gertrudes Justi da Silva, coordinator of the meteorology program at the Universidade Federal do Rio de Janeiro (UFRJ). A former president of the Sociedade Brasileira de Meteorologia, she sits on the federal government’s Conselho de Coordenação das Atividades de Meteorologia, Climatologia e Hidrologia.

José Marques is president of the Deliberative Council of the Sociedade Brasileira de Meteorologia. He was in the first class of meteorologists trained at a Brazilian university, graduating in 1967 from UFRJ; until then the courses existed only abroad, where he later did postdoctoral work in France.

Ednaldo Oliveira dos Santos, associate professor in the Department of Environmental Sciences of the Instituto de Florestas at the Universidade Federal Rural do Rio de Janeiro (UFRRJ), is president of the União Nacional dos Estudiosos em Meteorologia and South America’s representative on the international committee for distance education in meteorology. He is also an associate researcher at the Instituto Virtual Internacional de Mudanças Globais, at COPPE/UFRJ.

Chemistry Nobel goes to a crystal that “shouldn’t exist” (Folha de São Paulo); Nobel for unusual crystals (Fapesp)

JC e-mail 4359, October 6, 2011.

The Israeli scientist showed that a crystal structure can be formed by complex patterns that never repeat.

The meticulous laboratory notebooks of Israeli scientist Daniel Shechtman make it possible to date precisely the discovery that has just earned him this year’s Nobel Prize in Chemistry. It was on the morning of April 8, 1982 that he used a series of question marks to record his surprise at what he was seeing under the microscope: a crystal that should not exist.

For the Nobel committee, he “changed the fundamental conception of what a solid object is,” showing that atoms can organize themselves into structures of great complexity that do not repeat. That is why the finding was deemed worthy of the prize, even though it still has few practical applications.

For Nivaldo Speziali, president of the Sociedade Brasileira de Cristalografia, the laureate showed “that structural periodicity [the regular repetition of the same structures] is not necessary in the definition of a crystal.” There are examples of both artificial and natural materials with the structure of the Israeli’s quasicrystals (as they are called), and medieval art devised similar patterns.

Stubbornness – Shechtman needed great persistence, since the vast majority of scientists doubted his findings. One of them was Linus Pauling, winner of the 1954 Nobel, Speziali recounts. Because of the negative reactions, the Israeli was even expelled from the laboratory where he worked in the US. Today he is at the Israel Institute of Technology, in Haifa.

In an interview with the Nobel committee, Shechtman said his discovery taught him that “a good scientist is humble enough to be willing to consider unexpected findings and violations of established laws.”

Most of the quasicrystals discovered are created artificially, when a molten metal alloy is cooled rapidly on a spinning surface. Their three-dimensional structure hinders wave propagation, which accounts for their peculiar properties: they are poor conductors of heat and electricity and have low friction and adhesion, yet they are highly resistant, which promises broad applicability.

They would be good for reinforced steel, surgical blades and needles, frying pans and diesel engines. But few concrete applications have been developed so far, owing to their high production cost.

Islamic art already displayed quasicrystal patterns

AIQ – The year 2011 is being celebrated as the International Year of Chemistry, and a Chemistry Nobel awarded to a physicist crowns the field’s interdisciplinary character. The discovery of quasicrystals, for example, connects with physics, materials engineering, mathematics and even non-figurative art, not to mention chemistry itself. The non-repeating pattern present in quasicrystals has ancient mathematical roots: the ratio of the distances between atoms in these materials is always related to the golden ratio, described by the mathematician Fibonacci in the 13th century and known since Antiquity.

In the 1970s, Roger Penrose used the golden ratio to produce aperiodic tilings, images composed of combinations of geometric shapes in infinite variety. The mosaics of medieval Islamic art, such as those of the Alhambra palace in Spain, share the same pattern as Penrose’s tilings and the quasicrystals.
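The golden-ratio connection can be made concrete with the standard one-dimensional analogue of a quasicrystal, the Fibonacci chain (a textbook construction, not taken from the article): repeatedly substituting S → L and L → LS produces a sequence of long and short segments that is perfectly ordered yet never periodic, and whose L:S ratio converges to the golden ratio.

```python
# 1-D quasicrystal analogue: the Fibonacci chain of long (L) and short (S)
# segments, generated by the substitution rules S -> L and L -> LS.
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.6180339887

def fibonacci_chain(generations):
    word = "S"
    for _ in range(generations):
        word = "".join("LS" if c == "L" else "L" for c in word)
    return word

chain = fibonacci_chain(20)
ratio = chain.count("L") / chain.count("S")

# The chain is ordered but aperiodic: it never contains "SS" or "LLL",
# and the segment counts are consecutive Fibonacci numbers, so the
# L:S ratio approaches the golden ratio.
print(len(chain), ratio, PHI)
```

After 20 generations the chain has 10,946 segments (a Fibonacci number) and the ratio already agrees with the golden ratio to about seven decimal places, even though no stretch of the chain ever repeats periodically.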

*  *  *

Nobel for unusual crystals

06/10/2011

Agência FAPESP – The winner of the 2011 Nobel Prize in Chemistry is Dan Shechtman, of the Israel Institute of Technology (Technion), for the discovery of quasicrystals. The announcement was made on Wednesday (October 5) by the Royal Swedish Academy of Sciences.

Unlike crystals, quasicrystals are ordered structural forms, but in patterns that do not repeat. Their configurations lack the symmetries of crystals and were considered impossible until Shechtman discovered them.

On the morning of April 8, 1982, while examining an aluminum-manganese alloy under an electron microscope, the scientist saw an image that contradicted the laws of nature, and at first he doubted what he had observed.

Harder still was convincing the scientific community that this was an important discovery. One of the doubters was Linus Pauling, winner of the 1954 Nobel Prize in Chemistry.

Until then, it was thought that in all solid matter the atoms grouped within crystals in symmetric patterns repeated periodically and constantly. For scientists, this repetition was essential to obtaining a crystal.

The image Shechtman saw showed something different: atoms in a crystal could be arranged in a pattern that simply never repeats. The discovery was so controversial that the scientist himself was asked to leave his research group. The laboratory director even handed him a crystallography textbook, advising him to study more.

But time and further research showed that Shechtman was right, and his discovery ended up changing the concept and understanding of solid matter.

Non-periodic mosaics, such as the medieval ones found in Islamic buildings (the Alhambra palace in Spain, or the Darb-i Imam shrine in Iran), helped scientists understand what quasicrystals look like at the atomic level.

Like quasicrystals, these mosaics have regular patterns that follow mathematical rules but never repeat.

After Shechtman’s discovery, other scientists produced many types of quasicrystals in the laboratory. These unusual forms are also found in nature: they have been observed in mineral samples from a river in Russia and in a type of steel made in Sweden.

Quasicrystals are being tried in the most varied products, from frying pans to diesel engines.

Shechtman will receive 10 million Swedish kronor (about R$ 2.8 million) at a ceremony in Stockholm in December.

Making Funny with Climate Change (The Yale Forum on Climate Change & The Media)

Keith Kloor   September 30, 2011

Comedy may be able to make inroads with audiences in ways that ‘serious journalism’ often cannot. With stakes as high as climate science suggests, communicators should not shy away from the risks of injecting humor where appropriate.


Last week, Colorado-based science journalist Michelle Nijhuis lamented the standard environmental news story. She wrote:

“Environmental journalists often feel married to the tragic narrative. Pollution, extinction, invasion: The stories are endless, and endlessly the same. Our editors see the pattern and bury us in the back pages; our readers see it and abandon us on the subway or in the dentist’s office.”



A welcome exception to this rule, Nijhuis noted, was New Yorker writer Ian Frazier, who has injected humor into the many environmentally themed nonfiction pieces he’s penned over the years.

This might also be the key to the success of Carl Hiaasen’s best-selling novels. There is nothing new about the sleazy politics and environmental destruction that are regular themes of his books, but they get digested through wickedly funny scenes and lampooned characters. There are no sacred cows, either: tree huggers and traditional eco-villains get equally caricatured.

Writers have had a harder time using humor to communicate global warming. In the non-fiction universe, there are no Ian Fraziers tackling the issue in a quirky, sideways manner. Journalists in mainstream media treat the topic somberly and dutifully. Exhaustion may be setting in for some. Recently NPR’s Robert Krulwich wrote:

“I got a call the other day from some producer I very much admire. They wanted to talk about a series next year on global warming and I thought, why does this subject make me instantly tired? Global warming is important, yes; controversial, certainly; complicated (OK by me); but somehow, even broaching this subject makes me feel like someone’s putting heavy stones in my head.”

But if reporters are getting jaded, TV writers and comedians are eagerly joining the fray. Recent satirical novels by acclaimed writers such as Jonathan Franzen and Ian McEwan have also tackled climate change.

Whether any of these pop culture and high-minded literary endeavors is influencing attitudes is impossible to know. Still, some climate communicators see humor as their best chance to make climate issues resonate with the public at large, though the tactic can be a double-edged sword, as one climate campaigner notes:

“Humor’s capacity for radical imagination creates a mental space for potential change but also comes with a loss of control as it breaks taboos and turns the order of reality upside down and inside out. Indeed, because of this ability to destabilize the established order, George Orwell stated that every joke is a tiny revolution. It denudes power of its authority, which is true of those that we oppose but also those that we cherish. Using humor to communicate on climate change means that scientists and environmentalists lose the monopoly on framing climate change and even risk becoming the butt of the joke. However uncomfortable, this may be necessary if we truly want the public at large to take ownership of the issue.”

That some attempts at humor can backfire has already been demonstrated. But if the stakes are as high as climate science suggests, then that’s a risk climate communicators should not be afraid to take.

Keith Kloor

Keith Kloor is a New York City-based freelance journalist who writes often about the environment and climate change. (E-mail: keith@yaleclimatemediaforum.org)

A Map of Organized Climate Change Denial (Dot Earth, N.Y. Times)

October 2, 2011, 3:51 PM

By ANDREW C. REVKIN

Oct. 3, 9:00 p.m. | Updated

A chart of “key components of the climate change denial machine” has been produced by Riley E. Dunlap, regents professor of sociology at Oklahoma State University, and Aaron M. McCright, an associate professor of sociology at Michigan State University. The diagram below (reproduced here with permission) is from a chapter the two researchers wrote on organized opposition to efforts to curb greenhouse gases for the new Oxford Handbook of Climate Change and Society.

That there are such well-financed and coordinated efforts is not contentious. And this is not the first attempt to map them.

But it’s important to keep in mind that not everyone skeptical of worst-case predictions of human-driven climate disruption, or everyone opposed to certain climate policies, is part of this apparatus.

And there’s plenty to chart on the other edge of the climate debate — those groups and outlets pursuing a traditional pollution-style approach to greenhouse gases.

[Oct. 3, 9:00 p.m. | Updated: As it happens, the blogger behind Australian Climate Madness has posted a skeptics’ map of “the climate alarmism machine” (see below). I think some, though by no means all, aspects of the map are not bad. But, as with so much of the climate debate, it is an overdrawn, overblown caricature of reality.]

It’s also important to examine whether a world without such efforts — in which citizens had a clear view of both what is known, and uncertain, about the human factor in shaping climate-related risks — would appreciably change. Some insist the answer is yes. Given the deep-rooted human bias to the near and now and other aspects of our “inconvenient mind,” I’m not nearly so sure (although this doesn’t stop me from working on this challenge, of course).

Some issues with an anthropology of climate change (Imponderabilia)

By Heid Jerstad
Imponderabilia
Spring ’10 – Issue 2

Introduction: Climate change is something everyone comes across in their personal and day-to-day lives. This article explores some of the possible reasons why anthropology has been slow to take up the issue, drawing analogies with the postcolonial and feminist critiques of anthropology.

Some issues with an anthropology of climate change

Is there a stigma in anthropology around climate issues? Do you see this title and think ‘well, I switch off my lights, but this has no place in academia’? I would like to reflect a little on why this might be so. As students we learn about the ‘personal as political’ in gender theory. I think the issue of climate change (and the related, but not identical, issue of peak oil) may be a fairly close parallel to the attention given to gender issues in anthropology during the 1980s. Both feminism and the climate change movement are political movements in society, wanting to change the way people live their lives. So why is climate change present only at the margins of anthropological research?

Several scholars have issued calls to action, arguing that this area needs further research (Rayner 1989, Battersbury 2008, Crate and Nuttall 2009). So far, however, it has been hard for anthropologists to directly engage with the issue of climate change. I propose in the following to discuss and examine several reasons for this.

Firstly, anthropology has in the past few decades focused on subjectivities of difference (Moore 2009): that is to say, on minorities, colonial power imbalances and sexualities, to give a few examples. The theory developed to deal with these identity and power issues is then perhaps badly suited to address phenomena that are affecting the entire globe. All human societies seem to be experiencing some impact, regardless of which categories of difference they might fall into. In some cases, the social, economic and ecological impact of other, non-climatic changes – for instance the effect of mining and tubewells on the groundwater in Rajasthan (Jerstad 2009) – combines with climatic effects to ‘exacerbate . . . existing problems’ (Crate and Nuttall 2009:11). To comprehend this interaction, socially oriented analysis is required. The ethnographic focus of the anthropologist, sharpened as it has been by highlighting issues of difference, can contribute to more complete understandings of the complex agricultural, linguistic, ritual, local-global, differentiated forces and effects operating on various scales and infrastructures. Such research – on the societal effects of climate change – can benefit from the theory base of anthropology, and subjectivities of difference would certainly have their place in such an analysis.

Secondly, the issue of climate change forces contact between academic anthropology and the ‘hard’ sciences and ‘development.’ Each of these points of contact proves problematic in its own way.

‘Science’ has been set aside by mainstream anthropology to the degree that there is a set of ‘replacement’ parallels within the discipline – such as medical anthropology and ethnobiology. But it is within western science that the majority of the research on climate change has been done. Here scientists have become activists and found their scientific material to have ethical relevance. What they lack is an understanding of how climatic effects will impact human societies around the world existing under very different ecological and social conditions.

‘Development’ – though sometimes the site of fruitful collaboration with anthropology – operates under very different assumptions from anthropology (Mosse 2006). The tendency in development is to use climate change as an excuse to deal with existing problems such as drought or extreme weather events. Yet here there is a risk that climate change will be sidelined by governments and other internal social institutions as ‘just another issue’ for the development agencies to deal with.

Thirdly, a reluctance to engage politically, which is not new in the discipline, seems to contribute to anthropologists’ reluctance to tackle climate change as an issue. Could doing fieldwork today while ignoring ecological issues be seen as equivalent to doing fieldwork in the 1930s while ignoring the colonial presence? Both situations are political, placing anthropologists between the countries that fund them and those that provide the data for their work – countries that are themselves caught up in global power relationships. In the colonial instance, the anthropologist was often from the country colonising their area of study. Today issues of power relations are far more complex, but this is all the more reason not to ignore them. I am suggesting not only placing climate change in the ethics or methodology section of a monograph with reference to political relationships and logistical issues, but also reflecting on cultural relationships with the ‘weather,’ how it is changing and how these relationships in turn may be affected. In her work with the Sakha people of Siberia, Crate (2008) introduces her call for anthropologists to become advocates with a story of the ‘bull of winter’ losing its horns and hence its strength, signalling spring. This meteorological model no longer meshes with experienced reality for the Sakha, highlighting the cultural implications of climatic change beyond ‘mere’ agricultural or economic effects (Vedwan and Rhoades 2001).

Another analogy, touched on in the introduction, is with gender. Problematising the gendered dimension of societies is a political act, but a necessary one in order to avoid the passive politics of unquestioningly reinforcing the status quo. An anthropological study of Indian weddings without mention of the hijras – cross-dressing dancers (Nanda 1990) – for instance, might leave the reader with the general impression that gender/sexuality in India is uniformly dualistic. In the same way, leaving energy relations to economists and political scientists is itself a political act. The impacts of climate change on humans, though mediated by wind and weather, are as social as gender relations, and are products of a particular set of power relations (Hornborg 2008). By ignoring them, anthropologists risk becoming passive supporters of this system.

An anthropology of climate change is emerging (Grodzins Gold 1998, Rudiak-Gould 2009), and anthropologists must reflect on and orient themselves in relation to this. Villagers and other informants are affected by drought, floods, storms and more subtle meteorological changes that are hard to pinpoint as climate-change caused but can be assumed to be climate-change exacerbated. Would anthropological work in these areas and on these issues primarily benefit aid organisations? I don’t think so. Giving academic credibility to problems people are facing can allow governments, corporations and other bodies to act and change policy in a world where the word of a villager tends to carry very little weight.

Bibliography

Battersbury, Simon. 2008. Anthropology and Global Warming: The Need for Environmental Engagement. Australian Journal of Anthropology 19 (1)

Crate, S. A. and Nuttall, M. 2009. Anthropology and Climate Change: From Encounters to Actions. Walnut Creek, CA: Left Coast Press.

Crate, S. A. 2008. “Gone the Bull of Winter? Grappling with the Cultural Implications of and Anthropology’s Role(s) in Global Climate Change.” Current Anthropology, 49 (4), 569.

Gold, Ann Grodzins. 1998. “Sin and Rain: Moral Ecology in Rural North India.” In Lance E. Nelson ed. Purifying the Earthly Body of God: Religion and Ecology in Hindu India. Albany: State University of New York Press, 165-195.

Hornborg, A. 2008. Machine fetishism and the consumer’s burden. Anthropology Today, 24 (5).

Jerstad, H. 2009. Climate Change in the Jaisamand Catchment Area: Vulnerability and Adaptation. Unpublished report for SPWD.

Mosse, D. 2006. Anti-social anthropology? Objectivity, objection and the ethnography of public policy and professional communities. Journal of the Royal Anthropological Institute (N.S.). 12 (4), 935-956.

Moore, Henrietta. 2009. SOAS departmental seminar, 20 October.

Nanda, S. 1990. Neither man nor woman: the hijras of India. Wadsworth: Open University Press.

Rayner, S. 1989. Fiddling While the Globe Warms? Anthropology Today 5 (6)

Rudiak-Gould, P. 2009. The Fallen Palm: Climate Change and Culture Change in the Marshall Islands. VDM Verlag.

Vedwan, N. and Rhoades, R. E. 2001. Climate change in the western Himalayas of India: a study of local perception and response. Climate Research, 19, 109-117.

Heid Jerstad is a Norwegian-English MA Res student at SOAS. After completing a BA in arch and anth at Oxford, she went to India and worked on the impacts of climate change in southern Rajasthan. She is now attempting to pursue related issues in her dissertation. In her spare time she volunteers in a Red Cross shop, hosts dinner parties and fights with her sword.

Futures Impossible: a new methodology to study world events (Boingboing.net)

By Jacques Vallee at 11:36 am Thursday, Sep 15

The study of the future, as a scientific and intellectual endeavor, used to be driven by the careful extrapolation of trends, as in Herman Kahn’s The Year 2000, or the forecasting of complex interactions among many variables, as in the Club of Rome’s Limits to Growth and Paul Ehrlich’s The Population Bomb. The technologies behind these studies relied on the mathematical tools of operations research developed during World War Two and on methods for the aggregation of expert opinion, such as the Delphi technique, developed at RAND and the Institute for the Future.

The scenarios and forecasts built on this technical base were supplemented by the study of a few extreme hypothetical situations known as “wild cards” or “black swans” (major earthquake in Tokyo, terrorist attack in New York, asteroid strike in Western Europe) designed to stretch the borders of the crisis management maps and to stimulate our collective thought process—while remaining within the domain of the Possible.

Such techniques for describing the future and anticipating its opportunities and dangers have largely become obsolete because of the acceleration of technology itself and the increasing vulnerability of our society to chaotic processes that are not well behaved under most classic models.

 In the world of the 21st century, the situations faced by decision-makers in government and industry are of a wholly different nature. In an economic environment where General Motors could go bankrupt in one week, and Lehman Brothers in one afternoon, the extrapolation of trends and the wisdom of experts are still relevant, but a new methodology is needed to deal with unforeseen discontinuities. Neither of the above catastrophes was a “wild card” in anyone’s scenario. No classical futurist could imagine such discontinuities because the tools to anticipate and describe them were not available: they were truly “impossible,” just as the Fukushima nuclear disaster was deemed “impossible” by the General Electric experts who built the plant and the Japanese authorities who managed it. Similarly, as a society, we seem to be incapable of imagining healthy, positive “impossibilities” such as reconciliation in Palestine, an end to terrorism, or a world without starvation.

At the Institute for the Future, a team headed up by Bob Johansen, Kathi Vian and myself has begun to develop a typology of Impossible Futures, starting from four classes of events:

A. Some futures are deemed impossible because they would require an extraordinary convergence of several scenarios, each of which has very low probability. The bankruptcy of General Motors (Fortune One!) in one week is a case in point.

B. Some futures are deemed impossible because they would require the convergence of several scenarios on time scales that violate our knowledge of reality. The failure of the Madoff funds, for example, was deemed impossible by his investors, all of whom were successful financial experts. It happened because two low-probability events converged: (1) regulatory authorities repeatedly refused to act every time the illegal scheme was brought to their attention, and (2) the subprime crisis dried up sources of funds overnight, exposing the fraudulent structure.

C. Some futures are deemed impossible because they would require the convergence of several scenarios, including forces or components that do not exist within accepted knowledge. In A. E. van Vogt’s novel The World of Null-A (“null-A” for non-Aristotelian), a secret agent named Gosseyn is repeatedly assassinated. Each time, he is reincarnated, endowed with increased abilities, in a new body held in reserve by his masters in special sarcophagi. A future in which Gosseyn could exist lies outside the natural limits of our scientific knowledge and culture.

D. There are futures that are deemed impossible because we simply cannot imagine them. In Saddam Hussein’s culture there was no scenario in which U.S. forces could see the movement of his forces even at night, through clouds or through dust storms. Most nations still have no concept for devices that could detect underground cavities invisible from the air or from space. Even in modern American culture, the fact that remote classified facilities can be detected, visited, and accurately described by mental powers alone remains beyond accepted concepts.

To a decision-maker in business or government, simply describing such impossible future scenarios is not helpful in the absence of a methodology for detecting, understanding, and mitigating their practical effects. What is needed is a deeper grid that can be used as an overlay to highlight radical discontinuities in technology, geopolitics, social behavior or economic patterns. We believe that such a tool needs to be developed if we want to survive the new realities where worldviews collide at an accelerated pace.

The Folly of Prediction: Full Transcript (Freakonomics.com)

FREAKONOMICS

06/30/2011 | 4:58 pm

Stephen J. DUBNER: What does it mean to be a witch exactly in Romania? Are these people that we know here as psychics or fortunetellers, or are they different somehow?

Vlad MIXICH: I don’t know how is the fortuneteller in the United States. But here generally they are a woman of different ages. They can–they say they can cure some diseases. They can bring back your husband or your wife. Or they can predict your future.

DUBNER: Who is a typical client for a witch?

MIXICH: There are quite a lot of politicians who are going to witches. You know the French president, Nicolas Sarkozy, he went to witches last year. And our president in Romania, and very important politicians from different parties, they are going to witches. Some of them they were obliged to recognize they went to witches. Some of them it’s an off-the-record information. But me being a journalist, I know that information.

DUBNER: Vlad Mixich is a reporter in Bucharest, the capital of Romania. He knows a good bit about the witches there.

MIXICH: Quite a lot of them they are quite rich. They have very big houses with golden rooftops. A lot of the Romanians, they are living in small apartments in blocks. So, just going in such a building will give you a sense of majesty and respect.

DUBNER: But the Romanian witch industry has been under attack. First came a proposed law to regulate and tax the witches. It passed in one chamber of Parliament before stalling out. But then came another proposal arguing that witches should be penalized if the predictions they make don’t turn out to be true.

MIXICH: So if you are one of my clients, and if I’m a fortune teller, if I fail to predict your future, I pay a quite substantial fine to the state, or if this happens many times, I will even go to jail. The punishment is between six months and three years in jail.

DUBNER: What’s being proposed in Romania is revolutionary. It strikes me because we typically don’t hold anybody accountable for bad predictions. So, I’m wondering in Romania, let’s say, if a politician makes a bad prediction, do they get fined or penalized in any way?

MIXICH: No, not at all. In fact this is one of the hobbies of our president. He’s doing a lot of predictions, which are not coming true, of course. And after that he is reelected! Or his popularity is rising, like the sun in the morning, you know? No, anyone can do publicly a lot of predictions here in eastern Europe and not a single hair will move from his or her head.

DUBNER: C’mon people, that doesn’t seem fair, does it? I don’t care if you’re anti-witch or pro-witch or witch-agnostic. Why should witches be the only people held accountable for bad predictions? What about politicians and money managers and sports pundits? And what about you?

[THEME]

ANNOUNCER: From WNYC and APM, American Public Media, this is Freakonomics Radio. Today: The Folly of Prediction. Here’s your host, Stephen Dubner.

DUBNER: All of us are constantly predicting the future, whether we think about it or not. Right now, some small part of your brain is trying to predict what this show is going to be about. How do you do that? You factor in what you’ve heard so far. What you know about Freakonomics. Maybe you know a lot, maybe you’ve never heard of it, you might think it’s some kind of communicable disease! When you predict the future, you look for cognitive cues, for data, for guidance. Here’s where I go for guidance.

Steven LEVITT: I think to an economist, the best explanation for why there are so many predictions is that the incentives are set up in order to encourage predictions.

DUBNER: That’s Steve Levitt. He’s my Freakonomics friend and co-author, an economist at the University of Chicago.

LEVITT: So, most predictions we remember are ones which were fabulously, wildly unexpected and then came true. Now, the person who makes that prediction has a strong incentive to remind everyone that they made that crazy prediction which came true. If you look at all the people, the economists, who talked about the financial crisis ahead of time, those guys harp on it constantly. “I was right, I was right, I was right.” But if you’re wrong, there’s no person on the other side of the transaction who draws any real benefit from embarrassing you by bringing up the bad prediction over and over. So there’s nobody who has a strong incentive, usually, to go back and say, Here’s the list of the 118 predictions that were false. I remember growing up, my mother, who is somewhat of a psychic–

DUBNER: Wait, somewhat of a psychic?

LEVITT: She’s a self-proclaimed psychic. And she would predict a stock market crash every single year.

DUBNER: And she’s been right a couple times.

LEVITT: And she has been. She’s been right twice in the last 15 years, and she would talk a lot about the times she was right. I would have to remind her about the 13 times that she was wrong. And without any sort of market mechanism or incentive for keeping the prediction makers honest, there’s lots of incentive to go out and make these wild predictions. And those are the ones that are remembered and talked about. One of the predictions you hear echoed more often than just about any other is Joe Namath’s famous pronouncement that the Jets were going to win the Super Bowl. And it was unexpected. And it happened. And if the Jets had lost the Super Bowl, nobody would remember that Joe Namath made that pronouncement.

DUBNER: And conversely, you can probably find at least one player on every team that’s lost the Super Bowl in the last forty years that did predict that his team would win.

LEVITT: That’s probably right. That’s exactly right. Now, the flip side, which is perhaps surprising, is that in many cases the goal of prediction is to be completely within the pack. And so I see this a lot with pension fund managers, or endowment managers, which is if something goes wrong then as long as everybody else made the same prediction, you can’t be faulted very much.

DUBNER: Pension managers. Football players. Psychic moms. Romanian witches. Who doesn’t try to predict the future these days?

[SOUND MONTAGE OF PREDICTIONS]

DUBNER: And you know the worst thing? There’s almost nobody keeping track of all those predictions! Nobody … except for this guy …

Philip TETLOCK: Well, I’m a research psychologist, who …

DUBNER: Don’t forget your name, though.

TETLOCK: I’m Phil Tetlock and I’m a research psychologist. I spent most of my career at the University of California, Berkeley, and I recently moved to the University of Pennsylvania, where I’m cross-appointed in the Wharton School and the psychology department.

DUBNER: Philip Tetlock has done a lot of research on cognition and decision-making and bias, pretty standard stuff for an Ivy League psych PhD. But what really fascinates him is prediction.

TETLOCK: There are a lot of psychologists who believe that there is a hard-wired human need to believe that we live in a fundamentally predictable and controllable universe. There’s also a widespread belief among psychologists that people try hard to impose causal order on the world around them, even when those phenomena are random.

DUBNER: This hardwired human need, as Tetlock puts it, has created what he calls a prediction industry. Now, don’t sneer. You’re part of it, too.

TETLOCK: I think there are many players in what you might call the prediction industry. In some sense we’re all players in it. Whenever we go to a cocktail party, or a colloquium, or wherever opinions are being shared, we frequently make likelihood judgments about possible futures, and about the truth or falsity of particular claims about futures. The prediction business is a big business on Wall Street, and we have futures markets and so forth designed to regulate speculation in those areas. Obviously, government has a great interest in prediction. They create large intelligence agency bureaucracies and systems to help them achieve some degree of predictability in a seemingly chaotic world.

DUBNER: Let me read something that you have said or written in the past. “This determination to ferret out order from chaos has served our species well. We’re all beneficiaries of our great collective successes in pursuit of deterministic regularities in messy phenomena — agriculture, antibiotics, and countless other inventions.” So talk to me for a moment about the value of prediction. Obviously much has been gained, and much is still to be gained. But do we perhaps overvalue prediction?

TETLOCK: I think there’s an asymmetry of supply and demand. I think there is an enormous demand for accurate predictions in many spheres of life in which we don’t have the requisite expertise to deliver. And when you have that kind of gap between demand and real supply you get the infusion of fake supply.

DUBNER: “Fake supply.” I like this guy, this Philip Tetlock. He’s not an economist, but he knows the laws of supply and demand can’t just be revoked. So if there’s big demand for prediction in all realms of life, and not enough real supply to satisfy it, what does this “fake supply” sound like?

[SOUND MONTAGE OF COULDS]

DUBNER: There’s a punditocracy out there, a class of people who predict ad nauseam, often on television. They can be pretty good at making their predictions tough to audit.

TETLOCK: It’s the art of appearing to go out on a limb without actually going out on a limb. For example, the word “could”: something “could” happen; the room you happen to be sitting in could be struck by a meteor in the next 23 seconds. That makes perfect sense, but the probability of course is point zero, zero, zero, zero, et cetera, one. It’s not zero, but it’s extremely low. In fact, the possible meanings people attach to the word “could” range from 0.01 to 0.6, which covers more than half the probability scale right there.

DUBNER: Look, nobody likes a weasel. So more than 20 years ago, Tetlock set out to conduct one of the largest empirical studies, ever, of predictions. He chose to focus on predictions about political developments around the world. He enlisted some of the world’s foremost experts — the kind of very smart people who have written definitive books, who show up on CNN or on the Times’s op-ed page.

TETLOCK: In the end we had close to three hundred participants. And they were very sophisticated political observers. Virtually all of them had some post-graduate education. Roughly two-thirds of them had PhDs. They were largely political scientists, but there were some economists and a variety of other professionals as well.

DUBNER: And they all participated in your study anonymously, correct?

TETLOCK: That was a very important condition for obtaining cooperation.

DUBNER: Now, if they were not anonymous then presumably we would recognize some of their names, these are prominent people at political science departments, economics departments at I’m guessing some of the better universities around the world, is that right?

TETLOCK: Well, I don’t want to say too much more, but I think you would recognize some of them, yes. I think some of them had substantial Google counts.

SJD NARR: The study became the basis of a book Tetlock published a few years ago, called “Expert Political Judgment.” There were two major rounds of data collection, the first beginning in 1988, the other in 1992. These nearly 300 experts were asked to make predictions about dozens of countries around the world. The questions were multiple choice. For instance: In Democracy X — let’s say it’s England — should we expect that after the next election, the current majority party will retain, lose, or strengthen its status? Or, for Undemocratic Country Y — Egypt, maybe — should we expect the basic character of the political regime to change in the next five years? In the next ten years? And if so, in what direction, and to what effect? The experts made predictions within their areas of expertise and outside them, and they were asked to rate their confidence in their predictions. So after tracking the accuracy of about 80,000 predictions by some 300 experts over the course of 20 years, Philip Tetlock found:

TETLOCK: That experts thought they knew more than they knew. That there was a systematic gap between the subjective probabilities that experts were assigning to possible futures and the objective likelihoods of those futures materializing.

DUBNER: Let me translate that for you. The experts were pretty awful. And you think: awful compared to what? Did they beat a monkey with a dartboard?

TETLOCK: Oh, the monkey with a dartboard comparison, that comes back to haunt me all the time. But with respect to how they did relative to, say, a baseline group of Berkeley undergraduates making predictions, they did somewhat better than that. Did they do better than an extrapolation algorithm? No, they did not. They did for the most part a little bit worse than that. How did they do relative to purely random guessing strategy? Well, they did a little bit better than that, but not as much as you might hope.

DUBNER: That “extrapolation algorithm” that Tetlock mentioned? That’s simply a computer programmed to predict “no change in current situation.” So it turned out these smart, experienced, confident experts predicted the political future about as well, if not slightly worse, than the average daily reader of The New York Times.
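The “no change” baseline that the experts failed to beat is simple enough to sketch in a few lines. Everything below is invented for illustration (the outcomes and the hypothetical expert are not data from Tetlock’s study); it just shows why a status-quo predictor is hard to beat when the status quo usually holds:

```python
# Hypothetical outcomes for ten countries: did the political status quo hold?
outcomes = ["no_change", "no_change", "change", "no_change", "change",
            "no_change", "no_change", "no_change", "change", "no_change"]

def accuracy(predictions, outcomes):
    """Fraction of predictions that matched the realized outcome."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

# The extrapolation algorithm: always predict "no change in current situation."
baseline = ["no_change"] * len(outcomes)

# A hypothetical expert who confidently calls several upheavals.
expert = ["no_change", "change", "change", "no_change", "no_change",
          "change", "no_change", "change", "no_change", "no_change"]

print(accuracy(baseline, outcomes))  # 0.7: the status quo usually held
print(accuracy(expert, outcomes))    # 0.5: worse than the naive baseline
```

Because dramatic change is rare, every upheaval the expert calls that doesn’t happen costs more accuracy than the occasional correctly called revolution earns back.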

TETLOCK: I think the most important takeaway would be that the experts think they know more than they do. They were systematically overconfident. Some experts were really massively overconfident. And we were able to identify those experts based on certain characteristics of their belief system and their cognitive style, their thinking style.

DUBNER: OK. So now we’re getting into the nitty-gritty of what makes people predict well or predict poorly. What are the characteristics then of a poor predictor?

TETLOCK: Dogmatism.

DUBNER: It can be summed up that easily?

TETLOCK: I think so. I think an unwillingness to change one’s mind in a reasonably timely way in response to new evidence. A tendency, when asked to explain one’s predictions, to generate only reasons that favor your preferred prediction and not to generate reasons opposed to it.

DUBNER: And I guess what’s striking to me and I’d love to hear what you had to say about this is that it’s easy to provide one word, prediction, to many, many, many different realms in life. But those realms all operate very differently — so politics is different from economics, and predicting a sports outcome is different than predicting, you know, an agricultural outcome. It seems that we don’t distinguish so much necessarily and that there’s this modern sense almost that anything can be and should be able to be predicted. Am I kind of right on that, or no?

TETLOCK: I think there’s a great deal of truth to that. I think it is very useful, in talking about the predictability of the modern world, to distinguish those aspects of the world that show a great deal of linear regularity from those parts of the world that seem to be driven by complex systems that are decidedly nonlinear and decidedly difficult, if not impossible, to predict.

DUBNER: Talk to me about a few realms that generally are very, very hard to predict, and a few realms that generally are much easier.

TETLOCK: Predicting Scandinavian politics is a lot easier than predicting Middle Eastern politics.

DUBNER: Yes, that was the first one that came to my mind too! All right, but keep going.

TETLOCK: The thing about the radically unpredictable environments is that they often appear for long periods of time to be predictable. So, for example, if you had been a political forecaster predicting regime longevity in the Middle East, you would have done extremely well predicting in Egypt that Mubarak would continue to be the president of Egypt year after year after year in much the same way that if you had been a Sovietologist you would have done very well in the Brezhnev era predicting continuity. There’s an aphorism I quote in the “Expert Political Judgment” book from Karl Marx. I’m obviously not a Marxist but it’s a beautiful aphorism that he had which was that, “When the train of history hits a curve, the intellectuals fall off.”

DUBNER: Coming up: Who do you predict we’ll hear from next — a bunch of people who are awesomely good at predicting the future? Yeah, right. Maybe later. First, we’ll hear some more duds — from Wall Street, the NFL, and … the cornfield.

[UNDERWRITING]

ANNOUNCER: From American Public Media and WNYC, this is Freakonomics Radio. Here’s your host, Stephen Dubner.

DUBNER: So Philip Tetlock has sized up the people who predict the future–geopolitical change, for instance–and determined that they’re not very good at predicting the future. He also tells us that their greatest flaw is dogmatism–sticking to their ideologies even when presented with evidence that they’re wrong. You buy that? I buy it. Politics is full of ideology; why shouldn’t the people who study politics be at least a little bit ideological? So let’s try a different set of people, people who make predictions that, theoretically at least, have nothing to do with ideology. Let’s go to Wall Street.

[SOUND EFFECT: WALL STREET MONTAGE]

Christina FANG: I’m Christina Fang, a Professor of Management at New York University’s business school.

DUBNER: Christina Fang, like Philip Tetlock, is fascinated with prediction:

FANG: Well, I guess generally forecasting about anything, about technology, about a product, whether it will be successful, about whether an idea, a venture idea could take off, a lot of things, not just economic but also business in general.

DUBNER: Fang wasn’t interested in just your street-level predictions, though. She wanted to know about the Big Dogs, the people who make bold economic predictions that carry price tags in the many millions or even billions of dollars. Along with a fellow researcher, Jerker Denrell, Fang gathered data from the Wall Street Journal’s Survey of Economic Forecasts. Every six months, the paper asked about 50 top economists to predict a set of macroeconomic numbers — unemployment, inflation, gross national product, things like that. Fang audited seven consecutive surveys, with an eye toward a particular question: when someone correctly predicts an extreme event — a market crash, maybe, or a sudden spike in inflation — what does that say about his overall forecasting ability?

FANG: In the Wall Street Journal survey, if you look at the extreme outcomes, either extremely bad outcomes or extremely good outcomes, you see that those people who correctly predicted either extremely good or extremely bad outcomes are likely to have a lower overall level of accuracy. In other words, they’re doing poorer in general.

SJD NARR: Uh-oh. You catching this?

FANG: Those people who happen to predict the extreme events accurately also happen to have a lower overall level of accuracy.

DUBNER: So I can be right on the big one but if I’m right on the big one I generally will tend to be more often wrong than the average person.

FANG: On average–

DUBNER: On average.

FANG: Across everyday predictions as well. And our research suggests that for someone who has successfully predicted those events, we are going to predict that they are not likely to repeat their success very often. In other words, their overall capability is likely to be not as impressive as their apparent success seems to be.
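One way to see how that result can arise is a toy simulation. This is entirely made up, not Denrell and Fang’s actual model or data: the assumption is simply that the forecasters most willing to call extreme events are also the noisiest, so conditioning on “called the big one” selects for low overall accuracy.

```python
import random

random.seed(42)

EXTREME_RATE = 0.1  # extreme outcomes happen 10% of the time

def simulate_forecaster(boldness, rounds=200):
    """Return (overall accuracy, whether they ever called an extreme right)."""
    hits = 0
    called_extreme_right = False
    for _ in range(rounds):
        outcome = "extreme" if random.random() < EXTREME_RATE else "normal"
        # Bold forecasters go out on a limb more often, at random.
        pred = "extreme" if random.random() < boldness else "normal"
        if pred == outcome:
            hits += 1
            if outcome == "extreme":
                called_extreme_right = True
    return hits / rounds, called_extreme_right

# 50 cautious forecasters who always predict "normal", and 50 bold ones.
population = ([simulate_forecaster(boldness=0.0) for _ in range(50)]
              + [simulate_forecaster(boldness=0.4) for _ in range(50)])

overall = sum(acc for acc, _ in population) / len(population)
callers = [acc for acc, hit in population if hit]
callers_avg = sum(callers) / len(callers)

print(round(overall, 2))      # population-wide average accuracy
print(round(callers_avg, 2))  # lower: only the bold, noisy forecasters
                              # ever get an extreme call right
```

The point is purely a selection effect: filtering the population for “correctly predicted an extreme event” leaves only the forecasters whose overall judgment is noisiest.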

DUBNER: So the people who make big, bold, correct predictions are in general worse than average at predicting the economic future. Now, why is this a problem? Maybe they’re just like home-run hitters — y’know, a lot of strikeouts but a lot of power too. All right, I’ll tell you why it’s a problem. Actually, I’ll have Steve Levitt tell you.

LEVITT: The incentives for prediction makers are to make either cataclysmic or utopian predictions, right? Because you don’t get attention if I say that what’s going to happen tomorrow is exactly as what’s going to happen today…

DUBNER: You don’t get on TV.

LEVITT: I don’t get on TV. If it happens to come true, who cares? I don’t get any credit for it coming true either.

DUBNER: There’s a strong incentive to make extreme predictions; because, seriously, who tunes in to hear some guy say that “Next year will be pretty much like last year”? And once you have been right on an extreme forecast — let’s say you predicted the 2008 market crash and the Great Recession — even if you were predicting it every year, like Steve Levitt’s mother — you’ll still be known as The Guy Who Called the Big One. And even if all your follow-up predictions are wrong, you still got the Big One right. Like Joe Namath.

All right, look. Predicting the economy? Predicting the political future? Those are hard. Those are big, complex systems with lots of moving parts. So how about football? If you’re an NFL expert, how hard can it be to forecast, say, who the best football teams will be in a given year? We asked Freakonomics researcher Hayes Davenport to run the numbers for us:

Hayes DAVENPORT: Well, I looked at the past three years of expert picking from the major NFL prediction outlets, which are USA Today, SportsIllustrated.com and ESPN.com. We looked at a hundred and five sets of picks total. They’re picking division winners for each year, as well as the wild card for that year. So they’re basically picking the whole playoff picture for that year.

DUBNER: So talk about just kind of generally the degree of difficulty of making this kind of a pick.

DAVENPORT: Well, if you’re sort of an untrained animal, making NFL picks, you’re going to have about a twenty-five percent chance of picking each division correctly because there are only four teams.

DUBNER: All right so Hayes, you’re saying that an untrained animal would be about twenty-five percent accurate, picking one out of four. But what about a trained animal, like me, a casual fan? How do I do compared to the experts?

DAVENPORT: Right. So if you’re cutting off the worst team in each division, if you’re not picking among those, you’ll be right thirty-three percent of the time, one in three, and the experts are right about thirty-six percent of the time, so just a little better than that.

DUBNER: OK, so you’re saying they’re picking with about thirty-six percent accuracy, and I or someone picking by chance would be at about thirty-three percent accuracy. That’s a three percentage point improvement, or about ten percent better, so maybe we should say, you know, that’s not bad. If you beat the stock market by ten percent every year you’d be doing great. So is these NFL pundits being thirty-six percent right really wonderful, or–

DAVENPORT: I wouldn’t say that because there’s a specific fallacy these guys are operating from, which is they tend to rely much too heavily on the previous year’s standings in making their picks for the following year. They play it very conservatively. But there’s a very high level of parity in the NFL right now, so that’s not exactly how it works.
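The back-of-the-envelope numbers in this exchange work out as follows. A minimal sketch, using only the rates quoted above (the percentages are from the conversation, not an independent dataset):

```python
# Chance baselines vs. the experts' measured hit rate for picking
# one division winner out of four NFL teams.

naive = 1 / 4       # untrained guess among four teams: 25%
informed = 0.33     # rule out the worst team, guess among three: ~33%
experts = 0.36      # expert accuracy measured across the three outlets

points = experts - informed    # three percentage points of edge...
relative = points / informed   # ...about a ten percent relative improvement

print(round(points * 100))     # 3
print(round(relative * 100))   # 9
```

The "ten percent better" in the dialogue is the relative edge over the informed-fan baseline, not over random guessing among all four teams.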

DUBNER: Tell me about some of the pundits who, whether by luck or by brilliance and hard work, turn out to be really, really good.

DAVENPORT: Sure. There are two guys from ESPN who are sort of far ahead of the field. One is Pat Yasinskas, and the other is John Clayton, who is pretty well known; he makes a lot of appearances on SportsCenter and he’s kind of a nebbishy, professorial type. And they perform much better than everyone else because they’re excellent wild-card pickers. They’re the only people who have correctly predicted both wild card teams in a conference in a season. But they’re especially good because they actually play it much safer than everyone else.

DUBNER: Now you say that they are very good. Persuade me that they’re good and not lucky.

DAVENPORT: I can’t do that. There’s a luck factor involved in all of these predictions. For example, if you pick the Patriots in 2008 and Tom Brady gets injured, and they drop out of the playoffs, there’s very little you can do to predict that. So injuries will mess with prediction all the time. And other turnover rates in football that are sort of unpredictable. So there’s a luck factor to all of this.

DUBNER: So whether it’s football experts calling Sunday’s game or economists forecasting the economy, or political pundits looking for the next revolution, we’re talking about accuracy rates that barely beat a coin toss. But maybe all these guys deserve a break. Maybe it’s just inherently hard to predict the future of other human beings. They’re so malleable; so unpredictable! So how about a prediction where human beings are incidental to the main action?

Joe PRUSACKI: I’m Joe Prusacki and I am the Director of Statistics Division with USDA’s National Agricultural Statistics Service, or NASS for short.

DUBNER: You grew up on a farm, yeah?

PRUSACKI: Uh-huh: Yep, I grew up in–I always call it “deep southern” Illinois. I’m sitting here in Washington DC and where I grew up in Illinois is further south than where I’m sitting today. We raised…we had corn, soybeans and raised hogs.

DUBNER: You’ve heard of Anna Wintour, right? The fabled editor of Vogue magazine? Joe Prusacki is kinda like Anna Wintour for farmers. He puts out publications that are read by everyone who’s anyone in the industry — titles like “Acreage” and “Prospective Plantings” and “Crop Production.” Prusacki’s reports carry running forecasts of crop yields for cotton, soybeans, wheat and corn.

PRUSACKI: Most of the time our monthly forecasts are, I can guarantee you, within five percent of the final, and most of the time within two to three percent. And someone would say that seems very good. But in the agricultural world, the users expect us to be much more precise in our forecasts.

DUBNER: So how does this work? How does the USDA forecast something as vast as the agricultural output of American farmers?

PRUSACKI: At the beginning of March, we will conduct a large survey of farmers and ranchers across the United States; the sample size this year was about 85,000.

DUBNER: The farmers are asked how many acres they plan to devote to each crop. Corn, let’s say. Then, in late July, the USDA sends out a small army of “enumerators” into roughly 1,900 cornfields in 10 states. These guys mark off plots of corn, 20 feet long by two rows across.

PRUSACKI: They’re randomly placed. We have randomly selected fields, and random locations within each field. So you may get a sample that’s maybe 20 paces into the field and 40 rows over, and you may get one that’s 250 paces into the field and 100 rows over.

DUBNER: The enumerators look at every plant in that plot.

PRUSACKI: And then they’ll count what they see or anticipate to be ears based on looking at the plant.

DUBNER: A month later, they go back out again and check the cornstalks, check the ears.

PRUSACKI: Well, you could have animal loss, animal might chew the plant off, the plant may die. So all along we’re updating the number of plants, all along we’re updating the number of ears. The other thing we need, you need an estimate of ear weight or fruit weight.

DUBNER: So they go out again, cut off a bunch of ears and weigh them. But wait: still not done. After the harvest, there’s one more round of measurement.

PRUSACKI: Once the field is harvested, and the machine has gone through the field, the enumerator will go back out to the field, they’ll lay out another plot–just beyond the harvest area where we were–and they will go through and pick up off the ground any kernels that are left on the ground, pieces of ears of corn and such on the ground so we get a measure of harvest loss.

DUBNER: So this sounds pretty straightforward, right? Compared to predicting something like the political or economic future, estimating corn yield based on constant physical measurements of corn plants is pretty simple. Except for one thing. It’s called the weather. Weather remains so hard to predict in the long term that the USDA doesn’t even use forecasts; it uses historic averages instead.
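For readers who like to see the arithmetic, the plot-to-acre scaling Prusacki describes can be sketched in a few lines. Everything here (the constants, the default plot size, the function name) is illustrative, not the USDA's actual formula:

```python
# Hypothetical sketch of scaling a small-plot corn measurement to bushels per
# acre, loosely following the objective-yield steps described above. All
# constants and names are illustrative; this is NOT the USDA's actual formula.

def estimate_bushels_per_acre(ears_per_plot, grams_per_ear, harvest_loss_grams,
                              plot_area_sq_ft=100):
    """Scale one plot (default: 2 rows x 20 ft, ~100 sq ft) up to an acre."""
    SQ_FT_PER_ACRE = 43_560
    GRAMS_PER_BUSHEL = 25_400  # roughly 56 lb of shelled corn
    grain_grams = ears_per_plot * grams_per_ear - harvest_loss_grams
    plots_per_acre = SQ_FT_PER_ACRE / plot_area_sq_ft
    return grain_grams * plots_per_acre / GRAMS_PER_BUSHEL

# e.g. 60 ears counted, 150 g of grain per ear, 300 g found behind the combine
print(round(estimate_bushels_per_acre(60, 150, 300)))  # 149
```

The point of the sketch is how many separately measured quantities feed the estimate; a hot, dry August that shrinks ear weight moves the whole number.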

DUBNER: So Joe, talk to me about what happened last year with the USDA corn forecast. You must have known this was coming from me. So the Wall Street Journal’s headline was: “USDA Flubs in Predicting Corn Crops.” Explain what happened.

PRUSACKI: Well, this is the weather factor that came into play. It turned out pretty hot and pretty dry in most of the growing region. And I had asked a few folks that are out and about in Iowa what happened. They said this is just a really strange year. We just don’t know. Now, if someone says, did we flub it? I don’t know. It was the forecast based on the information I had as of August 1. Now, September 1, I had a different set of information. October 1, I had a different set of information. Could we have done a better job?

DUBNER: A lot of people thought they could have. Last June, the USDA lowered its estimate of corn stockpiles; and in October, it cut its estimate of corn yield. After the first report, the price of corn spiked 9 percent. The second report? Another 6 percent. Joe Prusacki got quite a few e-mails:

PRUSACKI: OK, the first one is, this was: “Thanks a lot for collapsing the grain market today with your stupid (and the word is three letters, begins with an ‘a,’ and then it has two dollar signs) USDA report.”

“As bad as the stench of dead bodies in Haiti must be, it can’t even compare to the foul stench of corruption emanating from our federal government in Washington DC.”

DUBNER: It strikes me that there’s room for trouble here in that your forecasts are used by a lot of different people who engage in a lot of different markets, and your research can move markets. I’m wondering what kind of bribes maybe come your way?

PRUSACKI: It’s interesting, I have people that call (we call them “fishers”). They call maybe a day or two before we’re finishing our work, and I tell them, I say, “Why do you do this? We’ve had this discussion before.” There’s a couple of things: one, I sign a confidentiality statement every year that says I shall not release any information before its due time or bad things happen. It’s a $100,000 fine or time in prison. It’s like, the dollar fine, OK. It’s the prison part that bothers me!

DUBNER: But there’s got to be a certain price at which–so let’s say I offered you, I came to you and I said–Joe, $10 million for a 24-hour head start on the corn forecast.

PRUSACKI: I’m not going to do it. Trust me, somebody would track me down.

DUBNER: I hear you.

PRUSACKI: Again, the prison time, it bothers me.

DUBNER: All right, so Joe Prusacki probably can’t be bought. And the USDA is generally considered to do a pretty good job with crop forecasts. But: look how hard the agency has to work, measuring corn fields row by row, going back to look for animal loss and harvest loss. And still, its projection, which is looking only a few months into the future, can get thrown totally out of whack by a little stretch of hot, dry weather. That dry spell was essentially a random event, kind of like Tom Brady’s knee getting smashed. I hate to tell you this but the future is full of random events. That’s why it’s so hard to predict. That’s why it can be scary. Do we know this? Of course we know it. Do we believe it? Mmmmm.

Some scholars say that our need for prediction is getting worse — or, more accurately, that we get more upset now when the future surprises us. After all, as the world becomes more rational and routinized, we often know what to expect. I can get a Big Mac not only in New York but in Beijing, too — and they’ll taste pretty much the same. So when you’re used to that, and when things don’t go as expected — watch out.

Our species has been trying to foretell the future forever. Oracles and goat entrails and roosters pecking the dirt. The oldest religious texts are filled with prediction. I mean, look at the afterlife! What is that if not a prediction of the future? A prediction that, as far as I can tell, can never be categorically refuted or confirmed. A prediction so compelling that it remains all these years later a concept around which billions of people organize their lives. So what do you see when you gaze into the future? A yawning chasm of random events — or do you look for a neat pattern, even if no such pattern exists?

Nassim TALEB: It’s much more costly for someone to not detect a pattern.

DUBNER: That’s Nassim Taleb, the author of “Fooled By Randomness” and “The Black Swan.”

TALEB: It’s much costlier for us — as a race, to make the mistake of not seeing a leopard than having the illusion of pattern and imagining a leopard where there is none. And that error, in other words, mistaking the non-random for the random, which is what I call the “one-way bias.” Now that bias works extremely well, because what’s the big deal of getting out of trouble? It’s not costing you anything. But in the modern world, it is not quite harmless. Illusions of certainty makes you think that things that haven’t exhibited risk, for example the stock market, are riskless. We have the turkey problem — the butcher feeds the turkey for a certain number of days, and then the turkey imagines this is permanent.

DUBNER: “The butcher feeds the turkey and the turkey imagines this is permanent.” So you’ve got to ask yourself: who am I? The butcher? Or the turkey? Coming up: hedgehogs and foxes — and a prediction that does work. Here’s a hint: if you like this song, [MUSIC], you’ll probably like this one too: [MUSIC].

[UNDERWRITING]

ANNOUNCER: From American Public Media and WNYC, this is Freakonomics Radio.

DUBNER: Hey, guess what, Sunshine? Al Gore didn’t win Florida. Didn’t become president either. Try walking that one back. So we are congenital predictors, but our predictions are often wrong. What then? How do you defend your bad predictions? I asked Philip Tetlock what all those political experts said when he showed them their results. He had already stashed their excuses in a neat taxonomy:

TETLOCK: So, if you thought that Gorbachev, for example, was a fluke, you might argue, well, my understanding of the Soviet political system is fundamentally right, and the Soviet Politburo, but for some quirky statistical aberration, would have gone for a more conservative candidate. Another argument might be, well, I predicted that Canada would disintegrate, that Quebec would secede from Canada, and it didn’t secede, but the secession almost did succeed because there was a fifty point one percent vote against secession, and that’s well within the margin of sampling error.

DUBNER: Are there others you want to name?

TETLOCK: Well another popular prediction is “off on timing.” That comes up quite frequently in the financial world as well. Many very sophisticated students of finance have commented on how hard it is, saying the market can stay irrational longer than you can stay liquid, I think is George Soros’s expression. So, “off on timing” is a fairly popular belief-system defense as well. And I predicted that Canada would be gone. And you know what? It’s not gone yet. But just hold on.

DUBNER: You answered very economically when I asked you what are the characteristics of a bad predictor; you used one word: dogmatism. What are the characteristics, then, of a good one?

TETLOCK: Capacity for constructive self-criticism.

DUBNER: How does that self-criticism come into play and actually change the course of the prediction?

TETLOCK: Well, one sign that you’re capable of constructive self-criticism is that you’re not dumbfounded by the question: What would it take to convince you you’re wrong? If you can’t answer that question you can take that as a warning sign.

DUBNER: In his study, Tetlock found that one factor was more important than any other in someone’s predictive ability: cognitive style. You know the story about the fox and the hedgehog?

TETLOCK: Isaiah Berlin tells us that the quotation comes from the Greek warrior poet Archilochus 2,500 years ago. And the rough translation was: the fox knows many things, but the hedgehog knows one big thing.

DUBNER: So, talk to me about what the foxes do as predictors and what the hedgehogs do as predictors.

TETLOCK: Sure. The foxes tend to have a rather eclectic, opportunistic approach to forecasting. They’re very pragmatic. A famous aphorism by Deng Xiaoping was he “didn’t care if the cat was white or black as long as it caught mice.” And I think the attitude of many foxes is they really didn’t care whether ideas came from the left or the right, they tended to deploy them rather flexibly in deriving predictions. So they often borrowed ideas across schools of thought that hedgehogs viewed as more sacrosanct. There are many subspecies of hedgehog. But what they have in common is a tendency to approach forecasting as a deductive, top-down exercise. They start off with some abstract principles, and they apply those abstract principles to messy, real-world situations, and the fit is often decidedly imperfect.

DUBNER: So foxes tend to be less dogmatic than hedgehogs, which makes them better predictors. But, if you had to guess, who do you think is more likely to show up on TV or in an op-ed column, the pragmatic, nuanced fox or the know-it-all hedgehog?

[SOUND MONTAGE]

DUBNER: You got it!

TETLOCK: Hedgehogs, I think, are more likely to offer quotable sound bites, whereas foxes are more likely to offer rather complex, caveat-laden sound bites. They’re not sound bites anymore if they’re complex and caveat-laden.

DUBNER: So, if you were to gain control of, let’s say, a really big media outlet, the New York Times, or NBC TV, and you said, you know, I want to dispense a different kind of news and analysis to the public, what would you do? How would you suggest building a mechanism to do a better job of keeping this kind of poor expert prediction off the airwaves?

TETLOCK: I’m so glad you asked that question. I have some specific ideas about that. And I don’t think they would be all that difficult to implement. I think they should try to keep score more. I think there’s remarkably little effort in tracking accuracy. If you happen to be someone like Tom Friedman or Paul Krugman, or someone who’s at the top of the pundit pecking order, there’s very little incentive for you to want to have your accuracy tested because your followers are quite convinced that you’re extremely accurate, and it’s pretty much a game you can only lose.
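Tetlock’s “keep score” idea has a standard statistical form: the Brier score, which grades probabilistic forecasts by squared error. Here is a small sketch with made-up pundits and outcomes (nothing below comes from Tetlock’s actual data):

```python
# Sketch of forecast scorekeeping via the Brier score: mean squared error
# between stated probabilities and actual outcomes (0 = perfect, 1 = worst).
# The pundits and all numbers below are invented for illustration.

def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 1 if it happened, else 0."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes = [1, 0, 0, 1]               # what actually happened
bold_pundit = [0.9, 0.9, 0.1, 0.1]    # confident, half wrong
hedged_pundit = [0.7, 0.4, 0.3, 0.6]  # caveat-laden but calibrated
print(round(brier_score(bold_pundit, outcomes), 3))    # 0.41
print(round(brier_score(hedged_pundit, outcomes), 3))  # 0.125
```

Note that the bold pundit’s quotable certainty costs him: a lower Brier score is better, and the hedged forecaster wins.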

DUBNER: Can you imagine? Every time a pundit appeared on TV, the network would list his batting average, right after his name and affiliation. You think that might cut down on blowhard predictions just a little bit? Looking back at what we’ve learned so far, it makes me wonder: maybe the first step toward predicting the future should be to acknowledge our limitations. Or–at the very least–let’s start small. For instance: if I could tell you what kind of music I like, and then you could predict for me some other music I’d want to hear. That actually already exists. It’s called Pandora Radio. Here’s co-founder Tim Westergren.

Tim WESTERGREN: So, what we’ve done is, we’ve broken down recordings into their basic components for every dimension of melody, harmony, and rhythm, and form, and instrumentation, down into kind of the musical equivalent of primary colors.

DUBNER: The Pandora database includes more than a million songs, across every genre that you or I could name. Each song is broken down into as many as 480 musical attributes, almost like genetic code. Pandora’s organizing system is in fact called the “Music Genome Project.” You tell the Pandora website a song you like, and it rummages through that massive genetic database to make an educated guess about what you want to hear next. If you like that song, you press the thumbs-up button, and Pandora takes note.
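The matching idea is easy to sketch: represent each song as a vector of attribute ratings and rank the library by similarity to a seed song. The attributes, ratings, and song scores below are invented for illustration; Pandora’s real genome uses up to 480 hand-rated traits and its own proprietary scoring:

```python
# Toy sketch of attribute-vector song matching in the spirit of the Music
# Genome Project. The four attributes and all ratings are invented; Pandora's
# real system rates up to ~480 traits with its own proprietary scoring.
import math

def similarity(a, b):
    """Cosine similarity between two attribute vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# hypothetical attributes: [guitar riff, twang, rough vocals, punk energy]
library = {
    "Town Called Malice": [0.9, 0.2, 0.7, 0.8],
    "Smooth Jazz Ballad": [0.1, 0.0, 0.1, 0.0],
}
seed = [0.8, 0.3, 0.8, 0.9]  # "Train in Vain", rated on the same scale
best = max(library, key=lambda song: similarity(seed, library[song]))
print(best)  # Town Called Malice
```

The riff-driven track wins, which is the whole trick: no prediction of hits, just nearest neighbors in attribute space.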

WESTERGREN: I wouldn’t make the claim that Pandora can map your emotional persona. And I also don’t think frankly that Pandora can predict a hit because I think it is very hard, it’s a bit of a magic, that’s what makes music so fantastic. So, I think that we know our limitations, but within those limitations I think that we make it much, much more likely that you’re going to find that song that just really touches you.

DUBNER: So Tim, you were good enough to set up a station for me here. It’s called “Train in Vain Radio.” So the song we gave you was “Train in Vain.” So let me open up my radio station here and I’ll hit play and see what you got for me.

[MUSIC PLAYS]

DUBNER: Oh yeah. Yeah, I like them, that’s The Jam. I like “Town Called Malice,” so I’m going to give it a thumbs up on my little window here. I think there are a couple more songs in my station here.

[MUSIC PLAYS]

DUBNER: “Television” by Tom Verlaine, he was always too cool for me. I can see why you would think that I would like them, and I appreciate your effort, Mr. Pandora. How about you, were you a “Television” fan?

WESTERGREN: Yeah, yeah. And you know, one thing of course is that the songs are all rooted in guitar riffs.

DUBNER: Yep.

WESTERGREN: There’s a repetitive motif played on the guitar. And a similar sound and they’ve got a little twang– and they’re played kind of rambly, a little bit rough, there’s a sort of punk element in there. The vocals have over twenty attributes just for the voice. In this case these are pretty unpolished vocal deliveries.

DUBNER: I got to tell you that even though when this song came up, and I’ve heard this song a few times, and I told you I didn’t like Television very much, this song, I’m kind of digging it now.

WESTERGREN: See, there you go, that’s exactly what we’re trying to do.

DUBNER: So, it’s a really great thing to do, but it’s not really predicting the future the way most people think of it as predicting the future, is it?

WESTERGREN: Well, I certainly wouldn’t have put our mission in the same category as predicting the economy, or, you know, geopolitical futures. But you know, the average American listens to 17 hours of music a week. So, they spend a lot of time doing it, and I think that if we can make that a more enjoyable experience and more personalized, I think maybe we’ll make some kind of meaningful contribution to culture.

DUBNER: So Pandora does a pretty good job of predicting the music you might want to hear, based on what you already know you like. But again, look how much effort that takes — 480 musical attributes! And it’s not really predicting the future, is it? All Pandora does is break down the confirmed musical preferences of one person today and come up with some more music that’ll fulfill that same person’s preferences tomorrow. If we really want to know the future, we probably need to get much more ambitious. We probably need a whole new model. Like, how about prediction markets?

Robin HANSON: A prediction market is basically like a betting market or a speculative market, like orange juice futures or stock markets, things like that. The mechanics are that there’s an asset of some sort that pays off if something’s true, like whether a person wins the presidency or a team wins a sporting contest. And people trade that asset, and the price of that asset becomes then a forecast of whether that claim is likely to be true.

DUBNER: That’s Robin Hanson, an economics professor at George Mason University and an admitted advocate of prediction markets. As Hanson sees it, a prediction market is far more reliable than other forecasting methods because it addresses the pesky incentive problems of the old-time prediction industry.

HANSON: So a prediction market gives people an incentive, a clear personal incentive to be right and not wrong. Equally important, it gives people an incentive to shut up when they don’t know, which is often a problem with many of our other institutions. So if you as a reporter call up almost any academic and ask them vaguely related questions, they’ll typically try to answer them, just because they want to be heard. But in a prediction market most people don’t speak up. Every one of your listeners today had the right to go speak up on orange juice futures yesterday. Every one of you could have gone and said, orange juice futures forecasts are too low or too high, and almost no one did. Why? Because most of you don’t think you know. And that’s just the way we want it. So in most of these prediction markets what we want is the few people who know the best to speak up and everybody else to shut up.

DUBNER: Prediction markets are flourishing. Some of them are private — a multinational firm might set up an internal market to try to forecast when a big project will be done. And there are for-profit prediction markets like InTrade, based in Dublin, where you can place a bet on, say, whether any country that currently uses the Euro will drop the Euro by the end of the year. (As I speak, that bet has a 15% chance on InTrade.) Here’s another InTrade bet: whether there’ll be a successful WMD terrorist attack anywhere in the world by the end of 2013. (That’s got a 28% chance.) Now that’s starting to sound a little edgy, no? Betting on terrorism? Robin Hanson himself has a little experience in this area, on a U.S. government project he worked on.

HANSON: All right, so back in 2000, DARPA, the Defense Advanced Research Projects Agency, had heard about prediction markets, and they decided to fund a research project. And they basically said, listen, we’ve heard this is useful for other things, we’d like you to show us that this can be useful for the kind of topics we are interested in. Our project was going to be forecasting geopolitical trends in the Middle East. We were going to show that prediction markets could tell you about economic growth, about riots, about perhaps wars, about changes of heads of state, and how these things would interact with each other.

DUBNER: In 2003, just as the project was about to go live, the press heard about it.

HANSON: On Monday morning two senators had a press conference where they declared that DARPA and the military were going to have a betting market on terrorism. And so, there was a sudden burst of media coverage, and by the very next morning the head of the military basically declared before the Senate that this project was dead, and there was nothing more to worry about.

DUBNER: What do you think you — we collectively, you, in particular — would know now about that part of the world, let’s say, if this market had been allowed to take root?

HANSON: Well, I think we would have gotten much earlier warning about the revolutions we just had, if we would have had participants from the Middle East forecasting those markets. Not only would we get advance warning about which things might happen, but also about how our actions could affect those. So, for example, the United States just came in on the side of the Libyan rebels, to support the Libyan rebels against the Qaddafi regime. What’s the chance that will actually help the situation, as opposed to make it worse?

DUBNER: But give me an example of what you consider among the hardest problems that a prediction market could potentially help solve?

HANSON: Who should — not only who should we elect for president but whether we should go to war here or whether we should begin this initiative? Or should we approve this reform bill for medicine, etc.

DUBNER: So that sounds very logical, very appealing. How realistic is it?

HANSON: Well, it depends on there being a set of customers who want this product. So, you know, if prediction markets have an Achilles heel, it’s certainly the possibility that people don’t really want accurate forecasts.

DUBNER: Prediction markets put a price on accountability. If you’re wrong, you pay, simple as that. Just like the proposed law against the witches in Romania. Maybe that’s what we need more of. Here’s Steve Levitt again:

LEVITT: When there are big rewards to people who make predictions and get them right, and there are zero punishments for people who make bad predictions because they’re immediately forgotten, then economists would predict that’s a recipe for getting people to make predictions all the time.

DUBNER: Because the incentives are all encouraging you to make predictions.

LEVITT: Absolutely.

DUBNER: If you get it right there’s an upside, and if you get it wrong there’s almost no downside.

LEVITT: Right, if the flipside were that if I make a false prediction I’m immediately sent to prison for a one-year term, there would be almost no prediction.

DUBNER: And all those football pundits and political pundits and financial pundits wouldn’t be able to wriggle out of their bad calls — saying “My idea was right, but my timing was wrong.” Maybe that’s how everybody does it. That big storm the weatherman called but never showed up? “Oh, it happened all right,” he says, “but two states over.” Or how about those predictions for the End of the World — the Apocalypse, the Rapture, all that? “Well,” they say, “we prayed so hard that God decided to spare us.”

Remember back in May, when an 89-year-old preacher named Harold Camping declared that the Earth would be destroyed at 5:59 p.m. on a Saturday, and only the true believers would survive? I remember it very well because my 10-year-old son was petrified. I tried telling him that Camping was a kook — that anybody can say pretty much anything they want about the future. It didn’t help; he couldn’t get to sleep at night.

And then the 21st came and went and he was psyched. “I knew it all along, Dad,” he said.

Then I asked him what he thought should happen to Harold Camping, the false Doomsday prophet. “Oh, that’s easy,” he said. “Off with his head!”

My son is not a bloodthirsty type. But he’s not a turkey either.

Should Bad Predictions Be Punished? (Freakonomics.com)

SUZIE LECHTENBERG

08/09/2011 | 8:33 pm

Government corn predictions are based on the work of people like Phil Friedrichs, gathering data in a corn field in Hiawatha, Kansas. (Photo: Stephen Koranda)

What do Wall Street forecasters and Romanian witches have in common? They usually get away, scot-free, with making bad predictions. Our world is awash in poor prediction — but for some reason, we can’t stop, even though accuracy rates often barely beat a coin toss.

But then there’s the U.S. Department of Agriculture’s crop forecasting. Predictions covering a big crop like corn (U.S. farmers have planted the second largest crop since WWII this year) usually fall within five percent of the actual yield. So how do they do it? Every year, the U.S.D.A. sends thousands of enumerators into cornfields across the country where they inspect the plants, the conditions, and even “animal loss.”

This week on Marketplace, Stephen J. Dubner and Kai Ryssdal talk about the supply and demand of predictions. You’ll hear from Joseph Prusacki, the head of the U.S.D.A.’s Statistics Division, who’s gearing up for his first major crop report of 2011 (the street is already “sweating” it); Phil Friedrichs, who collects cornfield data for the USDA; and our trusted economist and Freakonomics co-author Steven Levitt.

We’ll also hear from journalist Vlad Mixich in Bucharest, who tells us why those Romanian witches might not be getting away with bad fortune telling for much longer.

An Algorithm that Can Predict Weather a Year in Advance (Freakonomics.com)

MATTHEW PHILIPS

09/27/2011 | 3:51 pm

In our latest podcast, “The Folly of Prediction,” we poke fun at the whole notion of forecasting. The basic gist is: whether it’s Romanian witches or Wall Street quant wizards, though we love to predict things — we’re generally terrible at it. (You can download/subscribe at iTunes, get the RSS feed, or read the transcript here.)

But there is one emerging tool that’s greatly enhancing our ability to predict: algorithms. Toward the end of the podcast, Dubner talks to Tim Westergren, a co-founder of Pandora Radio, about how the company’s algorithm is able to predict what kind of music people want to hear, by breaking songs down to their basic components. We’ve written a lot about algorithms, and the potential they have to vastly change our lives through customization, and perhaps satisfy our demand for predictions with some robust results.

One of the first things that comes to mind when people hear the word forecasting is the weather. Over the last few decades, we’ve gotten much better at predicting the weather. But what if through algorithms, we could extend our range of accuracy, and say, predict the weather up to a year in advance? That’d be pretty cool, right? And probably worth a bit of money too.

That’s essentially what the folks at a small company called Weather Trends International are doing. The private firm, based in Bethlehem, PA, uses technology first developed in the early 1990s to project temperature, precipitation and snowfall trends up to a year ahead, all around the world, with more than 80% accuracy. Translation: they gather up tons and tons of data, literally as much historical information on weather around the world as is out there, and then cram it into some 5.5 million lines of proprietary computer code (their algorithm) to spit out weather forecasts up to a year in advance. This is fairly different from what most meteorologists do by modeling the atmosphere. “Only about 15% of what we do is traditional forecast meteorology,” says CEO Bill Kirk, a former U.S. Air Force Captain with a degree from Rutgers in meteorology. Kirk began working on the WTI algorithm while in the Air Force.
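At its core, the historical-data approach reduces to what meteorologists call climatology: forecast a date by averaging what that date has done in past years. This toy sketch, with invented temperatures, shows only that baseline idea; WTI’s actual 5.5 million lines of code obviously do far more:

```python
# Minimal "climatology" baseline: forecast a date's temperature as the average
# of that calendar date across prior years. The city and temperatures are
# invented; this illustrates the historical-average idea only.
from statistics import mean

# hypothetical May 15 high temperatures (F) for one city, five past years
history = {2006: 71, 2007: 68, 2008: 74, 2009: 69, 2010: 73}

def climatology_forecast(history):
    return mean(history.values())

print(climatology_forecast(history))  # 71
```

This is also the fallback the USDA uses when weather is too far out to forecast: not a prediction of next May, just the long-run average of past Mays.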

Since launching in 2003, WTI has carved out a nice business for itself by marketing weather predictions to a range of clients, from commercial retailers and manufacturers (Wal-Mart, Target, Anheuser-Busch, Johnson & Johnson), to financial services firms and commodity traders, all of whom depend on the weather. Consumption of beer, for example, varies greatly with the temperature. “For every 1 degree hotter it is, Anheuser-Busch sells 1 percent more product,” says Kirk. And since beer is often made and bottled months in advance, the sooner they can know how hot it will be in May, the sooner they can plan accordingly. Unlike a lot of professional predictors, WTI’s business model has a built-in incentive structure: “Our clients are making multi-million dollar decisions based on our forecasts. If we’re not right, they’re not coming back.”

Though a trained meteorologist, Kirk says that over the last several years, he’s learned a lot about what really drives weather. He talks at length about the phenomenon known as Pacific decadal oscillation, which holds that the Pacific Ocean cycles through periods of warm and cold temperatures lasting about 30 years each. From 1976 to roughly 2006, the Pacific was in a warm phase, but is now cooling. Kirk believes that it’s this change that’s behind much of the bizarre weather we’ve seen over the last few years, from record snowfall and tornado activity, to droughts in the South, to floods in Australia. “The PDO cycles used to be a footnote in climate reports,” says Kirk. “Now we see them as playing a prominent role in determining weather patterns.”

Kirk is now trying to market his long-range forecasting to the private sector with a new website, Weathertrends360, as well as a new app. They both allow you to get a day-by-day forecast all the way through August 2012. Here’s his forecast for New York City over the next two months:

Just for kicks, I’ll check in from time to time to see how accurate the WTI forecasts end up being.

Freakonomics Poll: When It Comes to Predictions, Whom Do You Trust? (Freakonomics.com)

FREAKONOMICS

09/16/2011 | 11:27 am

Our latest Freakonomics Radio podcast, “The Folly of Prediction,” is built around the premise that humans love to predict the future, but are generally terrible at it. (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript here.)

There are a host of professions built around predicting some future outcome: from predicting the score of a sports match, to forecasting the weather for the weekend, to being able to tell what the stock market is going to do tomorrow. But is anyone actually good at it?

From your experience, which experts do you trust for predictions?

  • None of the Above (39%, 447 Votes)
  • Meteorologists (37%, 414 Votes)
  • Economists (14%, 158 Votes)
  • Sports Experts (9%, 98 Votes)
  • Political Pundits (1%, 16 Votes)
  • Stock Market Analysts (1%, 10 Votes)

Total Voters: 1,132