Tag archive: Perception

Our brains exist in a state of “controlled hallucination” (MIT Technology Review)

technologyreview.com

Matthew Hutson – August 25, 2021

Three new books lay bare the weirdness of how our brains process the world around us.

Recall “the dress,” the photograph that went viral in 2015 because some viewers saw it as blue and black while others swore it was white and gold. Eventually, vision scientists figured out what was happening. It wasn’t our computer screens or our eyes. It was the mental calculations that brains make when we see. Some people unconsciously inferred that the dress was in direct light and mentally subtracted yellow from the image, so they saw blue and black stripes. Others saw it as being in shadow, where bluish light dominates. Their brains mentally subtracted blue from the image, and came up with a white and gold dress.

Not only does thinking filter reality; it constructs it, inferring an outside world from ambiguous input. In Being You, Anil Seth, a neuroscientist at the University of Sussex, relates his explanation for how the “inner universe of subjective experience relates to, and can be explained in terms of, biological and physical processes unfolding in brains and bodies.” He contends that “experiences of being you, or of being me, emerge from the way the brain predicts and controls the internal state of the body.” 

Prediction has come into vogue in academic circles in recent years. Seth and the philosopher Andy Clark, a colleague at Sussex, refer to predictions made by the brain as “controlled hallucinations.” The idea is that the brain is always constructing models of the world to explain and predict incoming information; it updates these models when prediction and the experience we get from our sensory inputs diverge. 

“Chairs aren’t red,” Seth writes, “just as they aren’t ugly or old-fashioned or avant-garde … When I look at a red chair, the redness I experience depends both on properties of the chair and on properties of my brain. It corresponds to the content of a set of perceptual predictions about the ways in which a specific kind of surface reflects light.” 

Seth is not particularly interested in redness, or even in color more generally. Rather his larger claim is that this same process applies to all of perception: “The entirety of perceptual experience is a neuronal fantasy that remains yoked to the world through a continuous making and remaking of perceptual best guesses, of controlled hallucinations. You could even say that we’re all hallucinating all the time. It’s just that when we agree about our hallucinations, that’s what we call reality.”

Cognitive scientists often rely on atypical examples to gain understanding of what’s really happening. Seth takes the reader through a fun litany of optical illusions and demonstrations, some quite familiar and others less so. Squares that are in fact the same shade appear to be different; spirals printed on paper appear to spontaneously rotate; an obscure image turns out to be a woman kissing a horse; a face shows up in a bathroom sink. To re-create the mind’s psychedelic powers in silicon, he and his colleagues built an artificial-intelligence-powered virtual-reality setup that produces a Hunter Thompson–esque menagerie of animal parts emerging piecemeal from other objects in a square on the Sussex University campus. This series of examples, in Seth’s telling, “chips away at the beguiling but unhelpful intuition that consciousness is one thing—one big scary mystery in search of one big scary solution.” Seth’s perspective might be unsettling to those who prefer to believe that things are as they seem to be: “Experiences of free will are perceptions. The flow of time is a perception.”

Seth is on comparatively solid ground when he describes how the brain shapes experience, what philosophers call the “easy” problems of consciousness. They’re easy only in comparison to the “hard” problem: why subjective experience exists at all as a feature of the universe. Here he treads awkwardly, introducing the “real” problem, which is to “explain, predict, and control the phenomenological properties of conscious experience.” It’s not clear how the real problem differs from the easy problems, but somehow, he says, tackling it will get us some way toward resolving the hard problem. Now that would be a neat trick.

Where Seth relates, for the most part, the experiences of people with typical brains wrestling with atypical stimuli, in Coming to Our Senses, Susan Barry, an emeritus professor of neurobiology at Mount Holyoke College, tells the stories of two people who acquired new senses later in life than is usual. Liam McCoy, who had been nearly blind since he was an infant, was able to see almost clearly after a series of operations when he was 15 years old. Zohra Damji was profoundly deaf until she was given a cochlear implant at the unusually late age of 12. As Barry explains, Damji’s surgeon “told her aunt that, had he known the length and degree of Zohra’s deafness, he would not have performed the operation.” Barry’s compassionate, nuanced, and observant exposition is informed by her own experience:

At age forty-eight, I experienced a dramatic improvement in my vision, a change that repeatedly brought me moments of childlike glee. Cross-eyed from early infancy, I had seen the world primarily through one eye. Then, in mid-life, I learned, through a program of vision therapy, to use my eyes together. With each glance, everything I saw took on a new look. I could see the volume and 3D shape of the empty space between things. Tree branches reached out toward me; light fixtures floated. A visit to the produce section of the supermarket, with all its colors and 3D shapes, could send me into a sort of ecstasy. 

Barry was overwhelmed with joy at her new capacities, which she describes as “seeing in a new way.” She takes pains to point out how different this is from “seeing for the first time.” A person who has grown up with eyesight can grasp a scene in a single glance. “But where we perceive a three-dimensional landscape full of objects and people, a newly sighted adult sees a hodgepodge of lines and patches of colors appearing on one flat plane.” As McCoy described to Barry his experience of walking up and down stairs:

The upstairs are large alternating bars of light and dark and the downstairs are a series of small lines. My main focus is to balance and step IN BETWEEN lines, never on one … Of course going downstairs you step in between every line but upstairs you skip every other bar. All the while, when I move, the stairs are skewing and changing.

Even a sidewalk was tricky, at first, to navigate. He had to judge whether a line “indicated the junction between flat sidewalk blocks, a crack in the cement, the outline of a stick, a shadow cast by an upright pole, or the presence of a sidewalk step,” Barry explains. “Should he step up, down, or over the line, or should he ignore it entirely?” As McCoy says, the complexity of his perceptual confusion probably cannot be fully explained in terms that sighted people are used to.

The same, of course, is true of hearing. Raw audio can be hard to untangle. Barry describes her own ability to listen to the radio while working, effortlessly distinguishing the background sounds in the room from her own typing and from the flute and violin music coming over the radio. “Like object recognition, sound recognition depends upon communication between lower and higher sensory areas in the brain … This neural attention to frequency helps with sound source recognition. Drop a spoon on a tiled kitchen floor, and you know immediately whether the spoon is metal or wood by the high- or low-frequency sound waves it produces upon impact.” Most people acquire such capacities in infancy. Damji didn’t. She would often ask others what she was hearing, but had an easier time learning to distinguish sounds that she made herself. She was surprised by how noisy eating potato chips was, telling Barry: “To me, potato chips were always such a delicate thing, the way they were so lightweight, and so fragile that you could break them easily, and I expected them to be soft-sounding. But the amount of noise they make when you crunch them was something out of place. So loud.” 

As Barry recounts, at first Damji was frightened by all sounds, “because they were meaningless.” But as she grew accustomed to her new capabilities, Damji found that “a sound is not a noise anymore but more like a story or an event.” The sound of laughter came to her as a complete surprise, and she told Barry it was her favorite. As Barry writes, “Although we may be hardly conscious of background sounds, we are also dependent upon them for our emotional well-being.” One strength of the book is in the depth of her connection with both McCoy and Damji. She spent years speaking with them and corresponding as they progressed through their careers: McCoy is now an ophthalmology researcher at Washington University in St. Louis, while Damji is a doctor. From the details of how they learned to see and hear, Barry concludes, convincingly, that “since the world and everything in it is constantly changing, it’s surprising that we can recognize anything at all.”

In What Makes Us Smart, Samuel Gershman, a psychology professor at Harvard, says that there are “two fundamental principles governing the organization of human intelligence.” Gershman’s book is not particularly accessible; it lacks connective tissue and is peppered with equations that are incompletely explained. He writes that intelligence is governed by “inductive bias,” meaning we prefer certain hypotheses before making observations, and “approximation bias,” which means we take mental shortcuts when faced with limited resources. Gershman uses these ideas to explain everything from visual illusions to conspiracy theories to the development of language, asserting that what looks dumb is often “smart.”

“The brain is evolution’s solution to the twin problems of limited data and limited computation,” he writes. 

He portrays the mind as a raucous committee of modules that somehow helps us fumble our way through the day. “Our mind consists of multiple systems for learning and decision making that only exchange limited amounts of information with one another,” he writes. If he’s correct, it’s impossible for even the most introspective and insightful among us to fully grasp what’s going on inside our own head. As Damji wrote in a letter to Barry:

When I had no choice but to learn Swahili in medical school in order to be able to talk to the patients—that is when I realized how much potential we have—especially when we are pushed out of our comfort zone. The brain learns it somehow.

Matthew Hutson is a contributing writer at The New Yorker and a freelance science and tech writer.


Book Review: Why Science Denialism Persists (Undark)


Two new books explore what motivates people to reject science — and why it’s so hard to shake deep-seated beliefs.

By Elizabeth Svoboda – 05.22.2020

To hear some experts tell it, science denial is mostly a contemporary phenomenon, with climate change deniers and vaccine skeptics at the vanguard. Yet the story of Galileo Galilei reveals just how far back denial’s lineage stretches.

BOOK REVIEW “Galileo and the Science Deniers,” by Mario Livio (Simon & Schuster, 304 pages).

Years of astronomical sightings and calculations had convinced Galileo that the Earth, rather than sitting at the center of things, revolved around a larger body, the sun. But when he laid out his findings in widely shared texts, as astrophysicist Mario Livio writes in “Galileo and the Science Deniers,” the ossified Catholic Church leadership — heavily invested in older Earth-centric theories — aimed its ire in his direction.

Rather than revise their own maps of reality to include his discoveries, clerics labeled him a heretic and banned his writings. He spent the last years of his life under house arrest, hemmed in by his own insistence on the expansiveness of the cosmos.

Nearly 400 years later, the legacy of denial remains intact in some respects. Scientists who publish research about climate change or the safety of genetically modified crops still encounter the same kind of pushback from deniers that Galileo did. Yet denialism has also sprouted some distinctly modern features: As Alan Levinovitz points out in “Natural: How Faith in Nature’s Goodness Leads to Harmful Fads, Unjust Laws, and Flawed Science,” sometimes we ourselves can become unwitting purveyors of denial, falling prey to flawed or false beliefs we may not realize we’re holding.

Levinovitz passionately protests the common assumption that natural things are inherently better than unnatural ones. Not only do people reflexively conclude that organic foods are healthier; many choose “natural” or “alternative” methods of cancer treatment over proven chemotherapy regimens. Medication-free childbirth, meanwhile, is now considered the gold standard in many societies, despite mixed evidence of its health benefits for mothers and babies.

BOOK REVIEW “Natural: How Faith in Nature’s Goodness Leads to Harmful Fads, Unjust Laws, and Flawed Science,” by Alan Levinovitz (Beacon Press, 264 pages).

“What someone calls ‘natural’ may be good,” writes Levinovitz, a religion professor at James Madison University, “but the association is by no means necessary, or even likely.” Weaving real-life examples with vivid retellings of ancient myths about nature’s power, he demonstrates that our pro-natural bias is so pervasive that we often lose the ability to see it — or to admit the legitimacy of science that contradicts it.

From this perspective, science denial starts to look like a stunted outgrowth of what we typically consider common sense. In Galileo’s time, people thought it perfectly sensible that the planet they inhabited was at the center of everything. Today, it might seem equally sensible that it’s always better to choose natural products over artificial ones, or that a plant burger ingredient called “soy leghemoglobin” is suspect because it’s genetically engineered and can’t be sourced in the wild. Yet in these cases, what we think of as common sense turns out to be humbug.

In exploring the past and present of anti-science bias, Livio and Levinovitz show how deniers’ basic toolbox has not changed much through the centuries. Practitioners marshal arguments that appeal to our tendency to think in dichotomies: wrong or right, saved or damned, pure or tainted. Food is either nourishing manna from the earth or processed, artificial junk. The Catholic Church touted its own supreme authority while casting Galileo as an unregenerate apostate.

In the realm of denialism, Levinovitz writes, “simplicity and homogeneity take precedence over diversity, complexity, and change. Righteous laws and rituals are universal. Disobedience is sacrilege.”

The very language of pro-nature, anti-science arguments, Levinovitz argues, is structured to play up this us-versus-them credo. Monikers like Frankenfood — often used to describe genetically modified (GM) crops — frame the entire GM food industry as monstrous, a deviation from the supposed order of things. And in some circles, he writes, the word “unnatural” has come to be almost a synonym for “moral deficiency.” Not only is such black-and-white rhetoric seductive, it can give deniers the heady sense that they occupy the moral high ground.

Both pro-natural bias and the Church’s crusade against Galileo reflect the human penchant to fit new information into an existing framework. Rather than scrapping or changing that framework, we try to jerry-rig it to make it function. Some of the jerry-rigging examples the authors describe are more toxic than others: Opting for so-called natural foods despite dubious science on their benefits, for instance, is less harmful than denying evidence of a human-caused climate crisis.

What’s more, many people actually tend to cling harder to their beliefs in the face of contradictory evidence. Studies confirm that facts and reality aren’t likely to sway most people’s pre-existing views. This is as true now as it was at the close of the Renaissance, as shown by some extremists’ stubborn denial that the Covid-19 virus is dangerous.


In one of his book’s most compelling chapters, Livio takes us inside a panel of theologians that convened in 1616 to rule on whether the sun was at the center of things. None of Galileo’s incisive arguments swayed their thinking one iota. “This proposition is foolish and absurd in philosophy,” the theologians wrote, “and formally heretical, since it explicitly contradicts in many places the sense of Holy Scripture.” Cardinal Bellarmino warned Galileo that if he did not renounce his heliocentric views, he could be thrown into prison.

Galileo’s discoveries threatened to topple a superstructure that the Church had spent hundreds of years buttressing. In making their case against him, his critics liked to cite a passage from Psalm 93: “The world also is established that it cannot be moved.”

Galileo refused to cave. In his 1632 book, “Dialogue Concerning the Two Chief World Systems,” he did give the views of Pope Urban VIII an airing: He repeated Urban’s statement that no human could ever hope to decode the workings of the universe. But Livio slyly points out that Galileo put these words in the mouth of a ridiculous character named Simplicio. It was a slight Urban would not forgive. “May God forgive Signor Galilei,” he intoned, “for having meddled with these subjects.”

At the close of his 1633 Inquisition trial, Galileo was forced to declare that he abandoned any belief that the Earth revolved around the sun: “I abjure, curse, and detest the above-mentioned errors and heresies.” He swore that he would never again say “anything which might cause a similar suspicion about me.” Yet as he left the courtroom, legend has it, he muttered to himself “E pur si muove” (And yet it moves).

In the face of science denial, Livio observes, people have taken up “And yet it moves” as a rallying cry: a reminder that no matter how strong our prejudices or presuppositions, the facts always remain the same. But in today’s “post-truth era,” as political theorist John Keane calls it, with little agreement on what defines a reliable source, even the idea of an inescapable what is seems to have receded from view.

Levinovitz’s own evolution in writing “Natural” reveals how hard it can be to elevate facts above all, even for avowed anti-deniers. When he began his research, he picked off instances of pro-natural bias as if they were clay pigeons, confident in the rigor of his approach. “Confronted with a false faith, I had resolved that it was wholly evil,” he reflects.

Yet he later concedes that a favoritism toward nature is logical in domains like sports, which celebrate the potential of the human body in its unaltered form. He also accepts one expert’s point that it makes sense to buy organic if the pesticides used are less dangerous to farm workers than conventional ones. By the end of the book, he finds himself in a more nuanced place: “The art of celebrating humanity and nature,” he concludes, depends on “having the courage to embrace paradox.” His quest to puncture the myth of the natural turns out to have been dogmatic in its own way.

In acknowledging this, Levinovitz hits on something important. When deniers take up arms, it’s tempting to follow their lead: to use science to build an open-and-shut case that strikes with the finality of a courtroom witness pointing out a killer.

But as Galileo knew — and as Levinovitz ultimately concedes — science, in its endlessly unspooling grandeur, tends to resist any conclusion that smacks of the absolute. “What only science can promise,” Livio writes, “is a continuous, midcourse self-correction, as additional experimental and observational evidence accumulates, and new theoretical ideas emerge.”

In their skepticism of pat answers, these books bolster the case that science’s strength is in its flexibility — its willingness to leave room for iteration, for correction, for innovation. Science is an imperfect vehicle, as any truth-seeking discipline must be. And yet, as Galileo would have noted, it moves.

Elizabeth Svoboda is a science writer based in San Jose, California. Her most recent book for children is “The Life Heroic.”


How big is the novel coronavirus pandemic? (Estadão)

estadao.com.br

With more than 300,000 confirmed deaths worldwide, spread across every continent, covid-19 is already deadlier than natural disasters, terrorist attacks, and wars

Renato Vasconcelos and Paulo Beraldo

May 15, 2020 | 5:00 a.m.

The novel coronavirus pandemic has already taken on the scale of a disaster. With more than 300,000 deaths confirmed as of Thursday the 14th, covid-19 has killed more people than the wars, natural disasters, and terrorist attacks that marked history. Despite the disease’s lethality, a great many people, including world leaders, continue to minimize or deny the pandemic – which continues to claim victims every day on every continent.

For João Malaia, a history professor at the Federal University of Santa Maria (UFSM), how much a tragic event impresses someone depends on factors such as its duration, how close the victims are, and one’s physical distance from the phenomenon itself. “Many deaths in a short period also tend to make a stronger impression. In a pandemic, the daily deaths gradually dilute the sense of tragedy, except for those who lose people close to them,” explains the researcher, who coordinates ‘Mais História, por favor!’, a research project on the Spanish flu in Brazil.

According to Malaia, the normalization of death in the speeches of authorities such as the president of the Republic ends up reinforcing a feeling of resignation in part of the population. Looking to the past, he sees similarities in how Brazil handled the Spanish flu. “The Brazilian government was heavily criticized at the time by parts of the press for taking too long to act, especially in Rio de Janeiro, then the federal capital, when cases were already known,” he says.

Aerial image shows the damage caused by the tsunami in the resort town of Phuket, Thailand, on December 26, 2004. (Reuters)

The worldwide death toll has already surpassed that of any natural disaster in recent history. The 2004 tsunami, which swept across countries along the Indian Ocean and is considered the deadliest in history, killed about 230,000 people. The picture is not much different at the regional and national level.
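All of the comparisons that follow reduce to two simple calculations: a ratio between death tolls, and, for recurring causes of death such as terrorism, the number of years needed at a historical average rate to reach the pandemic’s toll. Here is a minimal Python sketch of that arithmetic, using the article’s own mid-May 2020 snapshots as inputs; the Spanish covid-19 figure is an assumption back-derived from the article’s “1,586 years” claim, not a reported total.

```python
# Illustrative arithmetic behind the article's comparisons.
# All figures are the article's mid-May 2020 snapshots, not current totals.

def equivalents(covid_deaths: float, event_deaths: float) -> float:
    """How many repetitions of a past tragedy match a given covid-19 toll."""
    return covid_deaths / event_deaths

def years_to_match(covid_deaths: float, event_deaths: float, years_active: float) -> float:
    """Years a recurring cause of death would need, at its historical
    average rate, to match a given covid-19 toll (used below for ETA)."""
    return covid_deaths / (event_deaths / years_active)

# 2004 Indian Ocean tsunami (~230,000 dead) vs ~300,000 covid-19 deaths worldwide:
print(equivalents(300_000, 230_000))    # ~1.3 tsunamis

# ETA: 584 dead over 50 years vs Spain's covid-19 toll (assumed ~18,500 here,
# back-derived from the article's "1,586 years" figure):
print(years_to_match(18_500, 584, 50))  # ~1,584 years
```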


MORE CASUALTIES THAN IRAQI CIVILIANS

An American soldier watches the toppling of the statue of Saddam Hussein in central Baghdad, on April 9, 2003. (Goran Tomasevic/Reuters)

In Europe, the continent with the most deaths so far, the four countries hardest hit by the pandemic – the United Kingdom, Italy, France, and Spain – together count more victims than the total number of civilians killed in the last ten years of the Iraq War (2009-2019).


USA

WORSE THAN VIETNAM

Bodies of American soldiers killed in the Battle of Ia Drang Valley, the country’s first major defeat in Vietnam, on November 15, 1965. (Neil Sheehan/The New York Times)

In the United States, the number of victims of the novel coronavirus between February and May – less than 120 days – already exceeds the number of American soldiers killed in the Vietnam War (58,000), which lasted 20 years.

NEARLY 30 TIMES SEPTEMBER 11

A rescue worker pulls a man from one of the World Trade Center towers in New York, shortly after the attacks of September 11, 2001. (Shannon Stapleton/Reuters)

It would take more than 28 attacks on the scale of September 11, 2001, which destroyed the twin towers of the World Trade Center, to equal the covid-19 death toll in the US. New York State, the scene of that catastrophe, would need to witness more than 9 such attacks to match its own pandemic deaths.


MORE THAN 100 WARS

Relatives of Argentine soldiers killed in the Falklands War visit the island cemetery for the first time, on March 19, 1991. (Reuters)

The United Kingdom, which has overtaken Italy as the continent’s hardest-hit country, would have to fight more than 130 Falklands Wars to suffer as many losses as the coronavirus has caused. Counting the war’s total dead (British and Argentine), it would take more than 36 conflicts identical to the one fought in the South Atlantic.


A HIGHER TOLL THAN INDEPENDENCE

Painting depicting the Battle of San Martino. (Luigi Norfini)

The Second Italian War of Independence, begun in 1859, was a decisive episode in the process of unifying the country. An estimated 12,000-plus lives were lost in the conflict – less than half the number of Italy’s pandemic victims.


1,500 YEARS OF TERRORISM

Masked ETA guerrillas read a statement live on Spanish television on February 18, 2004. (Vincent West/Reuters)

In Spain, covid-19 victims number 30 times the dead from attacks carried out by Basque Homeland and Liberty (ETA). In 50 years of activity, the terrorist group’s actions killed 584 people. If it still existed and kept the same average lethality, ETA would need 1,586 years of terrorism to match the deaths caused by the pandemic.


ISIS ATTACKS IN PARIS

Emergency workers treat the wounded near the Bataclan concert hall on November 13, 2015. (Christian Hartmann/Reuters)

By comparison, France’s covid-19 dead correspond to roughly 300 terrorist attacks on the scale of the one at the Bataclan concert hall on November 13, 2015, one of the most notorious attacks by the jihadist group Islamic State (ISIS) up to that point.


BRAZIL AND SÃO PAULO

In Brazil’s case, the more than 13,000 dead make disasters like Brumadinho look small: it would take 52 accidents like the one in that Minas Gerais town to reach the same mortality. The same can be said of the Carandiru massacre; 122 such killings would be needed to match the country’s toll. And São Paulo would have to fight four Constitutionalist Revolutions to equal its losses.

CONSTITUTIONALIST REVOLUTION OF 1932

São Paulo soldiers fought the powerful enemy columns with precarious weaponry. Reproduction made on June 2, 2013, from originals published by the newspaper ‘O Estado de S. Paulo’ during its coverage of the 1932 Constitutionalist Revolution. (ARQUIVO/AE)
The bodies of inmates killed two days earlier are stored in rooms and corridors of the IML (Instituto Médico Legal). (EPITÁCIO PESSOA/ESTADÃO)
A fire department helicopter and civil defense agents work to recover the bodies of victims found in a bus carrying VALE employees, in the area where the company’s offices stood in Brumadinho. (WILTON JUNIOR/ESTADÃO)

IRAN’S THEOCRATIC REVOLUTION

Supporters of Ayatollah Khomeini display his image in Tehran, Iran, during the 1979 Islamic revolution. (REUTERS)

In Asia, where the pandemic began, mortality has also reached historic levels. Iran’s death toll is twice the number killed in the Theocratic Revolution that changed the country’s regime in 1979.


TERRACOTTA ARMY

Part of the Terracotta Army of Xi’an, China. (Ludovic Marin/AFP)

In China, the number of dead buried in the country is already equivalent to half the statues of the Terracotta Army interred in the tomb of Emperor Qin Shi Huang.



Acute stress may slow down the spread of fears (Science Daily)

Date: May 12, 2020

Source: University of Konstanz

Summary: Psychologists find that we are less likely to amplify fears in social exchange if we are stressed.

New psychology research from the University of Konstanz reveals that stress changes the way we deal with risky information — results that shed light on how stressful events, such as a global crisis, can influence how information and misinformation about health risks spreads in social networks.

“The global coronavirus crisis, and the pandemic of misinformation that has spread in its wake, underscores the importance of understanding how people process and share information about health risks under stressful times,” says Professor Wolfgang Gaissmaier, Professor in Social Psychology at the University of Konstanz, and senior author on the study. “Our results uncovered a complex web in which various strands of endocrine stress, subjective stress, risk perception, and the sharing of information are interwoven.”

The study, which appears in the journal Scientific Reports, brings together psychologists from the DFG Cluster of Excellence “Centre for the Advanced Study of Collective Behaviour” at the University of Konstanz: Gaissmaier, an expert in risk dynamics, and Professor Jens Pruessner, who studies the effects of stress on the brain. The study also includes Nathalie Popovic, first author on the study and a former graduate student at the University of Konstanz, Ulrike Bentele, also a Konstanz graduate student, and Mehdi Moussaïd from the Max Planck Institute for Human Development in Berlin.

In our hyper-connected world, information flows rapidly from person to person. The COVID-19 pandemic has demonstrated how risk information — such as information about dangers to our health — can spread through social networks and influence people’s perception of the threat, with severe repercussions on public health efforts. Whether stress influences this process, however, had never been studied.

“Since we are often under acute stress even in normal times and particularly so during the current health pandemic, it seems highly relevant not only to understand how sober minds process this kind of information and share it in their social networks, but also how stressed minds do,” says Pruessner, a Professor in Clinical Neuropsychology working at the Reichenau Centre of Psychiatry, which is also an academic teaching hospital of the University of Konstanz.

To do this, researchers had participants read articles about a controversial chemical substance, then report their risk perception of the substance before and after reading the articles, and say what information they would pass on to others. Just prior to this task, half of the group was exposed to acute social stress, which involved public speaking and mental arithmetic in front of an audience, while the other half completed a control task.

The results showed that experiencing a stressful event drastically changes how we process and share risk information. Stressed participants were less influenced by the articles and chose to share concerning information to a significantly smaller degree. Notably, this dampened amplification of risk was a direct function of elevated cortisol levels indicative of an endocrine-level stress response. In contrast, participants who reported subjective feelings of stress did show higher concern and more alarming risk communication.

“On the one hand, the endocrine stress reaction may thus contribute to underestimating risks when risk information is exchanged in social contexts, whereas feeling stressed may contribute to overestimating risks, and both effects can be harmful,” says Popovic. “Underestimating risks can increase incautious actions such as risky driving or practising unsafe sex. Overestimating risks can lead to unnecessary anxieties and dangerous behaviours, such as not getting vaccinated.”

By revealing the differential effects of stress on the social dynamics of risk perception, the Konstanz study shines light on the relevance of such work not only from an individual, but also from a policy perspective. “Coming back to the ongoing COVID-19 pandemic, it highlights that we do not only need to understand its virology and epidemiology, but also the psychological mechanisms that determine how we feel and think about the virus, and how we spread those feelings and thoughts in our social networks,” says Gaissmaier.

The Dunning-Kruger effect, or why the ignorant think they are experts (Universo Racionalista)

[The author’s irony seems to indicate that he did not understand his subject very well himself. There are inconsistent sentences, such as “the Dunning-Kruger effect is not a human flaw; it is simply a product of our subjective understanding of the world,” for example. RT]

By Julio Batista – Feb 20, 2020

Image via Pxhere.

Translated by Julio Batista
Original by Alexandru Micu at ZME Science

The Dunning-Kruger effect is a cognitive bias first described in the work of David Dunning and Justin Kruger, in their (now famous) 1999 study Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments.

The study grew out of the criminal case of a man named McArthur Wheeler who, in broad daylight on April 19, 1995, decided to rob two banks in Pittsburgh, in the United States. Wheeler carried a gun but no mask. Surveillance cameras caught him in the act, and police released his picture on the local news, receiving tips about his whereabouts almost immediately.

A graph showing the Dunning-Kruger effect. Image adapted from Wikimedia.

When they came to arrest him, Mr. Wheeler was visibly confused.

“But I was covered in juice,” he said, before the officers took him away.

There is no such thing as a “foolproof method”

At some point in his life, Wheeler had learned from someone that lemon juice could be used as “invisible ink”: if something was written on a piece of paper with lemon juice, you would see nothing, unless you heated the juice, which would make the scribbles visible. So, naturally, he covered his face in lemon juice and went off to rob a bank, confident that his identity would remain hidden from the cameras as long as he did not go near any source of heat.

Still, we should give the man some credit: Wheeler did not gamble blindly. He actually tested his theory by taking a selfie with a polaroid camera (there is a scientist inside all of us). For one reason or another, perhaps because the film was defective, we do not know exactly why, the camera produced a blank image.

The news went around the world, everyone had a good laugh, and Mr. Wheeler went to jail. Police concluded that he was neither insane nor on drugs; he genuinely believed his plan would work. “During his interaction with the police, he was incredulous about how his ignorance had failed him,” Anupum Pant wrote for Awesci.

David Dunning was working as a psychologist at Cornell University at the time, and the bizarre story caught his attention. With the help of Justin Kruger, one of his graduate students, he set out to understand how Mr. Wheeler could be so confident in a plan that was so clearly stupid. The theory they developed is that almost all of us rate our abilities in certain areas as above average, and that most people probably judge their own skills to be much better than they objectively are – an “illusion of confidence” that underpins the Dunning-Kruger effect.

We are all clueless

“Mind the gap” … between how you see yourself and how you really are. Image via Pxfuel.

“If you’re incompetent, you can’t know you’re incompetent,” Dunning wrote in his book Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself.

“The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.”

In the 1999 study (the first conducted on the topic), the pair asked Cornell students a series of questions about grammar, logic, and humor (used to measure the students’ actual abilities), and then asked each one to estimate the overall score they would achieve and how that score would compare with the scores of the other participants. They found that the lowest-scoring students consistently and substantially overestimated their own abilities: students in the bottom quartile (the lowest 25% by score) thought, on average, that they had outperformed two-thirds of the other students (that is, that they ranked in the top 33% by score).
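The study’s core comparison is between each participant’s actual percentile rank, computed from test scores, and the percentile they assigned themselves. A minimal sketch of that comparison in Python; the scores and self-estimates below are invented, and only the method mirrors the study’s design:

```python
# Hypothetical data; the Dunning-Kruger signature is the pattern, not the numbers.
scores = [12, 35, 41, 48, 55, 61, 70, 88]          # invented test scores
self_estimates = [66, 60, 58, 55, 62, 65, 70, 74]  # invented self-assigned percentiles

def percentile_rank(score: float, all_scores: list) -> float:
    """Percentage of participants scoring strictly below this score (0-100)."""
    below = sum(s < score for s in all_scores)
    return 100.0 * below / len(all_scores)

actual = [percentile_rank(s, scores) for s in scores]

# Low scorers overestimate their rank; top scorers underestimate it.
for act, est in sorted(zip(actual, self_estimates)):
    print(f"actual percentile {act:5.1f} | self-estimate {est:5.1f} | gap {est - act:+6.1f}")
```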

A related study by the same authors at a sport-shooting club showed similar results. Dunning and Kruger used a similar methodology, asking hobbyists questions about gun safety and having them estimate how well they had done on the test. Those who answered the fewest questions correctly also wildly overestimated their mastery of firearm knowledge.

The effect is not specific to technical skills; it touches every sphere of human existence alike. One study found that 80% of drivers rate themselves as above average – a claim that is impossible if “average” means the median, since at most half of all drivers can rank above it, and merely implausible if it means the mean, which only a skewed distribution can place below most drivers. We tend to assess our relative popularity the same way.
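A toy demonstration of why that corrected claim distinguishes mean from median, using an invented, skewed measure of driving skill (accidents caused, fewer is better):

```python
from statistics import mean, median

# Invented data: most drivers cause no accidents; one outlier drags the mean up.
accidents = [0, 0, 0, 0, 0, 0, 0, 1, 4, 15]

better_than_mean = sum(a < mean(accidents) for a in accidents)
better_than_median = sum(a < median(accidents) for a in accidents)

print(mean(accidents), median(accidents))  # 2.0 0.0
print(better_than_mean)                    # 8 -> 80% really are above the mean here
print(better_than_median)                  # 0 -> no one beats the median in this data;
                                           #      at most half of a group ever can
```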

Nor is the effect limited to people with low or nonexistent skills in a given subject: it operates in practically all of us. In their first study, Dunning and Kruger also found that students who scored in the top quartile (25%) routinely underestimated their own competence.

A fuller definition of the Dunning-Kruger effect would be that it is a bias in estimating our own ability that stems from our limited perspective. When we have a poor or nonexistent grasp of a topic, we literally know too little to understand how little we know. Those who do possess the knowledge or skills, by contrast, have a much better idea of where the people around them stand. But they also assume that if a task is clear and simple for them, it must be just as clear and simple for everyone else.

A person in the first group and a person in the second are equally likely to use their own experience as a baseline and to take it for granted that everyone else sits close to that baseline. Both labor under an “illusion of confidence” – in one case, confidence in themselves; in the other, confidence in everyone else.

But maybe we are not all equally clueless

To err is human. To persist confidently in error, however, is hilarious.

Dunning and Kruger did seem to find a way out of the effect they helped document. Although we all appear equally prone to deluding ourselves, there is an important difference between those who are confident but incapable and those who are capable but lack confidence: the way they handle and absorb feedback on their own behavior.

Mr. Wheeler did try to verify his theory. But when he looked at a blank polaroid of a photo he had just taken, a strong signal that something in his theory had gone wrong, he saw no reason to worry; the only explanation he would accept was that his plan worked. Later he received feedback from the police, and not even that could shake his certainty; he was “incredulous about how his ignorance had failed him” even when he had absolute confirmation, sitting in jail, that it had.

In their research, Dunning and Kruger found that good students predicted their performance on future exams more accurately when they received precise feedback about their current scores and their relative standing in the class. The worst-performing students did not adjust their expectations even after clear, repeated feedback that they were doing badly. They simply insisted that their assumptions were correct.

Joking aside, the Dunning-Kruger effect is not a human flaw; it is simply a product of our subjective understanding of the world. If anything, it serves as a caution against assuming we are always right, and it highlights the importance of keeping an open mind and a critical eye on our own abilities.

But if you worry about being incompetent, watch how feedback affects your view of your own work, knowledge, and skills, and how these compare with those of the people around you. If you really are incompetent, you will not change your mind, and the whole exercise will basically be a waste of time – but don’t worry: someone will tell you that you are incompetent.

And you won’t believe them.

Extreme weather: Is it all in your mind? (USA Today)

Thomas M. Kostigen, Special for USA TODAY – 9:53 a.m. EDT, October 17, 2015

Weather is not as objective an occurrence as it might seem. People’s perceptions of what makes weather extreme are influenced by where they live, their income, as well as their political views, a new study finds.

There is a difference in both seeing and believing in extreme weather events, according to the study in the journal Environmental Sociology.

“Odds were higher among younger, female, more educated, and Democratic respondents to perceive effects from extreme weather than older, male, less educated, and Republican respondents,” said the study’s author, Matthew Cutler of the University of New Hampshire.

There were other correlations, too. For example, people with lower incomes had higher perceptions of extreme weather than people who earned more. Those who live in more vulnerable areas, as might be expected, interpret the effects of weather differently when the costs to their homes and communities are highest.

The causes and frequency of extreme weather events are an under-explored area from a sociological perspective. A better understanding is important to building more resilient and adaptive communities. After all, why prepare or take safety precautions if you believe the weather isn’t going to be all that bad, or won’t turn extreme all that often?

The U.S. Climate Extremes Index, compiled by the National Oceanic and Atmospheric Administration (NOAA), shows a significant rise in extreme weather events since the 1970s, the most back-to-back years of extremes over the past decade since 1910, and all-time record-high levels clocked in 1998 and 2012.

“Some recent research has demonstrated linkages between objectively measured weather, or climate anomalies, and public concern or beliefs about climate change,” Cutler notes. “But the factors influencing perceptions of extreme or unusual weather events have received less attention.”

Indeed, there is a faction of the public that debates how much the climate is changing and which factors are responsible for such consequences as global warming.

Weather, on the other hand, is a different order of things: it is typically defined in the here and now or in the immediate future. It also is largely confined, because of its variability, to local or regional areas. Moreover, weather is something we usually experience directly.

Climate is a more abstract concept, typically defined as atmospheric conditions over a 30-year period.

When weather isn’t experiential, reports are relied upon to gauge extremes. This is when beliefs become more muddied.

“The patterns found in this research provide evidence that individuals experience extreme weather in the context of their social circumstances and thus perceive the impacts of extreme weather through the lens of cultural and social influences. In other words, it is not simply a matter of seeing to believe, but rather an emergent process of both seeing and believing — individuals experiencing extreme weather and interpreting the impacts against the backdrop of social and economic circumstances central to and surrounding their lives,” Cutler concludes.

Sophocles said that “what people believe prevails over the truth.” In the context of extreme weather, though, disbelief comes at a price: damage, injury, and death are often the result.

Too many times do we hear about people being unprepared for storms, ignoring officials’ warnings, failing to evacuate, or engaging in reckless behavior during weather extremes.

There is a need to draw a more complete picture of “weather prejudice,” as I’ll call it, in order to render more practical advice about preparing, surviving, and recovering from what is indisputable: extreme weather disasters to come.

Thomas M. Kostigen is the founder of TheClimateSurvivalist.com and a New York Times bestselling author and journalist. He is the National Geographic author of “The Extreme Weather Survival Guide: Understand, Prepare, Survive, Recover” and the NG Kids book, “Extreme Weather: Surviving Tornadoes, Tsunamis, Hailstorms, Thundersnow, Hurricanes and More!” Follow him @weathersurvival, or email kostigen@theclimatesurvivalist.com.

Local Weather Patterns Affect Beliefs About Global Warming (Science Daily)

People living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming. (Credit: © Rafael Ben-Ari / Fotolia)

ScienceDaily (July 25, 2012) — Local weather patterns temporarily influence people’s beliefs about evidence for global warming, according to research by political scientists at New York University and Temple University. Their study, which appears in the Journal of Politics, found that those living in places experiencing warmer-than-normal temperatures at the time they were surveyed were significantly more likely than others to say there is evidence for global warming.

“Global climate change is one of the most important public policy challenges of our time, but it is a complex issue with which Americans have little direct experience,” wrote the study’s co-authors, Patrick Egan of New York University and Megan Mullin of Temple University. “As they try to make sense of this difficult issue, many people use fluctuations in local temperature to reassess their beliefs about the existence of global warming.”

Their study examined five national surveys of American adults sponsored by the Pew Research Center: June, July, and August 2006, January 2007, and April 2008. In each survey, respondents were asked the following question: “From what you’ve read and heard, is there solid evidence that the average temperature on earth has been getting warmer over the past few decades, or not?” On average over the five surveys, 73 percent of respondents agreed that Earth is getting warmer.

Egan and Mullin wondered about variation in attitudes among the survey’s respondents, and hypothesized that local temperatures could influence perceptions. To measure the potential impact of temperature on individuals’ opinions, they looked at zip codes from respondents in the Pew surveys and matched weather data to each person surveyed at the time of each poll. They used local weather data to determine if the temperature in the location of each respondent was significantly higher or lower than normal for that area at that time of year.

Their results showed that an abnormal shift in local temperature is associated with a significant shift in beliefs about evidence for global warming. Specifically, for every three degrees Fahrenheit that local temperatures in the past week have risen above normal, Americans become one percentage point more likely to agree that there is “solid evidence” that Earth is getting warmer. The researchers found cooler-than-normal temperatures have similar effects on attitudes — but in the opposite direction.
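Read as a rule of thumb, the reported relationship is a simple linear shift around the surveys’ 73% baseline. A minimal sketch, assuming linearity holds across the range and using the cross-survey average as the baseline (the authors’ actual model controls for much more):

```python
# Back-of-the-envelope version of the reported effect: ~1 percentage point
# of added agreement per 3 degrees F of local temperature anomaly over the
# past week. The coefficients come from the article; the linear form and the
# 73% baseline are simplifying assumptions, not the authors' full model.

BASELINE_AGREE_PCT = 73.0      # average agreement across the five Pew surveys
PP_PER_DEGREE_F = 1.0 / 3.0    # percentage points per degree F above normal

def predicted_agreement(anomaly_f: float) -> float:
    """Predicted % agreeing there is 'solid evidence' the Earth is warming."""
    return BASELINE_AGREE_PCT + PP_PER_DEGREE_F * anomaly_f

print(predicted_agreement(+6.0))   # 75.0 -- a week 6 degrees F warmer than normal
print(predicted_agreement(-3.0))   # 72.0 -- cooler-than-normal works in reverse
```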

The study took into account other variables that may explain the results — such as existing political attitudes and geography — and found the results still held.

The researchers also wondered whether heat waves — prolonged periods of higher-than-normal temperatures — intensified this effect. To test this, they looked at respondents living in areas that had experienced at least seven days of temperatures 10°F or more above normal in the three weeks prior to the interview, and compared their views with those of respondents who experienced the same number of hot days but not as a sustained heat wave.

Their estimates showed that the effect of a heat wave on opinion is even greater, increasing the share of Americans believing in global warming by 5.0 to 5.9 percentage points.

However, Egan and Mullin found the effects of temperature changes to be short-lived — even in the wake of heat waves. Americans who had been interviewed after 12 or more days had elapsed since a heat wave were estimated to have attitudes that were no different than those who had not been exposed to a heat wave.

“Under typical circumstances, the effects of temperature fluctuations on opinion are swiftly wiped out by new weather patterns,” they wrote. “More sustained periods of unusual weather cause attitudes to change both to a greater extent and for a longer period of time. However, even these effects eventually decay, leaving no long-term impact of weather on public opinion.”

The findings make an important contribution to the political science research on the relationship between personal experience and opinion on a larger issue, which has long been studied with varying results.

“On issues such as crime, the economy, education, health care, public infrastructure, and taxation, large shares of the public are exposed to experiences that could logically be linked to attitude formation,” the researchers wrote. “But findings from research examining how these experiences affect opinion have been mixed. Although direct experience — whether it be as a victim of crime, a worker who has lost a job or health insurance, or a parent with children in public schools — can influence attitudes, the impact of these experiences tends to be weak or nonexistent after accounting for typical predictors such as party identification and liberal-conservative ideology.”

“Our research suggests that personal experience has substantial effects on political attitudes,” Egan and Mullin concluded. “Rich discoveries await those who can explore these questions in ways that permit clean identification of these effects.”

Egan is an assistant professor in the Wilf Family Department of Politics at NYU; Mullin is an associate professor in the Department of Political Science at Temple University.