Tag archive: Neurologia

How big science failed to unlock the mysteries of the human brain (MIT Technology Review)

technologyreview.com

Large, expensive efforts to map the brain started a decade ago but have largely fallen short. It’s a good reminder of just how complex this organ is.

Emily Mullin

August 25, 2021


In September 2011, a group of neuroscientists and nanoscientists gathered at a picturesque estate in the English countryside for a symposium meant to bring their two fields together. 

At the meeting, Columbia University neurobiologist Rafael Yuste and Harvard geneticist George Church made a not-so-modest proposal: to map the activity of the entire human brain at the level of individual neurons and detail how those cells form circuits. That knowledge could be harnessed to treat brain disorders like Alzheimer’s, autism, schizophrenia, depression, and traumatic brain injury. And it would help answer one of the great questions of science: How does the brain bring about consciousness? 

Yuste, Church, and their colleagues drafted a proposal that would later be published in the journal Neuron. Their ambition was extreme: “a large-scale, international public effort, the Brain Activity Map Project, aimed at reconstructing the full record of neural activity across complete neural circuits.” Like the Human Genome Project a decade earlier, they wrote, the brain project would lead to “entirely new industries and commercial ventures.” 

New technologies would be needed to achieve that goal, and that’s where the nanoscientists came in. At the time, researchers could record activity from just a few hundred neurons at once—but with around 86 billion neurons in the human brain, it was akin to “watching a TV one pixel at a time,” Yuste recalled in 2017. The researchers proposed tools to measure “every spike from every neuron” in an attempt to understand how the firing of these neurons produced complex thoughts. 

The audacious proposal intrigued the Obama administration and laid the foundation for the multi-year Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative, announced in April 2013. President Obama called it the “next great American project.” 

But it wasn’t the first audacious brain venture. In fact, a few years earlier, Henry Markram, a neuroscientist at the École Polytechnique Fédérale de Lausanne in Switzerland, had set an even loftier goal: to make a computer simulation of a living human brain. Markram wanted to build a fully digital, three-dimensional model at the resolution of the individual cell, tracing all of those cells’ many connections. “We can do it within 10 years,” he boasted during a 2009 TED talk.

In January 2013, a few months before the American project was announced, the EU awarded Markram $1.3 billion to build his brain model. The US and EU projects sparked similar large-scale research efforts in countries including Japan, Australia, Canada, China, South Korea, and Israel. A new era of neuroscience had begun. 

An impossible dream?

A decade later, the US project is winding down, and the EU project faces its deadline to build a digital brain. So how did it go? Have we begun to unravel the secrets of the human brain? Or have we spent a decade and billions of dollars chasing a vision that remains as elusive as ever? 

From the beginning, both projects had critics.

EU scientists worried about the costs of the Markram scheme and thought it would squeeze out other neuroscience research. And even at the original 2011 meeting in which Yuste and Church presented their ambitious vision, many of their colleagues argued it simply wasn’t possible to map the complex firings of billions of human neurons. Others said it was feasible but would cost too much money and generate more data than researchers would know what to do with. 

In a blistering article appearing in Scientific American in 2013, Partha Mitra, a neuroscientist at the Cold Spring Harbor Laboratory, warned against the “irrational exuberance” behind the Brain Activity Map and questioned whether its overall goal was meaningful. 

Even if it were possible to record all spikes from all neurons at once, he argued, a brain doesn’t exist in isolation: in order to properly connect the dots, you’d need to simultaneously record external stimuli that the brain is exposed to, as well as the behavior of the organism. And he reasoned that we need to understand the brain at a macroscopic level before trying to decode what the firings of individual neurons mean.  

Others had concerns about the impact of centralizing control over these fields. Cornelia Bargmann, a neuroscientist at Rockefeller University, worried that it would crowd out research spearheaded by individual investigators. (Bargmann was soon tapped to co-lead the BRAIN Initiative’s working group.)

While the US initiative sought input from scientists to guide its direction, the EU project was decidedly more top-down, with Markram at the helm. But as Noah Hutton documents in his 2020 film In Silico, Markram’s grand plans soon unraveled. As an undergraduate studying neuroscience, Hutton had been assigned to read Markram’s papers and was impressed by his proposal to simulate the human brain; when he started making documentary films, he decided to chronicle the effort. He soon realized, however, that the billion-dollar enterprise was characterized more by infighting and shifting goals than by breakthrough science.

In Silico shows Markram as a charismatic leader who needed to make bold claims about the future of neuroscience to attract the funding to carry out his particular vision. But the project was troubled from the outset by a major issue: there isn’t a single, agreed-upon theory of how the brain works, and not everyone in the field agreed that building a simulated brain was the best way to study it. It didn’t take long for those differences to arise in the EU project. 

In 2014, hundreds of experts across Europe penned a letter citing concerns about oversight, funding mechanisms, and transparency in the Human Brain Project. The scientists felt Markram’s aim was premature and too narrow and would exclude funding for researchers who sought other ways to study the brain. 

“What struck me was, if he was successful and turned it on and the simulated brain worked, what have you learned?” Terry Sejnowski, a computational neuroscientist at the Salk Institute who served on the advisory committee for the BRAIN Initiative, told me. “The simulation is just as complicated as the brain.” 

The Human Brain Project’s board of directors voted to change its organization and leadership in early 2015, replacing a three-member executive committee led by Markram with a 22-member governing board. Christoph Ebell, a Swiss entrepreneur with a background in science diplomacy, was appointed executive director. “When I took over, the project was at a crisis point,” he says. “People were openly wondering if the project was going to go forward.”

But a few years later he was out too, after a “strategic disagreement” with the project’s host institution. The project is now focused on providing a new computational research infrastructure to help neuroscientists store, process, and analyze large amounts of data—unsystematic data collection has been an issue for the field—and develop 3D brain atlases and software for creating simulations.

The US BRAIN Initiative, meanwhile, underwent its own changes. Early on, in 2014, responding to the concerns of scientists and acknowledging the limits of what was possible, it evolved into something more pragmatic, focusing on developing technologies to probe the brain. 

New day

Those changes have finally started to produce results—even if they weren’t the ones that the founders of each of the large brain projects had originally envisaged. 

Last year, the Human Brain Project released a 3D digital map that integrates different aspects of human brain organization at the millimeter and micrometer level. It’s essentially a Google Earth for the brain. 

And earlier this year Alipasha Vaziri, a neuroscientist funded by the BRAIN Initiative, and his team at Rockefeller University reported in a preprint paper that they’d simultaneously recorded the activity of more than a million neurons across the mouse cortex. It’s the largest recording of animal cortical activity yet made, if far from listening to all 86 billion neurons in the human brain as the original Brain Activity Map hoped.

The US effort has also shown some progress in its attempt to build new tools to study the brain. It has sped the development of optogenetics, an approach that uses light to control neurons, and its funding has led to new high-density silicon electrodes capable of recording from hundreds of neurons simultaneously. And it has arguably accelerated the development of single-cell sequencing. In September, researchers using these advances will publish a detailed classification of cell types in the mouse and human motor cortices—the biggest single output from the BRAIN Initiative to date.

While these are all important steps forward, though, they’re far from the initial grand ambitions. 

Lasting legacy

We are now heading into the last phase of these projects—the EU effort will conclude in 2023, while the US initiative is expected to have funding through 2026. What happens in these next years will determine just how much impact they’ll have on the field of neuroscience.

When I asked Ebell what he sees as the biggest accomplishment of the Human Brain Project, he didn’t name any one scientific achievement. Instead, he pointed to EBRAINS, a platform launched in April of this year to help neuroscientists work with neurological data, perform modeling, and simulate brain function. It offers researchers a wide range of data and connects many of the most advanced European lab facilities, supercomputing centers, clinics, and technology hubs in one system. 

“If you ask me ‘Are you happy with how it turned out?’ I would say yes,” Ebell said. “Has it led to the breakthroughs that some have expected in terms of gaining a completely new understanding of the brain? Perhaps not.” 

Katrin Amunts, a neuroscientist at the University of Düsseldorf, who has been the Human Brain Project’s scientific research director since 2016, says that while Markram’s dream of simulating the human brain hasn’t been realized yet, it is getting closer. “We will use the last three years to make such simulations happen,” she says. But it won’t be a big, single model—instead, several simulation approaches will be needed to understand the brain in all its complexity. 

Meanwhile, the BRAIN Initiative has provided more than 900 grants to researchers so far, totaling around $2 billion. The National Institutes of Health is projected to spend nearly $6 billion on the project by the time it concludes. 

For the final phase of the BRAIN Initiative, scientists will attempt to understand how brain circuits work by diagramming connected neurons. But claims for what can be achieved are far more restrained than in the project’s early days. The researchers now realize that understanding the brain will be an ongoing task—it’s not something that can be finalized by a project’s deadline, even if that project meets its specific goals.

“With a brand-new tool or a fabulous new microscope, you know when you’ve got it. If you’re talking about understanding how a piece of the brain works or how the brain actually does a task, it’s much more difficult to know what success is,” says Eve Marder, a neuroscientist at Brandeis University. “And success for one person would be just the beginning of the story for another person.” 

Yuste and his colleagues were right that new tools and techniques would be needed to study the brain in a more meaningful way. Now, scientists will have to figure out how to use them. But instead of answering the question of consciousness, developing these methods has, if anything, only opened up more questions about the brain—and shown just how complex it is. 

“I have to be honest,” says Yuste. “We had higher hopes.”

Emily Mullin is a freelance journalist based in Pittsburgh who focuses on biotechnology.

The Most Common Pain Relief Drug in The World Induces Risky Behaviour, Study Suggests (Science Alert)

sciencealert.com

Peter Dockrill

9 September 2020


One of the most consumed drugs in the US – and the most commonly taken analgesic worldwide – could be doing a lot more than simply taking the edge off your headache, new evidence suggests.

Acetaminophen, also known as paracetamol and sold widely under the brand names Tylenol and Panadol, also increases risk-taking, according to a new study that measured changes in people’s behaviour when under the influence of the common over-the-counter medication.

“Acetaminophen seems to make people feel less negative emotion when they consider risky activities – they just don’t feel as scared,” says neuroscientist Baldwin Way from The Ohio State University.

“With nearly 25 percent of the population in the US taking acetaminophen each week, reduced risk perceptions and increased risk-taking could have important effects on society.”

The findings add to a recent body of research suggesting that acetaminophen’s effects extend beyond pain reduction to various psychological processes: lowering people’s receptivity to hurt feelings, reducing empathy, and even blunting cognitive functions.

In a similar way, the new research suggests people’s affective ability to perceive and evaluate risks can be impaired when they take acetaminophen. While the effects might be slight, they’re definitely worth noting, given acetaminophen is the most common drug ingredient in America, found in over 600 different kinds of over-the-counter and prescription medicines.

In a series of experiments involving over 500 university students as participants, Way and his team measured how a single 1,000 mg dose of acetaminophen (the recommended maximum adult single dosage) randomly assigned to participants affected their risk-taking behaviour, compared against placebos randomly given to a control group.

In each of the experiments, participants had to pump up an uninflated balloon on a computer screen, with each single pump earning imaginary money. Their instructions were to earn as much imaginary money as possible by pumping the balloon as much as possible, but to make sure not to pop the balloon, in which case they would lose the money.

The results showed that the students who took acetaminophen engaged in significantly more risk-taking during the exercise, relative to the more cautious and conservative placebo group. On the whole, those on acetaminophen pumped (and burst) their balloons more than the controls.

“If you’re risk-averse, you may pump a few times and then decide to cash out because you don’t want the balloon to burst and lose your money,” Way says.

“But for those who are on acetaminophen, as the balloon gets bigger, we believe they have less anxiety and less negative emotion about how big the balloon is getting and the possibility of it bursting.”
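The balloon exercise described above is a version of the Balloon Analogue Risk Task (BART). Its logic can be sketched in a few lines of Python. Note this is an illustrative simulation only: the pop schedule, maximum pump count, and payout per pump are assumptions for the sketch, not parameters reported from the study.

```python
import random

def balloon_trial(target_pumps, max_pumps=128, cents_per_pump=5, rng=None):
    """One balloon trial: the participant plans to pump `target_pumps` times.
    Each pump earns money but may pop the balloon, forfeiting that balloon's
    earnings. All parameters are illustrative, not taken from the study."""
    rng = rng or random.Random()
    earned = 0
    for pump in range(1, min(target_pumps, max_pumps) + 1):
        # A common BART schedule: pop probability is 1 / (remaining pumps),
        # so the pop point is uniform over all possible pumps and the risk
        # of each additional pump rises as the balloon inflates.
        if rng.random() < 1 / (max_pumps - pump + 1):
            return 0  # balloon popped: money for this balloon is lost
        earned += cents_per_pump
    return earned  # participant cashed out

def average_payout(target_pumps, trials=10_000, seed=0):
    """Mean earnings (in cents) for a fixed risk appetite over many trials."""
    rng = random.Random(seed)
    return sum(balloon_trial(target_pumps, rng=rng) for _ in range(trials)) / trials
```

Comparing a cautious strategy (say, `average_payout(8)`) with a riskier one (`average_payout(64)`) shows why more pumping pays off on average even though individual balloons burst more often, which is what makes extra pumping in the experiment a measure of risk-taking rather than simple error.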

In addition to the balloon simulation, participants also filled out surveys during two of the experiments, rating the level of risk they perceived in various hypothetical scenarios, such as betting a day’s income on a sporting event, bungee jumping off a tall bridge, or driving a car without a seatbelt.

In one of the surveys, acetaminophen consumption did appear to reduce perceived risk compared to the control group, although in another similar survey, the same effect wasn’t observed.

Overall, however, based on an average of results across the various tests, the team concludes that there is a significant relationship between taking acetaminophen and choosing more risk, even if the observed effect is slight.

That said, they acknowledge the drug’s apparent effects on risk-taking behaviour could also be interpreted via other kinds of psychological processes, such as reduced anxiety.

“It may be that as the balloon increases in size, those on placebo feel increasing amounts of anxiety about a potential burst,” the researchers explain.

“When the anxiety becomes too much, they end the trial. Acetaminophen may reduce this anxiety, thus leading to greater risk taking.”

Exploring such psychological alternative explanations for this phenomenon – as well as investigating the biological mechanisms responsible for acetaminophen’s effects on people’s choices in situations like this – should be addressed in future research, the team says.

Scientists will no doubt also have future opportunities to investigate the role and efficacy of acetaminophen in pain relief more broadly: studies in recent years have found that in many medical scenarios the drug is ineffective at relieving pain, sometimes no better than a placebo, and can invite other kinds of health problems.

Despite the seriousness of those findings, acetaminophen nonetheless remains one of the most used medications in the world, considered an essential medicine by the World Health Organisation, and recommended by the CDC as the primary drug you should probably take to ease symptoms if you think you might have coronavirus.

In light of what we’re finding out about acetaminophen, we might want to rethink some of that advice, Way says.

“Perhaps someone with mild COVID-19 symptoms may not think it is as risky to leave their house and meet with people if they’re taking acetaminophen,” Way says.

“We really need more research on the effects of acetaminophen and other over-the-counter drugs on the choices and risks we take.”

The findings are reported in Social Cognitive and Affective Neuroscience.

Repetitive negative thinking linked to dementia risk (Science Daily)

Date: June 7, 2020

Source: University College London

Summary: Persistently engaging in negative thinking patterns may raise the risk of Alzheimer’s disease, finds a new UCL-led study published in Alzheimer’s & Dementia.

Persistently engaging in negative thinking patterns may raise the risk of Alzheimer’s disease, finds a new UCL-led study.

In the study of people aged over 55, published in Alzheimer’s & Dementia, researchers found ‘repetitive negative thinking’ (RNT) is linked to subsequent cognitive decline as well as the deposition of harmful brain proteins linked to Alzheimer’s.

The researchers say RNT should now be further investigated as a potential risk factor for dementia, and psychological tools, such as mindfulness or meditation, should be studied to see if these could reduce dementia risk.

Lead author Dr Natalie Marchant (UCL Psychiatry) said: “Depression and anxiety in mid-life and old age are already known to be risk factors for dementia. Here, we found that certain thinking patterns implicated in depression and anxiety could be an underlying reason why people with those disorders are more likely to develop dementia.

“Taken alongside other studies, which link depression and anxiety with dementia risk, we expect that chronic negative thinking patterns over a long period of time could increase the risk of dementia. We do not think the evidence suggests that short-term setbacks would increase one’s risk of dementia.

“We hope that our findings could be used to develop strategies to lower people’s risk of dementia by helping them to reduce their negative thinking patterns.”

For the Alzheimer’s Society-supported study, the research team from UCL, INSERM and McGill University studied 292 people over the age of 55 who were part of the PREVENT-AD cohort study, and a further 68 people from the IMAP+ cohort.

Over a period of two years, the study participants responded to questions about how they typically think about negative experiences, focusing on RNT patterns like rumination about the past and worry about the future. The participants also completed measures of depression and anxiety symptoms.

Their cognitive function was assessed, measuring memory, attention, spatial cognition, and language. Some (113) of the participants also underwent PET brain scans, measuring deposits of tau and amyloid, two proteins which cause the most common type of dementia, Alzheimer’s disease, when they build up in the brain.

The researchers found that people who exhibited higher RNT patterns experienced more cognitive decline over a four-year period, and declines in memory (which is among the earlier signs of Alzheimer’s disease), and they were more likely to have amyloid and tau deposits in their brain.

Depression and anxiety were associated with subsequent cognitive decline but not with either amyloid or tau deposition, suggesting that RNT could be the main reason why depression and anxiety contribute to Alzheimer’s disease risk.

“We propose that repetitive negative thinking may be a new risk factor for dementia as it could contribute to dementia in a unique way,” said Dr Marchant.

The researchers suggest that RNT may contribute to Alzheimer’s risk via its impact on indicators of stress such as high blood pressure, as other studies have found that physiological stress can contribute to amyloid and tau deposition.

Co-author Dr Gael Chételat (INSERM and Université de Caen-Normandie) commented: “Our thoughts can have a biological impact on our physical health, which might be positive or negative. Mental training practices such as meditation might help promoting positive- while down-regulating negative-associated mental schemes.

“Looking after your mental health is important, and it should be a major public health priority, as it’s not only important for people’s health and well-being in the short term, but it could also impact your eventual risk of dementia.”

The researchers hope to find out if reducing RNT, possibly through mindfulness training or targeted talk therapy, could in turn reduce the risk of dementia. Dr Marchant and Dr Chételat and other European researchers are currently working on a large project to see if interventions such as meditation may help reduce dementia risk by supporting mental health in old age.

Fiona Carragher, Director of Research and Influencing at Alzheimer’s Society, said: “Understanding the factors that can increase the risk of dementia is vital in helping us improve our knowledge of this devastating condition and, where possible, developing prevention strategies. The link shown between repeated negative thinking patterns and both cognitive decline and harmful deposits is interesting although we need further investigation to understand this better. Most of the people in the study were already identified as being at higher risk of Alzheimer’s disease, so we would need to see if these results are echoed within the general population and if repeated negative thinking increases the risk of Alzheimer’s disease itself.

“During these unstable times, we are hearing from people every day on our Alzheimer’s Society Dementia Connect line who are feeling scared, confused, or struggling with their mental health. So it’s important to point out that this isn’t saying a short-term period of negative thinking will cause Alzheimer’s disease. Mental health could be a vital cog in the prevention and treatment of dementia; more research will tell us to what extent.”


Story Source:

Materials provided by University College London. Note: Content may be edited for style and length.


Journal Reference:

  1. Natalie L. Marchant, Lise R. Lovland, Rebecca Jones, Alexa Pichet Binette, Julie Gonneaud, Eider M. Arenaza-Urquijo, Gael Chételat, Sylvia Villeneuve. Repetitive negative thinking is associated with amyloid, tau, and cognitive decline. Alzheimer’s & Dementia, 2020; DOI: 10.1002/alz.12116