A reconstruction of Neanderthal mealtime. Credit: Mauricio Anton/Science Source
Here’s another blow to the popular image of Neanderthals as brutish meat eaters: A new study of bacteria collected from Neanderthal teeth shows that our close cousins ate so many roots, nuts, and other starchy foods that they dramatically altered the type of bacteria in their mouths. The finding suggests our ancestors had adapted to eating lots of starch by at least 600,000 years ago—about the same time they needed more sugars to fuel a big expansion of their brains.
The study is “groundbreaking,” says Harvard University evolutionary biologist Rachel Carmody, who was not part of the research. The work suggests the ancestors of both humans and Neanderthals were cooking lots of starchy foods at least 600,000 years ago. And they had already adapted to eating more starchy plants long before the invention of agriculture 10,000 years ago, she says.
The brains of our ancestors doubled in size between 2 million and 700,000 years ago. Researchers have long credited better stone tools and cooperative hunting: As early humans got better at killing animals and processing meat, they ate a higher quality diet, which gave them more energy more rapidly to fuel the growth of their hungrier brains.
Still, researchers have puzzled over how meat did the job. “For human ancestors to efficiently grow a bigger brain, they needed energy dense foods containing glucose”—a type of sugar—says molecular archaeologist Christina Warinner of Harvard and the Max Planck Institute for the Science of Human History. “Meat is not a good source of glucose.”
Researchers analyzed the bacterial DNA preserved in dental plaque of fossilized teeth, such as this one from a prehistoric human. Credit: Werner Siemens Foundation/Felix Wey
The starchy plants gathered by many living hunter-gatherers are an excellent source of glucose, however. To figure out whether oral bacteria track changes in diet or the environment, Warinner, Max Planck graduate student James Fellows Yates, and a large international team looked at the oral bacteria stuck to the teeth of Neanderthals, preagricultural modern humans who lived more than 10,000 years ago, chimps, gorillas, and howler monkeys. The researchers analyzed billions of DNA fragments from long-dead bacteria still preserved on the teeth of 124 individuals. One was a Neanderthal who lived 100,000 years ago at Pešturina Cave in Serbia, which produced the oldest oral microbiome genome reconstructed to date.
The communities of bacteria in the mouths of preagricultural humans and Neanderthals strongly resembled each other, the team reports today in the Proceedings of the National Academy of Sciences. In particular, humans and Neanderthals harbored an unusual group of Streptococcus bacteria in their mouths. These microbes had a special ability to bind to an abundant enzyme in human saliva called amylase, which frees sugars from starchy foods. The presence of these sugar-consuming strep bacteria on the teeth of Neanderthals and ancient modern humans, but not chimps, shows that both were eating more starchy foods, the researchers conclude.
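The kind of community-level comparison described above, asking how similar two mouths' bacterial profiles are, is commonly quantified with a dissimilarity measure such as Bray-Curtis. Here is a minimal sketch of that idea; the taxa and read counts below are invented for illustration and are not the study's data:

```python
# Sketch: comparing microbial community profiles with Bray-Curtis
# dissimilarity (0 = identical communities, 1 = nothing shared).
# All counts below are hypothetical, for illustration only.

def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two abundance vectors."""
    shared = sum(min(x, y) for x, y in zip(a, b))
    total = sum(a) + sum(b)
    return 1 - 2 * shared / total

# Hypothetical read counts for three taxa:
# [Streptococcus, Neisseria, Rothia]
neanderthal = [120, 40, 30]
ancient_human = [110, 50, 35]
chimp = [10, 90, 60]

print(bray_curtis(neanderthal, ancient_human))  # small: similar communities
print(bray_curtis(neanderthal, chimp))          # larger: divergent communities
```

On these made-up numbers, the Neanderthal and ancient-human profiles come out far more similar to each other than either is to the chimp profile, which mirrors the pattern the team reports.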
Finding the streptococci on the teeth of both ancient humans and Neanderthals also suggests they inherited these microbes from their common ancestor, who lived more than 600,000 years ago. Although earlier studies found evidence that Neanderthals ate grasses and tubers and cooked barley, the new study indicates they ate so much starch that it dramatically altered the composition of their oral microbiomes.
“This pushes the importance of starch in the diet further back in time,” to when human brains were still expanding, Warinner says. Because the amylase enzyme is much more efficient at digesting cooked rather than raw starch, the finding also suggests cooking, too, was common by 600,000 years ago, Carmody says. Researchers have debated whether cooking became common when the brain first began to expand, almost 2 million years ago, or spread later, during a second surge of growth.
The study offers a new way to detect major shifts in diet, says geneticist Ran Blekhman of the University of Minnesota, Twin Cities. In the case of Neanderthals, it reveals how much they depended on plants.
“We sometimes have given short shrift to the plant components of the diet,” says anthropological geneticist Anne Stone of Arizona State University, Tempe. “As we know from modern hunter-gatherers, it’s often the gathering that ends up providing a substantial portion of the calories.”
A review of millions of blood tests has shown that a whole host of human hormones fall into clear seasonal patterns, although the changes are small in magnitude.
Hormones from the pituitary gland, which help control reproduction, metabolism, stress and lactation, were mostly found to peak in late summer.
Peripheral organs under the control of the pituitary, like those that produce our sex hormones or the thyroid hormone, also showed seasonality. Instead of peaking in summer, however, these hormones hit their stride in winter.
Testosterone, estradiol, and progesterone, for instance, reached their pinnacle in late winter or spring.
The findings provide the strongest evidence to date that humans possess an internal seasonal clock that somehow adjusts our hormones in step with the seasons.
“Together with a long history of studies on a winter−spring peak in human function and growth, the hormone seasonality indicates that, like other animals, humans may have a physiological peak season for basic biological functions,” the authors write.
The underlying mechanism that drives this circannual clock is still unknown, but the authors suggest there is a natural, year-long feedback circuit at play between the pituitary gland and peripheral glands in the body.
The pituitary hormones, which are uniquely tuned to sunlight, could be feeding these other organs over the course of a year, allowing them to grow in functional mass in a way that aligns with the seasons.
“Thus, humans may show coordinated seasonal set-points with a winter−spring peak in the growth, stress, metabolism, and reproduction axes,” the authors write.
As the paper mentions, it’s not too different from what we find in other mammals, where fluctuations in certain hormones lead to seasonal changes in an animal’s reproduction, activity, growth, pigmentation, or migration.
Mammals like arctic reindeer, for instance, show a decrease in a hormone called leptin when winter days become shortest, and this helps lower their energy consumption, decreasing their body temperature and inhibiting their ability to reproduce.
Even primates closer to the equator show sensitivity to subtle seasonal changes. For instance, Rhesus macaques ovulate significantly more during the post-monsoon season so that their offspring are born just before the monsoons hit in summer.
Whether or not human hormones also fluctuate with the seasons remains unclear.
Most datasets that have been analysed so far are not very large and do not cover all human hormones, which makes drawing conclusions very challenging. Studies have either examined only human sex hormones, or they have focused on stress and metabolic hormones. Results have also been quite varied and inconsistent.
While some studies on human sex hormones suggest seasonal changes should be considered, other studies conclude seasons are an unimportant source of variability.
Meanwhile, research on salivary cortisol levels – aka the stress hormone – finds there is some seasonal variability, and a big data study on the thyroid-stimulating hormone found higher levels of this hormone in summer and winter.
The new research is the largest of the lot and includes a massive dataset of Israeli health records covering 46 million person-years. It also analyses all human hormones.
Controlling for changes throughout a single day, the authors found humans do show seasonal patterns in their hormone levels, although not as strongly as other mammals.
The physiological effects of these hormonal shifts are still not clear, but some of the changes to the thyroid hormone, T3, and the stress hormone, cortisol, do align with previous findings.
For example, the thyroid hormone, which was found to peak in winter, has been tied to thermogeneration. The seasonal timing of cortisol, which was found to peak in February, also agrees with past studies spanning the northern and southern hemispheres.
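Seasonal peaks like these are typically estimated with a cosinor model: fit level = M + A·cos(ωt) + B·sin(ωt) by least squares, with ω = 2π/365, then read the amplitude and peak day off the fitted coefficients. A minimal sketch on synthetic data follows; the mean, amplitude, and peak day below are invented, not the study's estimates:

```python
import math
import numpy as np

# Sketch: cosinor regression for a circannual rhythm.
# Data are simulated; the values are NOT the study's actual estimates.
rng = np.random.default_rng(0)
day = rng.integers(0, 365, size=2000)          # day of year for each test
omega = 2 * math.pi / 365
# Simulated hormone level: mean 10, small seasonal swing peaking near day 45
level = 10 + 0.5 * np.cos(omega * (day - 45)) + rng.normal(0, 0.3, day.size)

# Least-squares fit of level ~ M + A*cos(wt) + B*sin(wt)
X = np.column_stack([np.ones_like(day, dtype=float),
                     np.cos(omega * day), np.sin(omega * day)])
M, A, B = np.linalg.lstsq(X, level, rcond=None)[0]

amplitude = math.hypot(A, B)                   # size of the seasonal swing
peak_day = (math.atan2(B, A) / omega) % 365    # acrophase, in days
print(f"mean={M:.2f} amplitude={amplitude:.2f} peak day={peak_day:.0f}")
```

The fit recovers the simulated mean, swing, and peak day, illustrating how a small but systematic seasonal effect can be pulled out of noisy individual measurements once the sample is large enough.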
The seasonal changes are small in magnitude, but as the authors point out, from a clinical perspective, “even a small systematic effect can cause misdiagnosis if the normal ranges are not adapted to the seasons, with associated costs of extra tests and treatment.”
More studies on a similarly large scale and in various parts of the world will need to be done to verify the results further. But the findings suggest we are not so different from other mammals after all.
If our hormones really do ebb and flow with the seasons, even just a little bit, it could be important for our health that we know.
Earlier this summer, the Summit supercomputer at Oak Ridge National Lab in Tennessee set about crunching data on more than 40,000 genes from 17,000 genetic samples in an effort to better understand Covid-19. Summit is the second-fastest computer in the world, but the process — which involved analyzing 2.5 billion genetic combinations — still took more than a week.
When Summit was done, researchers analyzed the results. It was, in the words of Dr. Daniel Jacobson, lead researcher and chief scientist for computational systems biology at Oak Ridge, a “eureka moment.” The computer had revealed a new theory about how Covid-19 impacts the body: the bradykinin hypothesis. The hypothesis provides a model that explains many aspects of Covid-19, including some of its most bizarre symptoms. It also suggests 10-plus potential treatments, many of which are already FDA approved. Jacobson’s group published their results in a paper in the journal eLife in early July.
According to the team’s findings, a Covid-19 infection generally begins when the virus enters the body through ACE2 receptors in the nose. (The receptors, which the virus is known to target, are abundant there.) The virus then proceeds through the body, entering cells in other places where ACE2 is also present: the intestines, kidneys, and heart. This likely accounts for at least some of the disease’s cardiac and GI symptoms.
But once Covid-19 has established itself in the body, things start to get really interesting. According to Jacobson’s group, the data Summit analyzed shows that Covid-19 isn’t content to simply infect cells that already express lots of ACE2 receptors. Instead, it actively hijacks the body’s own systems, tricking it into upregulating ACE2 receptors in places where they’re usually expressed at low or medium levels, including the lungs.
In this sense, Covid-19 is like a burglar who slips in your unlocked second-floor window and starts to ransack your house. Once inside, though, they don’t just take your stuff — they also throw open all your doors and windows so their accomplices can rush in and help pillage more efficiently.
The renin–angiotensin system (RAS) controls many aspects of the circulatory system, including the body’s levels of a chemical called bradykinin, which normally helps to regulate blood pressure. According to the team’s analysis, when the virus tweaks the RAS, it causes the body’s mechanisms for regulating bradykinin to go haywire. Bradykinin receptors are resensitized, and the body also stops effectively breaking down bradykinin. (ACE normally degrades bradykinin, but when the virus downregulates it, it can’t do this as effectively.)
The end result, the researchers say, is a bradykinin storm — a massive, runaway buildup of bradykinin in the body. According to the bradykinin hypothesis, it’s this storm that is ultimately responsible for many of Covid-19’s deadly effects. Jacobson’s team says in their paper that “the pathology of Covid-19 is likely the result of Bradykinin Storms rather than cytokine storms,” which had been previously identified in Covid-19 patients, but that “the two may be intricately linked.” Other papers had previously identified bradykinin storms as a possible cause of Covid-19’s pathologies.
As bradykinin builds up in the body, it dramatically increases vascular permeability. In short, it makes your blood vessels leaky. This aligns with recent clinical data, which increasingly views Covid-19 primarily as a vascular disease, rather than a respiratory one. But Covid-19 still has a massive effect on the lungs. As blood vessels start to leak due to a bradykinin storm, the researchers say, the lungs can fill with fluid. Immune cells also leak out into the lungs, Jacobson’s team found, causing inflammation.
And Covid-19 has another especially insidious trick. Through another pathway, the team’s data shows, it increases production of hyaluronic acid (HA) in the lungs. HA is often used in soaps and lotions for its ability to absorb more than 1,000 times its weight in fluid. When it combines with fluid leaking into the lungs, the results are disastrous: It forms a hydrogel, which can fill the lungs in some patients. According to Jacobson, once this happens, “it’s like trying to breathe through Jell-O.”
This may explain why ventilators have proven less effective in treating advanced Covid-19 than doctors originally expected, based on experiences with other viruses. “It reaches a point where regardless of how much oxygen you pump in, it doesn’t matter, because the alveoli in the lungs are filled with this hydrogel,” Jacobson says. “The lungs become like a water balloon.” Patients can suffocate even while receiving full breathing support.
The bradykinin hypothesis also extends to many of Covid-19’s effects on the heart. About one in five hospitalized Covid-19 patients have damage to their hearts, even if they never had cardiac issues before. Some of this is likely due to the virus infecting the heart directly through its ACE2 receptors. But the RAS also controls aspects of cardiac contractions and blood pressure. According to the researchers, bradykinin storms could create arrhythmias and low blood pressure, which are often seen in Covid-19 patients.
The bradykinin hypothesis also accounts for Covid-19’s neurological effects, which are some of the most surprising and concerning elements of the disease. These symptoms (which include dizziness, seizures, delirium, and stroke) are present in as many as half of hospitalized Covid-19 patients. According to Jacobson and his team, MRI studies in France revealed that many Covid-19 patients have evidence of leaky blood vessels in their brains.
Bradykinin — especially at high doses — can also lead to a breakdown of the blood-brain barrier. Under normal circumstances, this barrier acts as a filter between your brain and the rest of your circulatory system. It lets in the nutrients and small molecules that the brain needs to function, while keeping out toxins and pathogens and keeping the brain’s internal environment tightly regulated.
If bradykinin storms cause the blood-brain barrier to break down, this could allow harmful cells and compounds into the brain, leading to inflammation, potential brain damage, and many of the neurological symptoms Covid-19 patients experience. Jacobson told me, “It is a reasonable hypothesis that many of the neurological symptoms in Covid-19 could be due to an excess of bradykinin. It has been reported that bradykinin would indeed be likely to increase the permeability of the blood-brain barrier. In addition, similar neurological symptoms have been observed in other diseases that result from an excess of bradykinin.”
Increased bradykinin levels could also account for other common Covid-19 symptoms. ACE inhibitors — a class of drugs used to treat high blood pressure — have a similar effect on the RAS system as Covid-19, increasing bradykinin levels. In fact, Jacobson and his team note in their paper that “the virus… acts pharmacologically as an ACE inhibitor” — almost directly mirroring the actions of these drugs.
By acting like a natural ACE inhibitor, Covid-19 may be causing the same effects that hypertensive patients sometimes get when they take blood pressure–lowering drugs. ACE inhibitors are known to cause a dry cough and fatigue, two textbook symptoms of Covid-19. And they can potentially increase blood potassium levels, which has also been observed in Covid-19 patients. The similarities between ACE inhibitor side effects and Covid-19 symptoms strengthen the bradykinin hypothesis, the researchers say.
ACE inhibitors are also known to cause a loss of taste and smell. Jacobson stresses, though, that this symptom is more likely due to the virus “affecting the cells surrounding olfactory nerve cells” than the direct effects of bradykinin.
Though still an emerging theory, the bradykinin hypothesis explains several other of Covid-19’s seemingly bizarre symptoms. Jacobson and his team speculate that leaky vasculature caused by bradykinin storms could be responsible for “Covid toes,” a condition involving swollen, bruised toes that some Covid-19 patients experience. Bradykinin can also mess with the thyroid gland, which could produce the thyroid symptoms recently observed in some patients.
The bradykinin hypothesis could also explain some of the broader demographic patterns of the disease’s spread. The researchers note that some aspects of the RAS system are sex-linked, with proteins for several receptors (such as one called TMSB4X) located on the X chromosome. This means that “women… would have twice the levels of this protein than men,” a result borne out by the researchers’ data. In their paper, Jacobson’s team concludes that this “could explain the lower incidence of Covid-19 induced mortality in women.” A genetic quirk of the RAS could be giving women extra protection against the disease.
The bradykinin hypothesis provides a model that “contributes to a better understanding of Covid-19” and “adds novelty to the existing literature,” according to scientists Frank van de Veerdonk, Jos WM van der Meer, and Roger Little, who peer-reviewed the team’s paper. It predicts nearly all the disease’s symptoms, even ones (like bruises on the toes) that at first appear random, and further suggests new treatments for the disease.
As Jacobson and team point out, several drugs target aspects of the RAS and are already FDA approved to treat other conditions. They could arguably be applied to treating Covid-19 as well. Several, like danazol, stanozolol, and ecallantide, reduce bradykinin production and could potentially stop a deadly bradykinin storm. Others, like icatibant, reduce bradykinin signaling and could blunt its effects once it’s already in the body.
Interestingly, Jacobson’s team also suggests vitamin D as a potentially useful Covid-19 drug. The vitamin is involved in the RAS system and could prove helpful by reducing levels of another compound, known as REN. Again, this could stop potentially deadly bradykinin storms from forming. The researchers note that vitamin D has already been shown to help those with Covid-19. The vitamin is readily available over the counter, and around 20% of the population is deficient. If indeed the vitamin proves effective at reducing the severity of bradykinin storms, it could be an easy, relatively safe way to reduce the severity of the disease.
Other compounds could treat symptoms associated with bradykinin storms. Hymecromone, for example, could reduce hyaluronic acid levels, potentially stopping deadly hydrogels from forming in the lungs. And timbetasin could mimic the mechanism that the researchers believe protects women from more severe Covid-19 infections. All of these potential treatments are speculative, of course, and would need to be studied in a rigorous, controlled environment before their effectiveness could be determined and they could be used more broadly.
Covid-19 stands out for both the scale of its global impact and the apparent randomness of its many symptoms. Physicians have struggled to understand the disease and come up with a unified theory for how it works. Though as of yet unproven, the bradykinin hypothesis provides such a theory. And like all good hypotheses, it also provides specific, testable predictions — in this case, actual drugs that could provide relief to real patients.
The researchers are quick to point out that “the testing of any of these pharmaceutical interventions should be done in well-designed clinical trials.” As to the next step in the process, Jacobson is clear: “We have to get this message out.” His team’s finding won’t cure Covid-19. But if the treatments it points to pan out in the clinic, interventions guided by the bradykinin hypothesis could greatly reduce patients’ suffering — and potentially save lives.
Summary: Psychologists find that we are less likely to amplify fears in social exchange if we are stressed.
New psychology research from the University of Konstanz reveals that stress changes the way we deal with risky information — results that shed light on how stressful events, such as a global crisis, can influence how information and misinformation about health risks spreads in social networks.
“The global coronavirus crisis, and the pandemic of misinformation that has spread in its wake, underscores the importance of understanding how people process and share information about health risks under stressful times,” says Professor Wolfgang Gaissmaier, Professor in Social Psychology at the University of Konstanz, and senior author on the study. “Our results uncovered a complex web in which various strands of endocrine stress, subjective stress, risk perception, and the sharing of information are interwoven.”
The study, which appears in the journal Scientific Reports, brings together psychologists from the DFG Cluster of Excellence “Centre for the Advanced Study of Collective Behaviour” at the University of Konstanz: Gaissmaier, an expert in risk dynamics, and Professor Jens Pruessner, who studies the effects of stress on the brain. The study also includes Nathalie Popovic, first author on the study and a former graduate student at the University of Konstanz, Ulrike Bentele, also a Konstanz graduate student, and Mehdi Moussaïd from the Max Planck Institute for Human Development in Berlin.
In our hyper-connected world, information flows rapidly from person to person. The COVID-19 pandemic has demonstrated how risk information — such as about dangers to our health — can spread through social networks and influence people’s perception of the threat, with severe repercussions on public health efforts. However, whether or not stress influences this has never been studied.
“Since we are often under acute stress even in normal times and particularly so during the current health pandemic, it seems highly relevant not only to understand how sober minds process this kind of information and share it in their social networks, but also how stressed minds do,” says Pruessner, a Professor in Clinical Neuropsychology working at the Reichenau Centre of Psychiatry, which is also an academic teaching hospital of the University of Konstanz.
To do this, researchers had participants read articles about a controversial chemical substance, then report their risk perception of the substance before and after reading the articles, and say what information they would pass on to others. Just prior to this task, half of the group was exposed to acute social stress, which involved public speaking and mental arithmetic in front of an audience, while the other half completed a control task.
The results showed that experiencing a stressful event drastically changes how we process and share risk information. Stressed participants were less influenced by the articles and chose to share concerning information to a significantly smaller degree. Notably, this dampened amplification of risk was a direct function of elevated cortisol levels indicative of an endocrine-level stress response. In contrast, participants who reported subjective feelings of stress did show higher concern and more alarming risk communication.
“On the one hand, the endocrine stress reaction may thus contribute to underestimating risks when risk information is exchanged in social contexts, whereas feeling stressed may contribute to overestimating risks, and both effects can be harmful,” says Popovic. “Underestimating risks can increase incautious actions such as risky driving or practising unsafe sex. Overestimating risks can lead to unnecessary anxieties and dangerous behaviours, such as not getting vaccinated.”
By revealing the differential effects of stress on the social dynamics of risk perception, the Konstanz study shines light on the relevance of such work not only from an individual, but also from a policy perspective. “Coming back to the ongoing COVID-19 pandemic, it highlights that we do not only need to understand its virology and epidemiology, but also the psychological mechanisms that determine how we feel and think about the virus, and how we spread those feelings and thoughts in our social networks,” says Gaissmaier.
Photo credit: Topic/Shutterstock. It used to be thought that the lymphatic system stopped at the neck, but it has now been found to reach into the brain.
In contradiction to decades of medical education, a direct connection has been reported between the brain and the immune system. Claims this radical always require plenty of testing, even after winning publication, but this could be big news for research into diseases like multiple sclerosis (MS) and Alzheimer’s.
It seems astonishing that, after centuries of dissection, a system of lymphatic vessels could have survived undetected. That, however, is exactly what Professor Jonathan Kipnis of the University of Virginia claims in Nature.
Old and new representations of the lymphatic system that carries immune cells around the body. Credit: University of Virginia Health System
“It changes entirely the way we perceive the neuro-immune interaction,” says Kipnis. “We always perceived it before as something esoteric that can’t be studied. But now we can ask mechanistic questions.”
MS is known to be an example of the immune system attacking the brain, although the reasons are poorly understood. The opportunity to study lymphatic vessels that link the brain to the immune system could transform our understanding of how these attacks occur, and what could stop them. The causes of Alzheimer’s disease are even more controversial, but may also have immune system origins, and the authors suggest protein accumulation is a result of the vessels failing to do their job.
Indeed, Kipnis claims, “We believe that for every neurological disease that has an immune component to it, these vessels may play a major role.”
The discovery originated when Dr. Antoine Louveau, a researcher in Kipnis’ lab, mounted the membranes that cover mouse brains, known as meninges, on a slide. In the dural sinuses, which drain blood from the brain, he noticed linear patterns in the arrangement of immune T-cells. “I called Jony [Kipnis] to the microscope and I said, ‘I think we have something,'” Louveau recalls.
Kipnis was skeptical, and now says, “I thought that these discoveries ended somewhere around the middle of the last century. But apparently they have not.” Extensive further research convinced him and a group of co-authors from some of Virginia’s most prestigious neuroscience institutes that the vessels are real, they carry white blood cells and they also exist in humans. The network, they report, “appears to start from both eyes and track above the olfactory bulb before aligning adjacent to the sinuses.”
Kipnis pays particular credit to colleague Dr. Tajie Harris who enabled the team to image the vessels in action on live animals, confirming their function. Louveau also credits the discovery to fixing the meninges to a skullcap before dissecting, rather than the other way around. This, along with the closeness of the network to a blood vessel, is presumably why no one has observed it before.
The authors say the vessels “express all of the molecular hallmarks of lymphatic endothelial cells, are able to carry both fluid and immune cells from the cerebrospinal fluid, and are connected to the deep cervical lymph nodes.”
The authors add that the network bears many resemblances to the peripheral lymphatic system, but it “displays certain unique features,” including being “less complex [and] composed of narrower vessels.”
The discovery reinforces findings that immune cells are present even within healthy brains, a notion that was doubted until recently.
Meningeal lymphatic vessels in mice. Credit: Louveau et al., Nature.
Summary: New insights into day-length measurement in flies have been uncovered by researchers. The study has corroborated previous observations that flies developed under short days become significantly more cold-resistant compared with flies raised under long days, suggesting that this response can be used to study seasonal photoperiodic timing. Photoperiodism is the physiological reaction of organisms to the length of day or night, occurring in both plants and animals.
Researchers from the University of Leicester have for the first time provided experimental evidence for a genetic link between two major timing mechanisms, the circadian clock and the seasonal timer.
New research from the Tauber laboratory at the University of Leicester, which will be published in the academic journal PLOS Genetics on 4 September, has corroborated previous observations that flies developed under short days become significantly more cold-resistant compared with flies raised under long days, suggesting that this response can be used to study seasonal photoperiodic timing.
Dr Mirko Pegoraro, a member of the team, explained: “The ability to tell the difference between a long and short day is essential for accurate seasonal timing, as the photoperiod changes regularly and predictably along the year.”
The difference in cold response can be easily seen using the chill-coma recovery assay — in which flies exposed to freezing temperatures enter a reversible narcosis. The recovery time from this narcosis reflects how cold-adaptive the flies are.
The team has demonstrated that this response is largely regulated by the photoperiod — for example, flies exposed to short days (winter-like) during development exhibit shorter recovery times (more cold adapted) during the narcosis test.
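An analysis of a chill-coma experiment like this boils down to comparing recovery times between the two photoperiod groups. Here is a minimal sketch using simulated recovery times and a Welch's t statistic; the numbers are invented, not the study's measurements, and assume (as the article states) that short-day flies recover faster:

```python
import math
import random
from statistics import mean, stdev

# Sketch: comparing chill-coma recovery times between photoperiod groups.
# Recovery times (minutes) are simulated, not the study's data; short-day
# ("winter-like") flies are assumed to recover faster (more cold-adapted).
random.seed(1)
short_day = [random.gauss(20, 4) for _ in range(50)]   # winter-like rearing
long_day = [random.gauss(28, 4) for _ in range(50)]    # summer-like rearing

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

t = welch_t(short_day, long_day)
print(f"mean recovery: short-day {mean(short_day):.1f} min, "
      f"long-day {mean(long_day):.1f} min, t = {t:.1f}")
```

A strongly negative t statistic here corresponds to the photoperiodic effect the team describes: winter-like rearing produces measurably shorter recovery times.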
Dr Eran Tauber from the University of Leicester’s Department of Genetics explained: “Seasonal timing is a key process for survival for most organisms, especially in regions with a mild climate. In a broad range of species, from plants to mammals, the annual change in day-length is monitored by the so-called ‘photoperiodic clock’.
“Many insects for example, including numerous agricultural pests, detect the shortening of the day during the autumn and switch to diapause — a developmental arrest — which allows them to survive the winter.
“Despite intensive study of the photoperiodic clock for the last 80 years, however, the underlying molecular mechanism is still largely unknown. This is in marked contrast to our understanding of the circadian clock that regulates daily rhythms.”
The team has tested mutant strains in which the circadian clock is disrupted and has found that the photoperiodic clock was also disrupted, providing the first experimental evidence for the role of the circadian clock in seasonal photoperiodic timing in flies.
The new research is based on an automated system, allowing the monitoring of hundreds of flies, which paves the way for new insights into our understanding of the genes involved in the photoperiodic response and seasonal timing.
Professor Melanie Welham, Executive Director for Science, at the Biotechnology and Biological Sciences Research Council (BBSRC), said: “This study shows an interesting genetic link between the circadian clock and the seasonal timer. The ubiquity of these clocks across so many species makes this an important discovery which will lead to a better understanding of these essential processes.”
Journal Reference:
Mirko Pegoraro, Joao S. Gesto, Charalambos P. Kyriacou, Eran Tauber. Role for Circadian Clock Genes in Seasonal Timing: Testing the Bünning Hypothesis. PLOS Genetics, September 2014. DOI: 10.1371/journal.pgen.1004603
Why does the metabolism of a sloth differ from that of a human? Brains are a big reason, say researchers who recently carried out a detailed comparison of metabolism in humans and other mammals. Credit: Felipe Dana/Associated Press
Carl Zimmer
All animals do the same thing to the food they eat — they break it down to extract fuel and building blocks for growing new tissue. But the metabolism of one species may be profoundly different from another’s. A sloth will generate just enough energy to hang from a tree, for example, while some birds can convert their food into a flight from Alaska to New Zealand.
For decades, scientists have wondered how our metabolism compares to that of other species. It’s been a hard question to tackle, because metabolism is complicated — something that anyone who’s stared at a textbook diagram knows all too well. As we break down our food, we produce thousands of small molecules, some of which we flush out of our bodies and some of which we depend on for our survival.
An international team of researchers has now carried out a detailed comparison of metabolism in humans and other mammals. As they report in the journal PLOS Biology, both our brains and our muscles turn out to be unusual, metabolically speaking. And it’s possible that their odd metabolism was part of what made us uniquely human.
When scientists first began to study metabolism, they could measure it only in simple ways. They might estimate how many calories an animal burned in a day, for example. If they were feeling particularly ambitious, they might try to estimate how many calories each organ in the animal’s body burned.
Those tactics were enough to reveal some striking things about metabolism. Compared with other animals, we humans have ravenous brains. Twenty percent of the calories we take in each day are consumed by our neurons as they send signals to one another.
Ten years ago, Philipp Khaitovich of the Max Planck Institute of Evolutionary Anthropology and his colleagues began to study human metabolism in a more detailed way. They started making a catalog of the many molecules produced as we break down food.
“We wanted to get as much data as possible, just to see what happened,” said Dr. Khaitovich.
To do so, the scientists obtained brain, muscle and kidney tissues from organ donors. They then extracted metabolic compounds like glucose from the samples and measured their concentrations. All told, they measured the levels of over 10,000 different molecules.
The scientists found that each tissue had a different metabolic fingerprint, with high levels of some molecules and low levels of others.
These distinctive fingerprints came as little surprise, since each tissue has a different job to carry out. Muscles need to burn energy to generate mechanical forces, for example, while kidney cells need to pull waste out of the bloodstream.
The scientists then carried out the same experiment on chimpanzees, monkeys and mice. They found that the metabolic fingerprint for a given tissue was usually very similar in closely related species. The same tissues in more distantly related species had fingerprints with less in common.
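The comparison described above can be sketched in miniature. The metabolite levels below are invented, and Pearson correlation is just one reasonable way to score fingerprint similarity; the point is only to show the shape of the analysis:

```python
# A minimal sketch of comparing "metabolic fingerprints" across species.
# Each fingerprint is a vector of relative metabolite concentrations for
# one tissue; similarity is scored here with Pearson correlation.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical muscle fingerprints (five metabolites, arbitrary units).
human = [5.0, 1.2, 8.3, 0.4, 3.1]
chimp = [4.8, 1.5, 7.9, 0.6, 3.0]   # close relative: similar profile
mouse = [2.1, 6.0, 1.4, 5.2, 0.9]   # distant relative: dissimilar profile

print(f"human vs chimp: r = {pearson(human, chimp):.2f}")
print(f"human vs mouse: r = {pearson(human, mouse):.2f}")

# The general pattern the team found: the more closely related two
# species are, the more similar a given tissue's fingerprint.
assert pearson(human, chimp) > pearson(human, mouse)
```

The surprise in the study was the exceptions to this pattern: the human prefrontal cortex and, especially, human muscle deviated far more from other primates than this simple phylogenetic trend predicts.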
But the scientists found two exceptions to this pattern.
The first exception turned up in the front of the brain. This region, called the prefrontal cortex, is important for figuring out how to reach long-term goals. Dr. Khaitovich’s team found that the way the human prefrontal cortex uses energy is quite distinct from that of other species. Other tissues had comparable metabolic fingerprints across species, and even in other regions of the brain the scientists didn’t find such a drastic difference.
This result fit in nicely with findings by other scientists that the human prefrontal cortex expanded greatly over the past six million years of our evolution. Its expansion accounts for much of the extra demand our brains make for calories.
The evolution of our enormous prefrontal cortex also had a profound effect on our species. We use it for many of the tasks that only humans can perform, such as reflecting on ourselves, thinking about what others are thinking and planning for the future.
But the prefrontal cortex was not the only part of the human body that has experienced a great deal of metabolic evolution. Dr. Khaitovich and his colleagues found that the metabolic fingerprint of muscle is even more distinct in humans.
“Muscle was really off the charts,” Dr. Khaitovich said. “We didn’t expect to see that at all.”
It was possible that the peculiar metabolism in human muscle was just the result of our modern lifestyle — not an evolutionary shift in our species. Our high-calorie diet might change the way muscle cells generated energy. It was also possible that a sedentary lifestyle made muscles weaker, creating a smaller metabolic demand.
To test that possibility, Dr. Khaitovich and his colleagues compared the strength of humans with that of our closest relatives. They found that chimpanzees and monkeys are far stronger, for their weight, than even university basketball players or professional climbers.
The scientists also tested their findings by putting monkeys on a couch-potato regime for a month to see if their muscles acquired a human metabolic fingerprint.
They barely changed.
Dr. Khaitovich suspects that the metabolic fingerprint of our muscles represents a genuine evolutionary change in our species.
Karen Isler and Carel van Schaik of the University of Zurich have argued that the gradual changes in human brains and muscles were intimately linked. To fuel a big brain, our ancestors had to sacrifice other tissues, including muscles.
Dr. Isler said that the new research fit their hypothesis nicely. “It looks quite convincing,” she said.
Daniel E. Lieberman, a professor of human evolutionary biology at Harvard, said he found Dr. Khaitovich’s study “very cool,” but didn’t think the results meant that brain growth came at the cost of strength. Instead, he suggested, our ancestors evolved muscles adapted for a new activity: long-distance walking and running.
“We have traded strength for endurance,” he said. And that endurance allowed our ancestors to gather more food, which could then fuel bigger brains.
“It may be that the human brain is bigger not in spite of brawn but rather because of brawn, albeit a very different kind,” he said.