Tag archive: Hominids

Neanderthals carb loaded, helping grow their big brains (Science)

sciencemag.org

By Ann Gibbons, May 10, 2021, 3:00 PM


A reconstruction of Neanderthal mealtime. Mauricio Anton/Science Source

Here’s another blow to the popular image of Neanderthals as brutish meat eaters: A new study of bacteria collected from Neanderthal teeth shows that our close cousins ate so many roots, nuts, or other starchy foods that they dramatically altered the type of bacteria in their mouths. The finding suggests our ancestors had adapted to eating lots of starch by at least 600,000 years ago—about the same time as they needed more sugars to fuel a big expansion of their brains.

The study is “groundbreaking,” says Harvard University evolutionary biologist Rachel Carmody, who was not part of the research. The work suggests the ancestors of both humans and Neanderthals were cooking lots of starchy foods at least 600,000 years ago. And they had already adapted to eating more starchy plants long before the invention of agriculture 10,000 years ago, she says.

The brains of our ancestors doubled in size between 2 million and 700,000 years ago. Researchers have long credited better stone tools and cooperative hunting: As early humans got better at killing animals and processing meat, they ate a higher quality diet, which gave them more energy more rapidly to fuel the growth of their hungrier brains.

Still, researchers have puzzled over how meat did the job. “For human ancestors to efficiently grow a bigger brain, they needed energy dense foods containing glucose”—a type of sugar—says molecular archaeologist Christina Warinner of Harvard and the Max Planck Institute for the Science of Human History. “Meat is not a good source of glucose.”

Researchers analyzed the bacterial DNA preserved in dental plaque of fossilized teeth, such as this one from a prehistoric human. Werner Siemens Foundation/Felix Wey

The starchy plants gathered by many living hunter-gatherers are an excellent source of glucose, however. To figure out whether oral bacteria track changes in diet or the environment, Warinner, Max Planck graduate student James Fellows Yates, and a large international team looked at the oral bacteria stuck to the teeth of Neanderthals, preagricultural modern humans who lived more than 10,000 years ago, chimps, gorillas, and howler monkeys. The researchers analyzed billions of DNA fragments from long-dead bacteria still preserved on the teeth of 124 individuals. One was a Neanderthal who lived 100,000 years ago at Pešturina Cave in Serbia, whose teeth yielded the oldest oral microbiome genome reconstructed to date.

The communities of bacteria in the mouths of preagricultural humans and Neanderthals strongly resembled each other, the team reports today in the Proceedings of the National Academy of Sciences. In particular, humans and Neanderthals harbored an unusual group of Streptococcus bacteria in their mouths. These microbes had a special ability to bind to an abundant enzyme in human saliva called amylase, which frees sugars from starchy foods. The presence of the strep bacteria that consume sugar on the teeth of Neanderthals and ancient modern humans, but not chimps, shows they were eating more starchy foods, the researchers conclude.

Finding the streptococci on the teeth of both ancient humans and Neanderthals also suggests they inherited these microbes from their common ancestor, who lived more than 600,000 years ago. Although earlier studies found evidence that Neanderthals ate grasses and tubers and cooked barley, the new study indicates they ate so much starch that it dramatically altered the composition of their oral microbiomes.

“This pushes the importance of starch in the diet further back in time,” to when human brains were still expanding, Warinner says. Because the amylase enzyme is much more efficient at digesting cooked rather than raw starch, the finding also suggests cooking was already common by 600,000 years ago, Carmody says. Researchers have debated whether cooking became common when the brain began its dramatic expansion almost 2 million years ago, or whether it spread later, during a second surge of brain growth.

The study offers a new way to detect major shifts in diet, says geneticist Ran Blekhman of the University of Minnesota, Twin Cities. In the case of Neanderthals, it reveals how much they depended on plants.

“We sometimes have given short shrift to the plant components of the diet,” says anthropological geneticist Anne Stone of Arizona State University, Tempe. “As we know from modern hunter-gatherers, it’s often the gathering that ends up providing a substantial portion of the calories.”

Israeli Archaeologists Present Groundbreaking Universal Theory of Human Evolution (Haaretz)

Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai proffer novel hypothesis, showing how the greed of Homo erectus set us careening down an anomalous evolutionary path

Ruth Schuster, Feb. 25, 2021

Why the human brain evolved as it did never has been plausibly explained. Apparently, not since the first life-form billions of years ago did a single species gain dominance over all others – until we came along. Now, in a groundbreaking paper, two Israeli researchers propose that our anomalous evolution was propelled by the very mass extinctions we helped cause. Or: As we sawed off the culinary branches from which we swung, we had to get ever more inventive in order to survive.

As ambling, slow-to-reproduce large animals diminished and gradually went extinct, we were forced to resort to smaller, nimbler animals that flee as a strategy to escape predation. To catch them, we had to get smarter, nimbler and faster, according to the universal theory of human evolution proposed by researchers Miki Ben-Dor and Prof. Ran Barkai of Tel Aviv University, in a paper published in the journal Quaternary.

In fact, the great African megafauna began to decline about 4.6 million years ago. But our story begins with Homo habilis, which lived about 2.6 million years ago and apparently used crude stone tools to help it eat flesh, and with Homo erectus, which thronged Africa and expanded to Eurasia about 2 million years ago. The thing is, erectus wasn’t an omnivore: it was a carnivore, Ben-Dor explains to Haaretz.

“Eighty percent of mammals are omnivores but still specialize in a narrow food range. If anything, it seems Homo erectus was a hyper-carnivore,” he observes.

And in the last couple of million years, our brains grew threefold, to a maximum cranial capacity of about 1,500 cubic centimeters (cc), a size achieved about 300,000 years ago. We also gradually but consistently ramped up in technology and culture – until the Neolithic revolution and the advent of the sedentary lifestyle, when our brains shrank to about 1,300 to 1,400 cc, but more on that anomaly later.

The hypothesis suggested by Ben-Dor and Barkai – that we ate our way to our present physical, cultural and ecological state – is an original unifying explanation for the behavioral, physiological and cultural evolution of the human species.

Out of chaos

Evolution is chaotic. Charles Darwin came up with the theory of the survival of the fittest, and nobody has a better suggestion yet, but mutations aren’t “planned.” Bodies aren’t “designed,” if we leave genetic engineering out of it. The point is, evolution isn’t linear but chaotic, and that should theoretically apply to humans too.

Hence, it is strange that certain changes in the course of millions of years of human history, including the expansion of our brain, tool manufacture techniques and use of fire, for example, were uncharacteristically progressive, say Ben-Dor and Barkai.

“Uncharacteristically progressive” means that certain traits such as brain size, or cultural developments such as fire usage, evolved in one direction over a long time, in the direction of escalation. That isn’t what chaos is expected to produce over vast spans of time, Barkai explains to Haaretz: it is bizarre. Very few parameters behave like that.

So their discovery of a correlation between the shrinking average weight of African animals, the extinction of megafauna and the development of the human brain is intriguing.

From mammoth marrow to joint of rat

To be clear, just this month a new paper posited that the late Quaternary extinction of megafauna, in the last few tens of thousands of years, wasn’t entirely the fault of humanity. In North America specifically, it was due primarily to climate change, with the late-arriving humans apparently providing the coup de grâce to some species.

In the Old World, however, a human role is clearer. African megafauna apparently began to decline 4.6 million years ago, but during the Pleistocene (2.6 million to 11,600 years ago) the size of African animals trended sharply down, in what the authors term an abrupt reversal from a continuous growth trend of 65 million years (i.e., since the dinosaurs almost died out).

When Homo erectus the carnivore began to roam Africa around 2 million years ago, land mammals averaged nearly 500 kilograms. Barkai’s team and others have demonstrated that hominins ate elephants and large animals when they could. In fact, Africa originally had six elephant species (today there are two: the bush elephant and the forest elephant). By the end of the Pleistocene, by which time all hominins other than modern humans were also extinct, the average weight of African mammals had shrunk by more than 90 percent.

And during the Pleistocene, as the African animals shrank, the Homo genus grew taller and more gracile, and our stone tool technology improved (which in no way diminished our affection for archaic implements like the hand ax or chopper, both of which remained in use for more than a million years, even as more sophisticated technologies were developed).

If we started some 3.3 million years ago with large, crude stone hammers that may have been used to bang big animals on the head or break bones to get at the marrow, over the epochs we invented the spear for remote slaughter. By about 80,000 years ago, the bow and arrow was making its appearance, which was more suitable for bringing down small fry like small deer and birds. Over a million years ago, we began to use fire, and later achieved better control of it, meaning the ability to ignite it at will. Later we domesticated the dog from the wolf, and it would help us hunt smaller, fleet animals.

Why did the earliest humans hunt large animals anyway? Wouldn’t a peeved elephant be more dangerous than a rat? Arguably, but catching one elephant is easier than catching a large number of rats. And megafauna had more fat.

A modern human can only derive up to about 50 percent of calories from lean meat (protein): past a certain point, our livers can’t digest more protein. We need energy from carbs or fat, but before developing agriculture about 10,000 years ago, a key source of calories had to be animal fat.

Big animals have a lot of fat. Small animals don’t. In Africa and Europe, and in Israel too, the researchers found a significant decline in the prevalence of animals weighing over 200 kilograms correlated to an increase in the volume of the human brain. Thus, Ben-Dor and Barkai deduce that the declining availability of large prey seems to have been a key element in the natural selection from Homo erectus onward. Catching one elephant is more efficient than catching 1,000 rabbits, but if we must catch 1,000 rabbits, improved cunning, planning and tools are in order.
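To make that energetic trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. All of the calorie yields and hunting-effort figures below are assumed for illustration only and do not come from Ben-Dor and Barkai’s paper; the point is simply that the return per hour of effort collapses as prey size falls.

```python
# Illustrative only: assumed calorie yields and hunting-effort figures, not data from the paper.
prey = {
    "elephant": {"kcal": 3_000_000, "hours_per_kill": 60},  # one large, fat-rich animal
    "rabbit":   {"kcal": 1_500,     "hours_per_kill": 2},   # one small, lean animal
}

for name, p in prey.items():
    rate = p["kcal"] / p["hours_per_kill"]  # kcal gained per hour of hunting effort
    print(f"{name:8s}: {rate:>9,.0f} kcal per hour of effort")

# Rabbits needed to replace the energy of a single elephant (under these assumptions):
print(prey["elephant"]["kcal"] // prey["rabbit"]["kcal"], "rabbits ≈ 1 elephant")
```

Under these assumed numbers the large animal returns tens of thousands of calories per hour of effort and the small one only hundreds, which is the efficiency gap the researchers argue drove the need for better cunning, planning and tools.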

Say it with fat

Our changing hunting habits would have had cultural impacts too, Ben-Dor and Barkai posit. “Cultural evolution in archaeology usually refers to objects, such as stone tools,” Ben-Dor tells Haaretz. But cultural evolution also refers to learned behavior, such as our choice of which animals to hunt, and how.

Thus, they posit, our hunting conundrum may have also been a key element to that enigmatic human characteristic: complex language. When language began, with what ancestor of Homo sapiens, if any before us, is hotly debated.

Ben-Dor, an economist by training prior to obtaining a Ph.D. in archaeology, believes it began early. “We just need to follow the money. When speaking of evolution, one must follow the energy. Language is energetically costly. Speaking requires devotion of part of the brain, which is costly. Our brain consumes huge amounts of energy. It’s an investment, and language has to produce enough benefit to make it worthwhile. What did language bring us? It had to be more energetically efficient hunting.”

Domestication of the dog also requires resources and, therefore, also had to bring sufficient compensation in the form of more efficient hunting of smaller animals, he points out. That may help explain the fact that Neolithic humans not only embraced the dog but ate it too, going by archaeological evidence of butchered dogs.

At the end of the day, wherever we went, humans devastated the local ecologies, given enough time.

There is a lot of thinking about the Neolithic agricultural revolution. Some think grain farming was driven by the desire to make beer. Given residue analysis indicating that beer has been around for over 10,000 years, that theory isn’t as far-fetched as one might think. Ben-Dor and Barkai suggest that, with the megafauna almost entirely gone, hunting had become too energy-costly, so we had to use our large brains to develop agriculture and to husband herbivores.

And as the hunter-gathering lifestyle gave way to permanent settlement, our brain size decreased.

Note, Ben-Dor adds, that the brains of wolves, which have to hunt to survive, are larger than those of their domesticated descendants, dogs. We did promise more on that. That was it. Also: the chimpanzee brain has remained stable for 7 million years, since the split with the Homo line, Barkai points out.

“Why does any of this matter?” Ben-Dor asks. “People think humans reached this condition because it was ‘meant to be.’ But in the Earth’s 4.5 billion years, there have been billions of species. They rose and fell. What’s the probability that we would take over the world? It’s an accident of nature. It never happened before that one species achieved dominance over all, and now it’s all over. How did that happen? This is the answer: A non-carnivore entered the niche of carnivore, and ate out its niche. We can’t eat that much protein: we need fat too. Because we needed the fat, we began with the big animals. We hunted the prime adult animals which have more fat than the kiddies and the old. We wiped out the prime adults who were crucial to survival of species. Because of our need for fat, we wiped out the animals we depended on. And this required us to keep getting smarter and smarter, and thus we took over the world.”

Did Human Evolution Include a Semi-Aquatic Phase? (The Scientist)

A recent book outlines fossil evidence supporting the controversial hypothesis.

Peter Rhys-Evans
Apr 1, 2020

For the past 150 years, scientists and laypeople alike have accepted a “savanna” scenario of human evolution. The theory, primarily based on fossil evidence, suggests that because our ancestral ape family members were living in the trees of East African forests, and because we humans live on terra firma, our primate ancestors simply came down from the trees onto the grasslands and stood upright to see farther over the vegetation, increasing their efficiency as hunter-gatherers. In the late 19th century, anthropologists only had a few Neanderthal fossils to study, and science had very little knowledge of genetics and evolutionary changes. So this savanna theory of human evolution became ingrained in anthropological dogma and has remained the established explanation of early hominin evolution following the genetic split from our primate cousins 6 million to 7 million years ago.

But in 1960, a different twist on human evolution emerged. That year, marine biologist Sir Alister Hardy wrote an article in New Scientist suggesting a possible aquatic phase in our evolution, noting Homo sapiens’s differences from other primates and similarities to other aquatic and semi-aquatic mammals. In 1967, zoologist Desmond Morris published The Naked Ape, which explored different theories about why modern humans lost their fur. Morris mentioned Hardy’s “aquatic ape” hypothesis as an “ingenious” theory that sufficiently explained “why we are so nimble in the water today and why our closest living relatives, the chimpanzees, are so helpless and quickly drown.”

CRC Press, July 2019

Morris concluded, however, that “despite its most appealing indirect evidence, the aquatic theory lacks solid support.” Even if eventually the aquatic ape hypothesis turns out to be true, he continued, it need not completely rewrite the story of human evolution, but rather add to our species’ evolutionary arc a “salutary christening ceremony.”

In 1992, I published a paper describing a curious ear condition colloquially known as “surfer’s ear,” which I and other ear, nose, and throat doctors frequently see in clinics. Exostoses are small bony growths that develop in the outer ear canal, but only in humans who swim and dive on a regular, almost daily basis. In modern humans, there is undisputed evidence of aural exostoses in people who swim and dive, with the size and extent being directly dependent on the frequency and length of exposure to water, as well as its temperature.

I predicted that if these exostoses were found in early hominin skulls, it would provide vital fossil evidence for frequent swimming and diving by our ancestors. Researchers have now found these features in 1 million– to 2 million–year-old hominin skulls. In a recent study on nearly two dozen Neanderthal skulls, about 47 percent had exostoses. There are many other references to contemporary, historical, and archaeological coastal and river communities with a significantly increased incidence of aural exostoses. In my latest book, The Waterside Ape, I propose that the presence of exostoses in the skulls of ancient human ancestors is a prime support for an aquatic phase of our evolution, which may explain our unique human phenotype.

Other Homo sapiens–specific features that may be tied to a semi-aquatic stage of human evolution include erect posture, loss of body hair, deposition of subcutaneous fat, a completely different heat-regulation system from other primates, and kidneys that function much like those of aquatic mammals. This combination of characteristics, which do not exist in any other terrestrial mammal, would have gradually arisen over several million years. The finding of the bipedal hominin named “Lucy,” dating to 3.5 million years ago, suggested that walking on two legs was the initial major evolutionary adaptation to a semi-aquatic habitat. By the time the Neanderthals appeared some 400,000 to 300,000 years ago, their semi-aquatic lifestyle—swimming, diving, and perhaps hunting for food on land and in the water—may have been firmly part of day-to-day life.

In my opinion, the accumulated fossil, anatomical, and physiological evidence about early hominin evolution points to our human ancestors learning to survive as semi-aquatic creatures in a changing East African environment. After transitioning to bipedalism, ancient hominins had both forelimbs free from aiding in walking, which may have allowed for increasing manual dexterity and skills. Perhaps a marine diet with lipoproteins that are essential for brain development fueled the unique intellectual advances and ecological dominance of Homo sapiens.

Peter Rhys-Evans works in private practice as an otolaryngologist in London at several hospitals including the Harley Street Clinic. He is the founder and chairman of Oracle Cancer Trust, the largest head and neck cancer charity in the UK. Read an excerpt from The Waterside Ape. Follow Rhys-Evans on Twitter @TheWatersideApe.

Ancestors of Modern Humans Interbred With Extinct Hominins, Study Finds (N.Y.Times)

Carl Zimmer

Skulls of Neanderthal man. Credit: European Pressphoto Agency

The ancestors of modern humans interbred with Neanderthals and another extinct line of humans known as the Denisovans at least four times in the course of prehistory, according to an analysis of global genomes published Thursday in the journal Science. 

The interbreeding may have given modern humans genes that bolstered immunity to pathogens, the authors concluded. “This is yet another genetic nail in the coffin of our oversimplistic models of human evolution,” said Carles Lalueza-Fox, a research scientist at the Institute of Evolutionary Biology in Barcelona, Spain, who was not involved in the study.

The new study expands on a series of findings in recent years showing that the ancestors of modern humans once shared the planet with a surprising number of near relatives — lineages like the Neanderthals and Denisovans that became extinct tens of thousands of years ago.

Before disappearing, however, they interbred with our forebears on at least several occasions. Today, we carry DNA from these encounters.

The first clues to ancient interbreeding surfaced in 2010, when scientists discovered that some modern humans — mostly Europeans — carried DNA that matched material recovered from Neanderthal fossils.

Later studies showed that the forebears of modern humans first encountered Neanderthals after expanding out of Africa more than 50,000 years ago.

But the Neanderthals were not the only extinct humans that our own ancestors found. A finger bone discovered in a Siberian cave, called Denisova, yielded DNA from yet another group of humans.

Research later indicated that all three groups — modern humans, Neanderthals and Denisovans — shared a common ancestor who lived roughly 600,000 years ago. And, perhaps no surprise, some ancestors of modern humans also interbred with Denisovans.

Some of their DNA has survived in people in Melanesia, a region of the Pacific that includes New Guinea and the islands around it.

Those initial discoveries left major questions unanswered, such as how often our ancestors interbred with Neanderthals and Denisovans. Scientists have developed new ways to study the DNA of living people to tackle these mysteries.

Joshua M. Akey, a geneticist at the University of Washington, and his colleagues analyzed a database of 1,488 genomes from people around the world. The scientists added 35 genomes from people in New Britain and other Melanesian islands in an effort to learn more about Denisovans in particular.

The researchers found that all of the non-Africans in their study had Neanderthal DNA, while the Africans had very little or none. That finding supported previous studies.

But when Dr. Akey and his colleagues compared DNA from modern Europeans, East Asians and Melanesians, they found that each population carried its own distinctive mix of Neanderthal genes.

The best explanation for these patterns, the scientists concluded, was that the ancestors of modern humans acquired Neanderthal DNA on three occasions.

The first encounter happened when the common ancestor of all non-Africans interbred with Neanderthals.

The second occurred among the ancestors of East Asians and Europeans, after the ancestors of Melanesians split off. Later, the ancestors of East Asians — but not Europeans — interbred a third time with Neanderthals.

Earlier studies had hinted at the possibility that the forebears of modern humans had multiple encounters with Neanderthals, but hard data had been lacking.

“A lot of people have been arguing for that, but now they’re really providing the evidence for it,” said Rasmus Nielsen, a geneticist at the University of California, Berkeley, who was not involved in the new study.

The Melanesians took a different course. After a single interbreeding with Neanderthals, Dr. Akey found, their ancestors went on to interbreed just once with Denisovans as well.

Where that encounter could have taken place remains an enigma. The only place Denisovan remains have been found is Siberia, a long way from New Guinea.

It is possible that Denisovans ranged down to Southeast Asia, Dr. Akey said, crossing paths with modern humans who later settled in Melanesia.

Dr. Akey and his colleagues also identified some regions of Neanderthal and Denisovan DNA that became more common in modern humans as generations passed, suggesting that they provided some kind of a survival advantage.

Many of the regions contain immune system genes, Dr. Akey noted.

“As modern humans are spreading out across the world, they’re encountering pathogens they haven’t experienced before,” he said. Neanderthals and Denisovans may have had genes that were adapted to fight those enemies.

“Maybe they really helped us survive and thrive in these new environments,” he said.

Dr. Akey and his colleagues found that Neanderthal and Denisovan DNA was glaringly absent from four regions of the modern human genome.

That absence may signal that these stretches of the genome are instrumental in making modern humans unique. Intriguingly, one of those regions includes a gene called FOXP2, which is involved in speech.

Scientists suspect that Neanderthals and Denisovans were not the only extinct human lineages our ancestors interbred with.

PingHsun Hsieh, a biologist at the University of Arizona, and his colleagues reported last month that the genomes of African pygmies contained pieces of DNA that came from an unknown source within the last 30,000 years.

Dr. Akey and his colleagues are now following up with an analysis of African populations. “This potentially allows us to find new twigs on the human family tree,” he said.

Insect diet helped early humans build bigger brains: Quest for elusive bugs spurred primate tool use, problem-solving skills (Science Daily)

Date: July 1, 2014

Source: Washington University in St. Louis

Summary: Figuring out how to survive on a lean-season diet of hard-to-reach ants, slugs and other bugs may have spurred the development of bigger brains and higher-level cognitive functions in the ancestors of humans and other primates, suggests new research.

An adult female tufted capuchin monkey of the Sapajus lineage using a stone tool and a sandstone anvil to crack a palm nut as her infant hangs on. Credit: E. Visalberghi

Figuring out how to survive on a lean-season diet of hard-to-reach ants, slugs and other bugs may have spurred the development of bigger brains and higher-level cognitive functions in the ancestors of humans and other primates, suggests research from Washington University in St. Louis.

“Challenges associated with finding food have long been recognized as important in shaping evolution of the brain and cognition in primates, including humans,” said Amanda D. Melin, PhD, assistant professor of anthropology in Arts & Sciences and lead author of the study.

“Our work suggests that digging for insects when food was scarce may have contributed to hominid cognitive evolution and set the stage for advanced tool use.”

Based on a five-year study of capuchin monkeys in Costa Rica, the research provides support for an evolutionary theory that links the development of sensorimotor (SMI) skills, such as increased manual dexterity, tool use, and innovative problem solving, to the creative challenges of foraging for insects and other foods that are buried, embedded or otherwise hard to procure.

Published in the June 2014 Journal of Human Evolution, the study is the first to provide detailed evidence from the field on how seasonal changes in food supplies influence the foraging patterns of wild capuchin monkeys.

The study is co-authored by biologist Hilary C. Young and anthropologists Krisztina N. Mosdossy and Linda M. Fedigan, all from the University of Calgary, Canada.

It notes that many human populations also eat embedded insects on a seasonal basis and suggests that this practice played a key role in human evolution.

“We find that capuchin monkeys eat embedded insects year-round but intensify their feeding seasonally, during the time that their preferred food — ripe fruit — is less abundant,” Melin said. “These results suggest embedded insects are an important fallback food.”

Previous research has shown that fallback foods help shape the evolution of primate body forms, including the development of strong jaws, thick teeth and specialized digestive systems in primates whose fallback diets rely mainly on vegetation.

This study suggests that fallback foods can also play an important role in shaping brain evolution among primates that fall back on insect-based diets, and that this influence is most pronounced among primates that evolve in habitats with wide seasonal variations, such as the wet-dry cycles found in some South American forests.

“Capuchin monkeys are excellent models for examining evolution of brain size and intelligence; for their small body size, they have impressively large brains,” Melin said. “Accessing hidden and well-protected insects living in tree branches and under bark is a cognitively demanding task, but provides a high-quality reward: fat and protein, which is needed to fuel big brains.”

But when it comes to using tools, not all capuchin monkey strains and lineages are created equal, and Melin’s theories may explain why.

Perhaps the most notable difference between the robust (tufted, genus Sapajus) and gracile (untufted, genus Cebus) capuchin lineages is their variation in tool use. While Cebus monkeys are known for clever food-foraging tricks, such as banging snails or fruits against branches, they can’t hold a stick to their Sapajus cousins when it comes to the innovative use and modification of sophisticated tools.

One explanation, Melin said, is that Cebus capuchins have historically and consistently occupied tropical rainforests, whereas the Sapajus lineage spread from their origins in the Atlantic rainforest into drier, more temperate and seasonal habitat types.

“Primates who extract foods in the most seasonal environments are expected to experience the strongest selection in the ‘sensorimotor intelligence’ domain, which includes cognition related to object handling,” Melin said. “This may explain the occurrence of tool use in some capuchin lineages, but not in others.”

Genetic analysis of mitochondrial chromosomes suggests that the Sapajus-Cebus diversification occurred millions of years ago in the late Miocene epoch.

“We predict that the last common ancestor of Cebus and Sapajus had a level of SMI more closely resembling extant Cebus monkeys, and that further expansion of SMI evolved in the robust lineage to facilitate increased access to varied embedded fallback foods, necessitated by more intense periods of fruit shortage,” she said.

One of the more compelling modern examples of this behavior, said Melin, is the seasonal consumption of termites by chimpanzees, whose use of tools to extract this protein-rich food source is an important survival technique in harsh environments.

What does this all mean for hominids?

While it’s hard to decipher the extent of seasonal dietary variations from the fossil record, stable isotope analyses indicate seasonal variation in diet for at least one South African hominin, Paranthropus robustus. Other isotopic research suggests that early human diets may have included a range of extractable foods, such as termites, plant roots and tubers.

Modern humans frequently consume insects, which are seasonally important when other animal foods are limited.

This study suggests that the ingenuity required to survive on a diet of elusive insects has been a key factor in the development of uniquely human skills: It may well have been bugs that helped build our brains.

Journal Reference:

  1. Amanda D. Melin, Hilary C. Young, Krisztina N. Mosdossy, Linda M. Fedigan. Seasonality, extractive foraging and the evolution of primate sensorimotor intelligence. Journal of Human Evolution, 2014; 71: 77. DOI: 10.1016/j.jhevol.2014.02.009

Domestication of Dogs May Explain Mammoth Kill Sites and the Success of Early Modern Humans (The Pennsylvania State University)

Pat Shipman and Barbara K. Kennedy

May 30, 2014

A fragment of a large bone, probably from a mammoth, Pat Shipman reports, was placed in this dog’s mouth shortly after death. This finding suggests the animal was accorded special mortuary treatment, perhaps acknowledging its role in mammoth hunting. The fossil comes from the site of Predmosti, in the Czech Republic, and dates to about 27,000 years B.P. This object is one of three canid skulls from Predmosti that were identified as dogs based on analysis of their morphology. Photo credit: Anthropos Museum, Brno, the Czech Republic, courtesy of Mietje Germonpre.

29 May 2014 — A new analysis of European archaeological sites containing large numbers of dead mammoths and dwellings built with mammoth bones has led Penn State Professor Emerita Pat Shipman to formulate a new interpretation of how these sites were formed. She suggests that their abrupt appearance may have been due to early modern humans working with the earliest domestic dogs to kill the mammoth — a now-extinct animal distantly related to the modern-day elephant. Shipman’s analysis also provides a way to test the predictions of her new hypothesis. Advance publication of her article “How do you kill 86 mammoths?” is available online through Quaternary International.

Spectacular archaeological sites yielding stone tools and extraordinary numbers of dead mammoths — some containing the remains of hundreds of individuals — suddenly became common in central and eastern Eurasia between about 45,000 and 15,000 years ago, although mammoths previously had been hunted by humans and their extinct relatives and ancestors for at least a million years. Some of these mysterious sites have huts built of mammoth bones in complex, geometric patterns as well as piles of butchered mammoth bones.

“One of the greatest puzzles about these sites is how such large numbers of mammoths could have been killed with the weapons available during that time,” Shipman said. Many earlier studies of the age distribution of the mammoths at these sites found similarities with modern elephants killed by hunting or natural disasters, but Shipman’s new analysis of the earlier studies found that they lacked the statistical evaluations necessary for concluding with any certainty how these animals were killed.

Surprisingly, Shipman said, she found that “few of the mortality patterns from these mammoth deaths matched either those from natural deaths among modern elephants killed by droughts or by culling operations with modern weapons that kill entire family herds of modern elephants at once.” This discovery suggested to Shipman that a successful new technique for killing such large animals had been developed and its repeated use over time could explain the mysterious, massive collections of mammoth bones in Europe.

These maps show the locations of collections of mammoth bones at the archaeological sites that Pat Shipman analyzed in her paper, which will be published in the journal Quaternary International. Credit: Jeffrey Mathison.

The key to Shipman’s new hypothesis is recent work by a team led by Mietje Germonpré of the Royal Belgian Institute of Natural Sciences, which has uncovered evidence that some of the large carnivores at these sites were early domesticated dogs, not wolves as generally had been assumed. Then, with this evidence as a clue, Shipman used information about how humans hunt with dogs to formulate a series of testable predictions about these mammoth sites.

“Dogs help hunters find prey faster and more often, and dogs also can surround a large animal and hold it in place by growling and charging while hunters move in. Both of these effects would increase hunting success,” Shipman said. “Furthermore, large dogs like those identified by Germonpré either can help carry the prey home or, by guarding the carcass from other carnivores, can make it possible for the hunters to camp at the kill sites.” Shipman said that these predictions already have been confirmed by other analyses. In addition, she said, “if hunters working with dogs catch more prey, have a higher intake of protein and fat, and have a lower expenditure of energy, their reproductive rate is likely to rise.”

Another unusual feature of these large mammoth kill sites is the presence of extraordinary numbers of other predators, particularly wolves and foxes. “Both dogs and wolves are very alert to the presence of other related carnivores — the canids — and they defend their territories and food fiercely,” Shipman explained. “If humans were working and living with domesticated dogs or even semi-domesticated wolves at these archaeological sites, we would expect to find the new focus on killing the wild wolves that we see there.”

The photo shows part of the very-high-density concentration of mammoth bones at the Krakow-Spadzista Street archaeological site. Credit: Piotr Wojtal.

Two other types of studies have yielded data that support Shipman’s hypothesis. Hervé Bocherens and Dorothée Drucker of the University of Tübingen in Germany carried out an isotopic analysis of the bones of wolves and purported dogs from the Czech site of Predmostí. They found that the individuals identified as dogs had different diets from those identified as wolves, possibly indicating feeding by humans. Also, analysis of mitochondrial DNA by Olaf Thalmann of the University of Turku in Finland, and others, showed that the individuals identified as dogs have a distinctive genetic signature that is not known from any other canid. “Since mitochondrial DNA is carried only by females, this finding may indicate that these odd canids did not give rise to modern domesticated dogs and were simply a peculiar, extinct group of wolves,” Shipman said. “Alternatively, it may indicate that early humans did domesticate wolves into dogs or a doglike group, but the female canids interbred with wild wolf males and so the distinctive female mitochondrial DNA lineage was lost.”

As more information is gathered on fossil canids dated to between 45,000 and 15,000 years ago, Shipman’s hunting-dog hypothesis will be supported “if more of these distinctive doglike canids are found at large, long-term sites with unusually high numbers of dead mammoths and wolves; if the canids are consistently large, strong individuals; and if their diets differ from those of wolves,” Shipman said. “Dogs may indeed be man’s best friend.”

Humanity’s forgotten return to Africa revealed in DNA (New Scientist)

20:00 03 February 2014 by Catherine Brahic

Call it humanity’s unexpected U-turn. One of the biggest events in the history of our species is the exodus out of Africa some 65,000 years ago, the start of Homo sapiens’ long march across the world. Now a study of southern African genes shows that, unexpectedly, another migration took western Eurasian DNA back to the very southern tip of the continent 3000 years ago.

According to conventional thinking, the Khoisan tribes of southern Africa have lived in near-isolation from the rest of humanity for thousands of years. In fact, the study shows that some of their DNA most closely matches that of people from modern-day southern Europe, including Spain and Italy.

Because Eurasian people also carry traces of Neanderthal DNA, the finding also shows – for the first time – that genetic material from our extinct cousin may be widespread in African populations.

The Khoisan tribes of southern Africa are hunter-gatherers and pastoralists who speak unique click languages. Their extraordinarily diverse gene pool split from everyone else’s before the African exodus.

Ancient lineages

“These are very special, isolated populations, carrying what are probably the most ancient lineages in human populations today,” says David Reich of Harvard University. “For a lot of our genetic studies we had treated them as groups that had split from all other present-day humans before they had split from each other.”

So he and his colleagues were not expecting to find signs of western Eurasian genes in 32 individuals belonging to a variety of Khoisan tribes. “I think we were shocked,” says Reich.

The unexpected snippets of DNA most resembled sequences from southern Europeans, including Sardinians, Italians and people from the Basque region (see “Back to Africa – but from where?”). Dating methods suggested they made their way into the Khoisan DNA sometime between 900 and 1800 years ago – well before known European contact with southern Africa.

Archaeological and linguistic studies of the region can make sense of the discovery. They suggest that a subset of the Khoisan, known as the Khoe-Kwadi speakers, arrived in southern Africa from east Africa around 2200 years ago. Khoe-Kwadi speakers were – and remain – pastoralists who make their living from herding cows and sheep. The suggestion is that they introduced herding to a region that was otherwise dominated by hunter-gatherers.

Khoe-Kwadi tribes

Reich and his team found that the proportion of Eurasian DNA was highest in Khoe-Kwadi tribes, who have up to 14 per cent of western Eurasian ancestry. What is more, when they looked at the east African tribes from which the Khoe-Kwadi descended, they found a much stronger proportion of Eurasian DNA – up to 50 per cent.

That result confirms a 2012 study by Luca Pagani of the Wellcome Trust Sanger Institute in Hinxton, UK, which found non-African genes in people living in Ethiopia. Both the 2012 study and this week’s new results show that the Eurasian genes made their way into east African genomes around 3000 years ago. About a millennium later, the ancestors of the Khoe-Kwadi headed south, carrying a weaker signal of the Eurasian DNA into southern Africa.
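A simple way to see why the southern signal is weaker is single-pulse dilution arithmetic. The sketch below reuses the ancestry proportions quoted above; the one-pulse mixing model itself is a simplifying assumption for illustration, not something the study claims.

```python
# Assumed single-pulse mixing model (illustrative): Khoe-Kwadi ancestry treated as a blend of
# an east African source carrying ~50% western Eurasian ancestry and local Khoisan groups
# carrying none of it.
eurasian_in_east_african_source = 0.50  # up to ~50% western Eurasian ancestry (quoted above)
eurasian_in_khoe_kwadi = 0.14           # up to ~14% observed in Khoe-Kwadi groups (quoted above)

implied_east_african_fraction = eurasian_in_khoe_kwadi / eurasian_in_east_african_source
print(f"Implied east African contribution: {implied_east_african_fraction:.0%}")
# -> 28%: the Eurasian signal weakens as the migrating herders mix with local Khoisan groups.
```

Under that assumption, roughly a quarter to a third of Khoe-Kwadi ancestry would trace back to the east African herders, which is enough to carry the diluted Eurasian signal south.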

The cultural implications are complex and potentially uncomfortably close to European colonial themes. “I actually am not sure there’s any population that doesn’t have west Eurasian [DNA],” says Reich.

“These populations were always thought to be pristine hunter-gatherers who had not interacted with anyone for millennia,” says Reich’s collaborator, linguist Brigitte Pakendorf of the University of Lyon in France. “Well, no. Just like the rest of the world, Africa had population movements too. There was simply no writing, no Romans or Greeks to document it.”

Twist in tale

There’s one more twist to the tale. In 2010 a research team – including Reich – published the first draft genome of a Neanderthal. Comparisons with living humans revealed traces of Neanderthal DNA in all humans with one notable exception: sub-Saharan peoples like the Yoruba and Khoisan.

That made sense. After early humans migrated out of Africa around 60,000 years ago, they bumped into Neanderthals somewhere in what is now the Middle East. Some got rather cosy with each other. As their descendants spread across the world to Europe, Asia and eventually the Americas, they spread bits of Neanderthal DNA along with their own genes. But because those descendants did not move back into Africa until historical times, most of this continent remained a Neanderthal DNA-free zone.

Or so it seemed at the time. Now it appears that the Back to Africa migration 3000 years ago carried a weak Neanderthal genetic signal deep into the homeland. Indeed one of Reich’s analyses, published last month, found Neanderthal traces in Yoruba DNA (Nature, DOI: 10.1038/nature12886).

In other words, not only is western Eurasian DNA ancestry a global phenomenon, so is having a bit of Neanderthal living on inside you.

Journal reference: PNAS, DOI: 10.1073/pnas.1313787111

Back to Africa – but from where?

Reich and his colleagues found that DNA sequences in the Khoisan people most closely resemble some found in people who today live in southern Europe. That, however, does not mean the migration back to Africa started in Italy or Spain. More likely, the migration began in what is now the Middle East.

We know that southern Europeans can trace their ancestry to the Middle East. However, in the thousands of years since they – and the ancestors of the Khoisan – left the region, it has experienced several waves of immigration. These waves have had a significant effect on the genes of people living in the Middle East today, and mean that southern Europeans are now genetically closer to the original inhabitants of the Levant than modern-day Middle Easterners are.