Tag archive: Evolution

Study: Evolution now accepted by majority of Americans (EurekaAlert!)

News Release 20-Aug-2021

Peer-Reviewed Publication

University of Michigan

The level of public acceptance of evolution in the United States is now solidly above the halfway mark, according to a new study based on a series of national public opinion surveys conducted over the last 35 years.

“From 1985 to 2010, there was a statistical dead heat between acceptance and rejection of evolution,” said lead researcher Jon D. Miller of the Institute for Social Research at the University of Michigan. “But acceptance then surged, becoming the majority position in 2016.”

Examining data over 35 years, the study consistently identified aspects of education—civic science literacy, taking college courses in science and having a college degree—as the strongest factors leading to the acceptance of evolution.

“Almost twice as many Americans held a college degree in 2018 as in 1988,” said co-author Mark Ackerman, a researcher at Michigan Engineering, the U-M School of Information and Michigan Medicine. “It’s hard to earn a college degree without acquiring at least a little respect for the success of science.”

The researchers analyzed a collection of biennial surveys from the National Science Board, several national surveys funded by units of the National Science Foundation, and a series focused on adult civic literacy funded by NASA. Beginning in 1985, these national samples of U.S. adults were asked to agree or disagree with this statement: “Human beings, as we know them today, developed from earlier species of animals.”

The series of surveys showed that Americans were evenly divided on the question of evolution from 1985 to 2007. According to a 2005 study of the acceptance of evolution in 34 developed nations, led by Miller, only Turkey, at 27%, scored lower than the United States. But over the last decade, until 2019, the percentage of American adults who agreed with this statement increased from 40% to 54%.

The current study consistently identified religious fundamentalism as the strongest factor leading to the rejection of evolution. While their numbers declined slightly in the last decade, approximately 30% of Americans continue to be religious fundamentalists as defined in the study. But even those who scored highest on the scale of religious fundamentalism shifted toward acceptance of evolution, rising from 8% in 1988 to 32% in 2019.

Miller predicted that religious fundamentalism would continue to impede the public acceptance of evolution. 

“Such beliefs are not only tenacious but also, increasingly, politicized,” he said, citing a widening gap between Republican and Democratic acceptance of evolution. 

As of 2019, 34% of conservative Republicans accepted evolution compared to 83% of liberal Democrats.

The study is published in the journal Public Understanding of Science.

Besides Miller and Ackerman, the authors are Eugenie Scott and Glenn Branch of the National Center for Science Education; Belén Laspra of the University of Oviedo in Spain; Carmelo Polino of the University of Oviedo and Centro Redes in Argentina; and Jordan Huffaker of U-M.

Study abstract: Public acceptance of evolution in the United States, 1985-2020 


Why we are the only human species on the planet (El País)

brasil.elpais.com

Nuño Domínguez, July 4, 2021, 12:48 BRT

Three major discoveries made in recent days force us to rethink the origins of humanity


Three discoveries in recent days have just changed what we knew about the origin of the human race and of our own species, Homo sapiens. Perhaps, some specialists say, we will need to abandon that concept when referring to ourselves, because the new findings suggest we are a Frankenstein's creature built from parts of other human species with which, not so long ago, we shared the planet, sex and children.

The discoveries of the past week indicate that around 200,000 years ago there were up to eight different human species or groups. All belonged to the genus Homo, which includes us. The newcomers show an intriguing mix of primitive traits, such as enormous brow ridges and a flattened skull, and modern ones. China's "Dragon Man" had a cranial capacity as large as that of present-day humans, or even larger. The Nesher Ramla Homo, found in Israel, may have been the population that gave rise to the Neanderthals and Denisovans who occupied Europe and Asia, respectively, and with whom our species had repeated sexual encounters, producing mixed children who were accepted into their respective tribes as one of their own.

We now know that, because of those crossings, everyone outside Africa carries about 3% Neanderthal DNA, and that the inhabitants of Tibet carry genes passed on by the Denisovans that allow them to live at high altitude. Something far more unsettling was revealed by genetic analysis of the present-day populations of New Guinea: the Denisovans, a sister branch of the Neanderthals, may have survived until just 15,000 years ago, a very short distance in evolutionary terms.

The third major discovery of recent days is almost detective work. Analysis of DNA preserved in the soil of Denisova Cave, in Siberia, turned up genetic material from the cave's native humans, the Denisovans, from Neanderthals and from sapiens in periods so close together that they may even have overlapped. Three years ago the same cave yielded the remains of the first known hybrid between human species: a girl whose mother was a Neanderthal and whose father was a Denisovan.

The paleoanthropologist Florent Detroit brought another of these new human species to science: Homo luzonensis, which lived on an island in the Philippines 67,000 years ago and shows a strange mix of traits that could be the result of its long evolution in isolation over more than a million years. Its story somewhat resembles that of its contemporary Homo floresiensis, the "Flores man," a human a meter and a half tall that lived on an Indonesian island. It had a brain the size of a chimpanzee's, but by the intelligence test paleoanthropologists use most, it was as advanced as sapiens, since its stone tools were just as sophisticated.

X-ray image of the jawbone of the Nesher Ramla 'Homo' discovered in Israel. Credit: Ariel Pokhojaev

To these two island dwellers we can add Homo erectus, the first traveling Homo, which left Africa about two million years ago. It conquered Asia and lived there until at least 100,000 years ago. The eighth passenger in this story would be Homo daliensis, a fossil found in China with a mix of erectus and sapiens traits, although it may end up being included in the new Homo longi lineage.

"It does not surprise me that several human species were alive at the same time," says Detroit. "If we consider the last geological period, which began 2.5 million years ago, there have always been different genera and species of hominins sharing the planet. The great exception is the present day; never before had there been just one human species on Earth," he acknowledges. Why are we, the sapiens, the only survivors?

For Juan Luis Arsuaga, paleoanthropologist at the Atapuerca archaeological site in northern Spain, the answer is that "we are a hypersocial species, the only one capable of building bonds beyond kinship, unlike all other mammals." "We share consensual fictions such as homeland, religion, language, football teams; and we will sacrifice a great deal for them," he notes. Not even the human species closest to us, the Neanderthals, who made ornaments, symbols and art, behaved this way. Arsuaga sums it up: "The Neanderthals had no flag." For reasons still unknown, that species went extinct about 40,000 years ago.

Sapiens were not "strictly superior" to their congeners, says Antonio Rosas, paleoanthropologist at the Spanish National Research Council (CSIC). "We now know that we are the result of hybridizations with other species, and the set of characteristics we carry was the perfect one for that moment," he explains. One possible additional advantage is that sapiens groups were more numerous than Neanderthal ones, which meant less inbreeding and healthier populations.

Detroit believes part of the explanation lies in the very essence of our species, sapiens, "wise" in Latin. "We have an enormous brain that we must feed, so we need a lot of resources and, therefore, a lot of territory," he notes. "Homo sapiens underwent a huge demographic expansion, and it is quite possible that the competition for territory was very hard on the other species," he adds.

María Martinón-Torres, director of the National Research Center on Human Evolution (CENIEH), based in Burgos, believes the secret is "hyperadaptability." "Ours is an invasive species, not necessarily ill-intentioned, but we are like the Attila's horse of evolution," she says. "Wherever we pass, with our way of life, biological diversity declines, including human diversity. We are one of the planet's most powerful ecological forces, and that history, ours, began to take shape in the Pleistocene [the period that began 2.5 million years ago and ended about 10,000 years ago, by which time sapiens was already the only human species left on the planet]," she adds.

The discoveries of recent days once again expose a growing problem: scientists keep naming more and more human species. Does that make sense? For the Israeli paleoanthropologist Israel Hershkovitz, discoverer of the Nesher Ramla Homo, it does not. "There are too many species," he says. "The classic definition says that two different species cannot have fertile offspring. DNA tells us that sapiens, Neanderthals and Denisovans did, so they should be considered the same species," he points out.

"If we are sapiens, then those species that are our ancestors through interbreeding are sapiens too," adds João Zilhão, professor at the Catalan Institution for Research and Advanced Studies at the University of Barcelona.

This question divides the experts. "Hybridization is very common among living species, especially in the plant world," notes José María Bermúdez de Castro, co-director of research at Atapuerca. "The species concept can be nuanced, but I don't think we can abandon it, because it is very useful for understanding one another," he stresses.

Excavations at the Nesher Ramla archaeological site. Credit: Zaidner

Many nuances come into play here. The evident difference between sapiens and Neanderthals is not the same thing as the species identity of Homo luzonensis, of which we know only a few bones and teeth, or of the Denisovans, for whom most of our information comes from DNA extracted from minuscule fossils.

"Curiously, despite the frequent interbreeding, both sapiens and Neanderthals remained perfectly recognizable and distinguishable species until the end," Martinón-Torres points out. "The traits of late Neanderthals are more marked than those of earlier ones, rather than having been erased as a consequence of interbreeding. There were biological, and perhaps also cultural, exchanges, but neither species stopped being itself: distinctive, recognizable in its biology, its appearance, its specific adaptations, its ecological niche throughout its evolutionary history. I think this is the best example that hybridization does not necessarily collide with the species concept," she concludes. Her colleague Hershkovitz warns that the debate will go on: "We are excavating three other caves in Israel where we have found human fossils that will give us a new perspective on human evolution."

UMaine researchers: Culture drives human evolution more than genetics (Eureka Alert!)

News Release 2-Jun-2021

University of Maine

Research News

In a new study, University of Maine researchers found that culture helps humans adapt to their environment and overcome challenges better and faster than genetics.

After conducting an extensive review of the literature and evidence of long-term human evolution, scientists Tim Waring and Zach Wood concluded that humans are experiencing a “special evolutionary transition” in which the importance of culture, such as learned knowledge, practices and skills, is surpassing the value of genes as the primary driver of human evolution.

Culture is an under-appreciated factor in human evolution, Waring says. Like genes, culture helps people adjust to their environment and meet the challenges of survival and reproduction. Culture, however, does so more effectively than genes because the transfer of knowledge is faster and more flexible than the inheritance of genes, according to Waring and Wood.

Culture is a stronger mechanism of adaptation for a couple of reasons, Waring says. It’s faster: gene transfer occurs only once a generation, while cultural practices can be rapidly learned and frequently updated. Culture is also more flexible than genes: gene transfer is rigid and limited to the genetic information of two parents, while cultural transmission is based on flexible human learning and effectively unlimited with the ability to make use of information from peers and experts far beyond parents. As a result, cultural evolution is a stronger type of adaptation than old genetics.
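
To make that speed-and-flexibility argument concrete, here is a minimal toy simulation in Python. It is not taken from Waring and Wood's paper; every parameter and function below is an invented illustration. A genetically transmitted trait is updated once per generation by blending two parents, while a culturally transmitted trait is copied from successful peers several times per generation:

    import random

    TARGET = 1.0   # environmental optimum for some trait
    POP = 200
    GENERATIONS = 20

    def fitness(trait):
        # closer to the optimum is fitter
        return -abs(trait - TARGET)

    genetic = [random.uniform(0.0, 0.2) for _ in range(POP)]   # inherited traits
    cultural = list(genetic)                                   # learned traits

    for gen in range(GENERATIONS):
        # Genetic transmission: once per generation, offspring blend two
        # parents drawn from the fitter half, plus a small mutation.
        parents = sorted(genetic, key=fitness)[POP // 2:]
        genetic = [(random.choice(parents) + random.choice(parents)) / 2
                   + random.gauss(0, 0.01) for _ in range(POP)]

        # Cultural transmission: several learning episodes per generation,
        # each copying the best-performing peer with some copying error.
        for _ in range(5):
            best = max(cultural, key=fitness)
            cultural = [best + random.gauss(0, 0.01) for _ in cultural]

        print(gen, round(sum(genetic) / POP, 3), round(sum(cultural) / POP, 3))

Under these assumptions the cultural population reaches the environmental optimum within a few generations, while the genetic population closes the same gap far more slowly: the paper's core claim in miniature.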

Waring, an associate professor of social-ecological systems modeling, and Wood, a postdoctoral research associate with the School of Biology and Ecology, have just published their findings in a literature review in the Proceedings of the Royal Society B, the flagship biological research journal of The Royal Society in London.

“This research explains why humans are such a unique species. We evolve both genetically and culturally over time, but we are slowly becoming ever more cultural and ever less genetic,” Waring says.

Culture has influenced how humans survive and evolve for millennia. According to Waring and Wood, the combination of both culture and genes has fueled several key adaptations in humans such as reduced aggression, cooperative inclinations, collaborative abilities and the capacity for social learning. Increasingly, the researchers suggest, human adaptations are steered by culture, and require genes to accommodate.

Waring and Wood say culture is also special in one important way: it is strongly group-oriented. Factors like conformity, social identity and shared norms and institutions — factors that have no genetic equivalent — make cultural evolution very group-oriented, according to researchers. Therefore, competition between culturally organized groups propels adaptations such as new cooperative norms and social systems that help groups survive better together.

According to researchers, “culturally organized groups appear to solve adaptive problems more readily than individuals, through the compounding value of social learning and cultural transmission in groups.” Cultural adaptations may also occur faster in larger groups than in small ones.

With groups primarily driving culture and culture now fueling human evolution more than genetics, Waring and Wood found that evolution itself has become more group-oriented.

“In the very long term, we suggest that humans are evolving from individual genetic organisms to cultural groups which function as superorganisms, similar to ant colonies and beehives,” Waring says. “The ‘society as organism’ metaphor is not so metaphorical after all. This insight can help society better understand how individuals can fit into a well-organized and mutually beneficial system. Take the coronavirus pandemic, for example. An effective national epidemic response program is truly a national immune system, and we can therefore learn directly from how immune systems work to improve our COVID response.”

###

Waring is a member of the Cultural Evolution Society, an international research network that studies the evolution of culture in all species. He applies cultural evolution to the study of sustainability in social-ecological systems and cooperation in organizational evolution.

Wood works in the UMaine Evolutionary Applications Laboratory managed by Michael Kinnison, a professor of evolutionary applications. His research focuses on eco-evolutionary dynamics, particularly rapid evolution during trophic cascades.

Neanderthals carb loaded, helping grow their big brains (Science)

sciencemag.org

By Ann Gibbons, May 10, 2021, 3:00 PM


A reconstruction of Neanderthal mealtime. Credit: Mauricio Anton/Science Source

Here’s another blow to the popular image of Neanderthals as brutish meat eaters: A new study of bacteria collected from Neanderthal teeth shows that our close cousins ate so many roots, nuts, or other starchy foods that they dramatically altered the type of bacteria in their mouths. The finding suggests our ancestors had adapted to eating lots of starch by at least 600,000 years ago—about the same time as they needed more sugars to fuel a big expansion of their brains.

The study is “groundbreaking,” says Harvard University evolutionary biologist Rachel Carmody, who was not part of the research. The work suggests the ancestors of both humans and Neanderthals were cooking lots of starchy foods at least 600,000 years ago. And they had already adapted to eating more starchy plants long before the invention of agriculture 10,000 years ago, she says.

The brains of our ancestors doubled in size between 2 million and 700,000 years ago. Researchers have long credited better stone tools and cooperative hunting: As early humans got better at killing animals and processing meat, they ate a higher quality diet, which gave them more energy more rapidly to fuel the growth of their hungrier brains.

Still, researchers have puzzled over how meat did the job. “For human ancestors to efficiently grow a bigger brain, they needed energy dense foods containing glucose”—a type of sugar—says molecular archaeologist Christina Warinner of Harvard and the Max Planck Institute for the Science of Human History. “Meat is not a good source of glucose.”

Researchers analyzed the bacterial DNA preserved in dental plaque of fossilized teeth, such as this one from a prehistoric human. Credit: Werner Siemens Foundation/Felix Wey

The starchy plants gathered by many living hunter-gatherers are an excellent source of glucose, however. To figure out whether oral bacteria track changes in diet or the environment, Warinner, Max Planck graduate student James Fellows Yates, and a large international team looked at the oral bacteria stuck to the teeth of Neanderthals, preagricultural modern humans who lived more than 10,000 years ago, chimps, gorillas, and howler monkeys. The researchers analyzed billions of DNA fragments from long-dead bacteria still preserved on the teeth of 124 individuals. One was a Neanderthal who lived 100,000 years ago at Pešturina Cave in Serbia, which produced the oldest oral microbiome genome reconstructed to date.

The communities of bacteria in the mouths of preagricultural humans and Neanderthals strongly resembled each other, the team reports today in the Proceedings of the National Academy of Sciences. In particular, humans and Neanderthals harbored an unusual group of Streptococcus bacteria in their mouths. These microbes had a special ability to bind to an abundant enzyme in human saliva called amylase, which frees sugars from starchy foods. The presence of the strep bacteria that consume sugar on the teeth of Neanderthals and ancient modern humans, but not chimps, shows they were eating more starchy foods, the researchers conclude.

Finding the streptococci on the teeth of both ancient humans and Neanderthals also suggests they inherited these microbes from their common ancestor, who lived more than 600,000 years ago. Although earlier studies found evidence that Neanderthals ate grasses and tubers and cooked barley, the new study indicates they ate so much starch that it dramatically altered the composition of their oral microbiomes.

“This pushes the importance of starch in the diet further back in time,” to when human brains were still expanding, Warinner says. Because the amylase enzyme is much more efficient at digesting cooked rather than raw starch, the finding also suggests cooking, too, was common by 600,000 years ago, Carmody says. Researchers have debated whether cooking became common when the big brain began to expand almost 2 million years ago or it spread later, during a second surge of growth.

The study offers a new way to detect major shifts in diet, says geneticist Ran Blekhman of the University of Minnesota, Twin Cities. In the case of Neanderthals, it reveals how much they depended on plants.

“We sometimes have given short shrift to the plant components of the diet,” says anthropological geneticist Anne Stone of Arizona State University, Tempe. “As we know from modern hunter-gatherers, it’s often the gathering that ends up providing a substantial portion of the calories.”

Humans were apex predators for two million years (Eureka Alert!)

News Release 5-Apr-2021

What did our ancestors eat during the stone age? Mostly meat

Tel-Aviv University

Image: Human brain. Credit: Dr. Miki Ben Dor

Researchers at Tel Aviv University were able to reconstruct the nutrition of stone age humans. In a paper published in the Yearbook of the American Physical Anthropology Association, Dr. Miki Ben-Dor and Prof. Ran Barkai of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, together with Raphael Sirtoli of Portugal, show that humans were an apex predator for about two million years. Only the extinction of larger animals (megafauna) in various parts of the world, and the decline of animal food sources toward the end of the stone age, led humans to gradually increase the vegetable element in their nutrition, until finally they had no choice but to domesticate both plants and animals – and became farmers.

“So far, attempts to reconstruct the diet of stone-age humans were mostly based on comparisons to 20th century hunter-gatherer societies,” explains Dr. Ben-Dor. “This comparison is futile, however, because two million years ago hunter-gatherer societies could hunt and consume elephants and other large animals – while today’s hunter gatherers do not have access to such bounty. The entire ecosystem has changed, and conditions cannot be compared. We decided to use other methods to reconstruct the diet of stone-age humans: to examine the memory preserved in our own bodies, our metabolism, genetics and physical build. Human behavior changes rapidly, but evolution is slow. The body remembers.”

In a process unprecedented in its extent, Dr. Ben-Dor and his colleagues collected about 25 lines of evidence from about 400 scientific papers from different scientific disciplines, dealing with the focal question: Were stone-age humans specialized carnivores or were they generalist omnivores? Most evidence was found in research on current biology, namely genetics, metabolism, physiology and morphology.

“One prominent example is the acidity of the human stomach,” says Dr. Ben-Dor. “The acidity in our stomach is high when compared to omnivores and even to other predators. Producing and maintaining strong acidity require large amounts of energy, and its existence is evidence for consuming animal products. Strong acidity provides protection from harmful bacteria found in meat, and prehistoric humans, hunting large animals whose meat sufficed for days or even weeks, often consumed old meat containing large quantities of bacteria, and thus needed to maintain a high level of acidity. Another indication of being predators is the structure of the fat cells in our bodies. In the bodies of omnivores, fat is stored in a relatively small number of large fat cells, while in predators, including humans, it’s the other way around: we have a much larger number of smaller fat cells. Significant evidence for the evolution of humans as predators has also been found in our genome. For example, geneticists have concluded that ‘areas of the human genome were closed off to enable a fat-rich diet, while in chimpanzees, areas of the genome were opened to enable a sugar-rich diet.’”

Evidence from human biology was supplemented by archaeological evidence. For instance, research on stable isotopes in the bones of prehistoric humans, as well as hunting practices unique to humans, show that humans specialized in hunting large and medium-sized animals with high fat content. Comparing humans to large social predators of today, all of whom hunt large animals and obtain more than 70% of their energy from animal sources, reinforced the conclusion that humans specialized in hunting large animals and were in fact hypercarnivores.

“Hunting large animals is not an afternoon hobby,” says Dr. Ben-Dor. “It requires a great deal of knowledge, and lions and hyenas attain these abilities after long years of learning. Clearly, the remains of large animals found in countless archaeological sites are the result of humans’ high expertise as hunters of large animals. Many researchers who study the extinction of the large animals agree that hunting by humans played a major role in this extinction – and there is no better proof of humans’ specialization in hunting large animals. Most probably, like in current-day predators, hunting itself was a focal human activity throughout most of human evolution. Other archaeological evidence – like the fact that specialized tools for obtaining and processing vegetable foods only appeared in the later stages of human evolution – also supports the centrality of large animals in the human diet, throughout most of human history.”

The multidisciplinary reconstruction conducted by TAU researchers for almost a decade proposes a complete change of paradigm in the understanding of human evolution. Contrary to the widespread hypothesis that humans owe their evolution and survival to their dietary flexibility, which allowed them to combine the hunting of animals with vegetable foods, the picture emerging here is of humans evolving mostly as predators of large animals.

“Archaeological evidence does not overlook the fact that stone-age humans also consumed plants,” adds Dr. Ben-Dor. “But according to the findings of this study plants only became a major component of the human diet toward the end of the era.”

Evidence of genetic changes and the appearance of unique stone tools for processing plants led the researchers to conclude that, starting about 85,000 years ago in Africa, and about 40,000 years ago in Europe and Asia, a gradual rise occurred in the consumption of plant foods as well as dietary diversity – in accordance with varying ecological conditions. This rise was accompanied by an increase in the local uniqueness of the stone tool culture, which is similar to the diversity of material cultures in 20th-century hunter-gatherer societies. In contrast, during the two million years when, according to the researchers, humans were apex predators, long periods of similarity and continuity were observed in stone tools, regardless of local ecological conditions.

“Our study addresses a very great current controversy – both scientific and non-scientific,” says Prof. Barkai. “For many people today, the Paleolithic diet is a critical issue, not only with regard to the past, but also concerning the present and future. It is hard to convince a devout vegetarian that his/her ancestors were not vegetarians, and people tend to confuse personal beliefs with scientific reality. Our study is both multidisciplinary and interdisciplinary. We propose a picture that is unprecedented in its inclusiveness and breadth, which clearly shows that humans were initially apex predators, who specialized in hunting large animals. As Darwin discovered, the adaptation of species to obtaining and digesting their food is the main source of evolutionary changes, and thus the claim that humans were apex predators throughout most of their development may provide a broad basis for fundamental insights on the biological and cultural evolution of humans.”

Can Evolution Explain All Dark Animal Behaviors? (Discover)

discovermagazine.com

Many actions that humans would consider heinous — cannibalism, eating offspring, torture and rape — have been observed in the animal kingdom. Most (but not all) eyebrow-raising behaviors among animals have an evolutionary underpinning.

By Tim Brinkhof, March 9, 2021 3:00 PM

(Credit: Sharon Morris/Shutterstock)

“In sober truth,” wrote the British philosopher John Stuart Mill, “nearly all the things which men are hanged or imprisoned for doing to one another, are nature’s everyday performances.” While it is true that rape, torture and murder are more commonplace in the animal kingdom than they are in human civilization, our fellow creatures almost always seem to have some kind of evolutionary justification for their actions — one that we Homo sapiens lack.

Cats, for instance, are known to toy with small birds and rodents before finally killing them. Although it is easy to conclude that this makes the popular pet a born sadist, some zoologists have proposed that exhausting prey is the safest way of catching them. Similarly, it’s tempting to describe the way African lions and bottlenose dolphins –– large, social mammals –– commit infanticide (the killing of young offspring) as possibly psychopathic. Interestingly, experts suspect that these creatures are in fact doing themselves a favor; by killing offspring, adult males are making their female partners available to mate again.

These behaviors, which initially may seem symptomatic of some sinister psychological defect, turn out to be nothing more than different examples of the kind of selfishness that evolution is full of. Well played, Mother Nature.

But what if harming others is of no benefit to the assailant? In the human world, senseless destruction features on virtually every evening news program. In the animal world, where the laws of nature –– so we’ve been taught –– don’t allow for moral crises, it’s a different story. By all accounts, such undermining behavior shouldn’t be able to occur. Yet it does, and it’s as puzzling to biologists as the existence of somebody like Ted Bundy or Adolf Hitler has been to theodicists –– those who follow a philosophy of religion that ponders why God permits evil.

Cains and Abels

According to Charles Darwin’s theory of evolution, genes that increase an organism’s ability to survive are passed down, while those that don’t are not. Although Darwin remains an important reference point for how humans interpret the natural world, he is not infallible. During the 1960s, biologist W.D. Hamilton proposed that On the Origin of Species failed to account for the persistence of two kinds of traits that don’t directly benefit the animal in question: altruism and spite.

The first of these two patterns –– altruism –– was amalgamated into Darwin’s theory of evolution when researchers uncovered its evolutionary benefits. One would think that creatures are hardwired to avoid self-sacrifice, but this is not the case. The common vampire bat shares its food with roostmates whose hunt ended in failure. Recently, Antarctic plunder fish have been found to guard the nests of others if they are left unprotected. In both of these cases, altruistic behavior is put on display when the indirect benefit to relatives of the animal in question outweighs the direct cost incurred by that animal.

In Search of Spite

The second animal behavior –– spite –– continues to be difficult to make sense of. For humans, its concept is a familiar yet elusive one, perhaps understood best through the Biblical story of Cain and Abel or the writings of Fyodor Dostoevsky. Although a number of prominent evolutionary biologists –– from Frans de Waal to members of the West Group at the University of Oxford’s Department of Zoology –– have made entire careers out of studying the overlap between animal and human behavior, even they warn against the stubborn tendency to anthropomorphize nonhuman subjects.

As Edward O. Wilson put it in his book “The Insect Societies,” spite refers to any “behavior that gains nothing or may even diminish the fitness of the individual performing the act, but is definitely harmful to the fitness of another.” Wilson’s definition, which is generally accepted by biologists, allows researchers to study its occurrence in an objective, non-anthropomorphized manner. It initially drew academic attention to species of fish and birds that destroyed the eggs (hatched or unhatched) of rival nests, all at no apparent benefit to them.

Emphasis on “apparent,” though, because –– as those lions and dolphins demonstrated earlier –– certain actions and consequences aren’t always what we think they are. In their research, biologists Andy Gardner and Stuart West maintain that many of the animal behaviors which were once thought spiteful are now understood as selfish. Not in the direct sense of the word (approaching another nest often leads to brutal clashes with its guardian), but an indirect one: With fewer generational competitors, the murderer’s own offspring are more likely to thrive.

For a specific action to be considered true spite, a few more conditions have to be met. The cost incurred by the party acting out the behavior must be “smaller than the product of the negative benefit to the recipient and negative relatedness of the recipient to the actor,” Gardner and West wrote in Current Biology. In other words, a creature counts as spiteful only if the harm it inflicts on a distantly related rival is large enough, weighted by that negative relatedness, to outweigh the cost the creature pays itself. So far, true spite has only been observed rarely in the animal kingdom, and mostly occurs among smaller creatures.
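
Gardner and West’s condition is easier to parse as an instance of Hamilton’s rule, a standard inequality in social evolution theory (the restatement below is textbook material, not a quotation from this article). A behavior spreads when

    r * b - c > 0

where c is the fitness cost to the actor, b is the fitness effect on the recipient, and r is their genetic relatedness. Altruism is the case b > 0 and r > 0: helping kin pays. Spite is the case b < 0 (the recipient is harmed) and r < 0 (the recipient is less related to the actor than a random member of the population is), so the rule rearranges to (-r)(-b) > c: the product of the negative benefit and negative relatedness must exceed the actor’s cost, which is exactly the wording above.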

The larvae of polyembryonic parasitoid wasps, which hatch from eggs that are laid on top of caterpillar eggs, occasionally develop into adults that are not just infertile but have a habit of eating other larvae. From an evolutionary perspective, developing into this infertile form is not a smart move for the wasp because it cannot pass on its genes to the next generation. Nor does it help the creature’s relatives survive, as they are then at risk of being eaten.

That doesn’t mean spite is relegated to the world of insects. It also pops up among monkeys, where it tends to manifest in more recognizable forms. In a 2016 study, Harvard University psychology researchers Kristin Leimgruber and Alexandra Rosati separated chimpanzees and capuchins from the rest of the group during feeding time and gave them the option to take away everyone’s food. While the chimps only ever denied food to those who violated their group’s social norms, the capuchins often acted simply out of spite. As Leimgruber explains: “Our study provides the first evidence of a non-human primate choosing to punish others simply because they have more. This sort of ‘if I can’t have it, no one can’ response is consistent with psychological spite, a behavior previously believed unique to humans.”

Beyond the Dark Tetrad

Of course, spite isn’t the only type of complex and curiously human behavior for which the principles of evolution have not produced an easily discoverable (or digestible) answer. Just as confounding are the four components of the Dark Tetrad — a model for categorizing malevolent behaviors, assembled by personality psychologist Delroy Paulhus. The framework’s traits include narcissism, Machiavellianism, psychopathy and everyday sadism.

Traces of all four have been found inside the animal kingdom. The intertribal warfare among chimpanzees is, first and foremost, a means of controlling resources. At the same time, many appear to actively enjoy partaking in hyperviolent patrols. Elsewhere, primate researchers who have made advances in the assessment of great ape psychology suggest the existence of psychopathic personality types. As for Machiavellianism, the willingness to hurt relatives in order to protect oneself has been observed in both rhesus macaques and Nile tilapia.

Although the reasons for certain types of animal behavior are still debated, the nature of these discussions tends to be markedly different from discourse around, say, the motivations of serial killers. And often, researchers have a solid understanding of the motivations and feelings of their own study subjects but not those outside of their purview. Regardless of whether the academic community is talking about humans or animals, however, the underlying conviction guiding the conversation — that every action, no matter how upsetting or inexplicable, must have a logical explanation — is one and the same.

Israeli Archaeologists Present Groundbreaking Universal Theory of Human Evolution (Haaretz)

Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai proffer novel hypothesis, showing how the greed of Homo erectus set us careening down an anomalous evolutionary path

Ruth Schuster, Feb. 25, 2021

Why the human brain evolved as it did never has been plausibly explained. Apparently, not since the first life-form billions of years ago did a single species gain dominance over all others – until we came along. Now, in a groundbreaking paper, two Israeli researchers propose that our anomalous evolution was propelled by the very mass extinctions we helped cause. Or: As we sawed off the culinary branches from which we swung, we had to get ever more inventive in order to survive.

As ambling, slow-to-reproduce large animals diminished and gradually went extinct, we were forced to resort to smaller, nimbler animals that flee as a strategy to escape predation. To catch them, we had to get smarter, nimbler and faster, according to the universal theory of human evolution proposed by researchers Miki Ben-Dor and Prof. Ran Barkai of Tel Aviv University, in a paper published in the journal Quaternary.

In fact, the great African megafauna began to decline about 4.6 million years ago. But our story begins with Homo habilis, which lived about 2.6 million years ago and apparently used crude stone tools to help it eat flesh, and with Homo erectus, which thronged Africa and expanded to Eurasia about 2 million years ago. The thing is, erectus wasn’t an omnivore: it was a carnivore, Ben-Dor explains to Haaretz.

“Eighty percent of mammals are omnivores but still specialize in a narrow food range. If anything, it seems Homo erectus was a hyper-carnivore,” he observes.

And in the last couple of million years, our brains tripled in size, reaching a maximum cranial capacity of about 1,500 cubic centimeters (cc) roughly 300,000 years ago. We also gradually but consistently ramped up in technology and culture – until the Neolithic revolution and the advent of the sedentary lifestyle, when our brains shrank to about 1,300 to 1,400 cc, but more on that anomaly later.

The hypothesis suggested by Ben-Dor and Barkai – that we ate our way to our present physical, cultural and ecological state – is an original unifying explanation for the behavioral, physiological and cultural evolution of the human species.

Out of chaos

Evolution is chaotic. Charles Darwin came up with the theory of the survival of the fittest, and nobody has a better suggestion yet, but mutations aren’t “planned.” Bodies aren’t “designed,” if we leave genetic engineering out of it. The point is, evolution isn’t linear but chaotic, and that should theoretically apply to humans too.

Hence, it is strange that certain changes in the course of millions of years of human history, including the expansion of our brain, tool manufacture techniques and use of fire, for example, were uncharacteristically progressive, say Ben-Dor and Barkai.

“Uncharacteristically progressive” means that certain traits such as brain size, or cultural developments such as fire usage, evolved in one direction over a long time, in the direction of escalation. That isn’t what chaos is expected to produce over vast spans of time, Barkai explains to Haaretz: it is bizarre. Very few parameters behave like that.

So their discovery of a correlation between the shrinking average weight of African animals, the extinction of megafauna and the development of the human brain is intriguing.

From mammoth marrow to joint of rat

To be clear, just this month a new paper posited that the late Quaternary extinction of megafauna, in the last few tens of thousands of years, wasn’t entirely the fault of humanity. In North America specifically, it was due primarily to climate change, with the late-arriving humans apparently providing the coup de grâce to some species.

In the Old World, however, a human role is clearer. African megafauna apparently began to decline 4.6 million years ago, but during the Pleistocene (2.6 million to 11,600 years ago) the size of African animals trended sharply down, in what the authors term an abrupt reversal from a continuous growth trend of 65 million years (i.e., since the dinosaurs almost died out).

When Homo erectus the carnivore began to roam Africa around 2 million years ago, land mammals averaged nearly 500 kilograms. Barkai’s team and others have demonstrated that hominins ate elephants and large animals when they could. In fact, originally Africa had six elephant species (today there are two: the bush elephant and forest elephant). By the end of the Pleistocene, by which time all hominins other than modern humans were extinct too, that average weight of the African animal had shrunk by more than 90 percent.

And during the Pleistocene, as the African animals shrank, the Homo genus grew taller and more gracile, and our stone tool technology improved (which in no way diminished our affection for archaic implements like the hand ax or chopper, both of which remained in use for more than a million years, even as more sophisticated technologies were developed).

If we started some 3.3 million years ago with large, crude stone hammers that may have been used to bang big animals on the head or break bones to get at the marrow, over the epochs we invented the spear for remote slaughter. By about 80,000 years ago, the bow and arrow was making its appearance, which was more suitable for bringing down small fry like small deer and birds. Over a million years ago, we began to use fire, and later achieved better control of it, meaning the ability to ignite it at will. Later we domesticated the dog from the wolf, and it would help us hunt smaller, fleet animals.

Why did the earliest humans hunt large animals anyway? Wouldn’t a peeved elephant be more dangerous than a rat? Arguably, but catching one elephant is easier than catching a large number of rats. And megafauna had more fat.

A modern human can only derive up to about 50 percent of calories from lean meat (protein): past a certain point, our livers can’t digest more protein. We need energy from carbs or fat, but before developing agriculture about 10,000 years ago, a key source of calories had to be animal fat.
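
A back-of-the-envelope version of that constraint, using illustrative numbers that are assumptions here rather than figures from Ben-Dor and Barkai’s paper:

    Daily energy budget of an active forager:   ~3,000 kcal (assumed)
    Ceiling on protein (~50% of calories):      ~1,500 kcal / 4 kcal per g = ~375 g protein
    Remainder that must come from fat:          ~1,500 kcal / 9 kcal per g = ~165 g fat

With carbohydrate-rich plants scarce or seasonal, something like 165 grams of animal fat had to be found every day, and one large, fat-rich animal supplies that far more cheaply than hundreds of lean rabbits.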

Big animals have a lot of fat. Small animals don’t. In Africa and Europe, and in Israel too, the researchers found a significant decline in the prevalence of animals weighing over 200 kilograms correlated to an increase in the volume of the human brain. Thus, Ben-Dor and Barkai deduce that the declining availability of large prey seems to have been a key element in the natural selection from Homo erectus onward. Catching one elephant is more efficient than catching 1,000 rabbits, but if we must catch 1,000 rabbits, improved cunning, planning and tools are in order.

Say it with fat

Our changing hunting habits would have had cultural impacts too, Ben-Dor and Barkai posit. “Cultural evolution in archaeology usually refers to objects, such as stone tools,” Ben-Dor tells Haaretz. But cultural evolution also refers to learned behavior, such as our choice of which animals to hunt, and how.

Thus, they posit, our hunting conundrum may have also been a key element in the emergence of that enigmatic human characteristic: complex language. When language began, and with which ancestor of Homo sapiens, if any before us, is hotly debated.

Ben-Dor, an economist by training prior to obtaining a Ph.D. in archaeology, believes it began early. “We just need to follow the money. When speaking of evolution, one must follow the energy. Language is energetically costly. Speaking requires devotion of part of the brain, which is costly. Our brain consumes huge amounts of energy. It’s an investment, and language has to produce enough benefit to make it worthwhile. What did language bring us? It had to be more energetically efficient hunting.”

Domestication of the dog also requires resources and, therefore, also had to bring sufficient compensation in the form of more efficient hunting of smaller animals, he points out. That may help explain the fact that Neolithic humans not only embraced the dog but ate it too, going by archaeological evidence of butchered dogs.

At the end of the day, wherever we went, humans devastated the local ecologies, given enough time.

There is a lot of thinking about the Neolithic agricultural revolution. Some think grain farming was driven by the desire to make beer. Given residue analysis indicating that it’s been around for over 10,000 years, that theory isn’t as far-fetched as one might think. Ben-Dor and Barkai suggest that once we could grow our own food and husband herbivores, the megafauna almost entirely gone, hunting for them became too energy-costly. So we had to use our large brains to develop agriculture.

And as the hunter-gathering lifestyle gave way to permanent settlement, our brain size decreased.

Note, Ben-Dor adds, that the brains of wolves, which have to hunt to survive, are larger than those of their domesticated descendants, dogs. We did promise more on that brain-shrinkage anomaly; that was it. Also: the chimpanzee brain has remained stable for 7 million years, since the split with the Homo line, Barkai points out.

“Why does any of this matter?” Ben-Dor asks. “People think humans reached this condition because it was ‘meant to be.’ But in the Earth’s 4.5 billion years, there have been billions of species. They rose and fell. What’s the probability that we would take over the world? It’s an accident of nature. It never happened before that one species achieved dominance over all, and now it’s all over. How did that happen? This is the answer: A non-carnivore entered the niche of carnivore, and ate out its niche. We can’t eat that much protein: we need fat too. Because we needed the fat, we began with the big animals. We hunted the prime adult animals which have more fat than the kiddies and the old. We wiped out the prime adults who were crucial to survival of species. Because of our need for fat, we wiped out the animals we depended on. And this required us to keep getting smarter and smarter, and thus we took over the world.”

Study suggests environmental factors had a role in the evolution of human tolerance (Eureka Alert)

News Release 3-Feb-2021

Study suggests environmental factors had a role in the evolution of human tolerance and friendliness

University of York

Environmental pressures may have led humans to become more tolerant and friendly towards each other as the need to share food and raw materials became mutually beneficial, a new study suggests.

This behaviour was not an inevitable natural progression, but subject to ecological pressures, the University of York study concludes.

Humans have a remarkable capacity to care about people well outside their own kin or local group. Whilst most other animals tend to be defensive towards those in other groups, our natural tolerance allows us to collaborate today on a global scale, as seen with trade or international relief efforts to provide aid for natural disasters.

Using computer simulations of many thousands of individuals gathering resources for their group and interacting with individuals from other groups, the research team attempted to establish what key evolutionary pressures may have prompted human intergroup tolerance.

The study suggests this may have begun when humans began to leave Africa and during a period of increasingly harsh and variable environments.

The study focused on the period 300,000 to 30,000 years ago, when archaeological evidence indicates greater mobility and more frequent interactions between different groups. In particular, this is a time in which raw materials moved over much longer distances and between groups.

The researchers found that populations that shared resources across group boundaries were more successful and more likely to survive harsh environments, where local extinctions occur, than populations that did not share.

However, in resource-rich environments sharing was less advantageous, and in extremely harsh environments populations were too small for sharing to be feasible.
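
The release does not reproduce the team’s model, but the logic of these findings can be sketched with a minimal agent-based toy in Python; the group count, harvest model and every parameter below are invented for illustration:

    import random

    GROUPS = 10        # number of local groups
    YEARS = 200
    HARSHNESS = 0.6    # probability a group's harvest fails in a given year
    SURPLUS = 3        # units gathered in a good year; 1 unit consumed per year

    def run(tolerant):
        stores = [2] * GROUPS          # each group starts with a small food store
        alive = [True] * GROUPS
        for _ in range(YEARS):
            for g in range(GROUPS):
                if not alive[g]:
                    continue
                harvest = 0 if random.random() < HARSHNESS else SURPLUS
                stores[g] += harvest - 1            # gather, then consume one unit
                if stores[g] < 0:
                    donors = [h for h in range(GROUPS) if alive[h] and stores[h] > 1]
                    if tolerant and donors:
                        stores[random.choice(donors)] -= 1   # share across borders
                        stores[g] = 0
                    else:
                        alive[g] = False                     # local extinction
        return sum(alive)

    print("surviving groups with sharing:   ", run(tolerant=True))
    print("surviving groups without sharing:", run(tolerant=False))

In repeated runs of this sketch, sharing keeps more groups alive at moderate harshness, while in very harsh settings donors rarely have surplus to give, mirroring the pattern the researchers describe.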

Penny Spikins, Professor in the Archaeology of Human Origins at the University of York, said: “That our study demonstrates the importance of tolerance to human success is perhaps surprising, especially when we often think of prehistory as a time of competition. However, we have seen that in situations where people with surplus share across borders with those in need, everyone benefits in the long term.”

Dr Jennifer C. French, lecturer in Palaeolithic Archaeology at the University of Liverpool added: “Our study’s findings also have important implications for wider debates about the increases in examples of innovation and greater rates of cultural evolution that occurred during this period.

“They help to explain previously enigmatic changes in the archaeological record between 300,000 and 30,000 years ago.”

###

The study is published in the Journal of Archaeological Method and Theory.

Are Humans Still Evolving? Scientists Weigh In (Science Alert)

sciencealert.com

Eva Hamrud, Metafact – 20 Sept. 2020


As a species, humans have populated almost every corner of the earth. We have developed technologies and cultures which shape the world we live in.

The idea of ‘natural selection’ or ‘survival of the fittest’ seems to make sense in Stone Age times when we were fighting over scraps of meat, but does it still apply now?

We asked 12 experts whether humans are still evolving. The expert consensus is a unanimous ‘yes’; however, scientists say we might have the wrong idea of what evolution actually is.

Evolution is not the same as natural selection

‘Evolution’ is often used interchangeably with the phrases ‘survival of the fittest’ or ‘natural selection’. Actually, these are not quite the same thing.

‘Evolution’ simply means the gradual change of a population over time.

‘Natural selection’ is a mechanism by which evolution can occur. Our Stone Age ancestors who were faster runners avoided being trampled by mammoths and were more likely to have children. That is ‘natural selection’.

Over time, the human population became faster at running. That’s evolution.

Evolution can happen without natural selection

That makes sense for Stone Age humans, but what about nowadays? We don’t need to outrun mammoths, we have medicines for when we’re sick and we can go to the shops to get food.

Natural selection needs a ‘selection pressure’ (e.g. dangerous trampling mammoths), so if we don’t have these anymore, does this mean we stop evolving?

Even with no selection pressures, experts say evolution still occurs by other mechanisms.

Professor Stanley Ambrose, an anthropologist from the University of Illinois, explains that “any change in the proportions of genes or gene variants over time is also considered evolution. The variants may be functionally equivalent, so evolution does not automatically equate with ‘improvement'”.

Whilst some genes can be affected by natural selection (e.g. genes that help us run faster), other changes in our DNA might have no obvious effect on us. ‘Neutral’ variations can also spread through a population by a different mechanism called ‘genetic drift’.

Genetic drift works by chance: some individuals might be unlucky and die for reasons which have nothing to do with their genes. Their unique gene variations will not be passed on to the next generation, and so the population will change.

Genetic drift doesn’t need any selection pressures, and it is still happening today.
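
Drift is easy to demonstrate with the textbook Wright–Fisher sampling scheme; the population size and starting frequency below are arbitrary choices for illustration:

    import random

    N = 100      # gene copies in a small population
    p = 0.5      # starting frequency of a selectively neutral variant
    generation = 0

    # Each generation, all N copies are redrawn at random from the previous
    # generation's frequency. There is no fitness difference anywhere, yet
    # the frequency wanders until the variant is lost (p = 0) or fixed (p = 1).
    while 0.0 < p < 1.0:
        copies = sum(1 for _ in range(N) if random.random() < p)
        p = copies / N
        generation += 1

    print("variant", "fixed" if p == 1.0 else "lost",
          "after", generation, "generations")

The smaller N is, the faster chance alone settles the variant’s fate, which is why drift matters most in small populations.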

Natural selection is still happening in humans

As much as we have made things easier for ourselves, there are still selection pressures around us, which mean that natural selection is still happening.

Like all mammals, humans lose the ability to digest milk when they stop breastfeeding. This is because we stop making an enzyme called lactase. In some countries, the population has acquired ‘lactase persistence’, meaning that people make lactase throughout their lives.

In European countries we can thank one specific gene variation for our lactase persistence, which is called ‘-13910*T’. By studying this specific gene variation in modern and ancient DNA samples, researchers suggest that it became common after humans started domesticating and milking animals.

This is an example of natural selection where we have actually made the selection pressure ourselves – we started drinking milk, so we evolved to digest it!
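
The arithmetic of such a sweep can be sketched with the standard one-locus selection recursion; the selection coefficient and starting frequency below are illustrative guesses, not published estimates for -13910*T:

    # Standard haploid selection recursion: p' = p(1 + s) / (1 + p*s),
    # where s is the relative reproductive advantage of carriers.
    s = 0.05    # assumed 5% advantage for adults who can digest milk
    p = 0.01    # assumed starting frequency of the variant
    gen = 0
    while p < 0.5:
        p = p * (1 + s) / (1 + p * s)
        gen += 1
    print("allele reaches 50% after", gen, "generations")
    # With these numbers, roughly 90+ generations, i.e. on the order of a
    # couple of thousand years at ~25 years per generation.

Even a modest advantage, compounded over generations, is enough to carry a rare variant to high frequency on archaeological timescales.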

Another example of humans undergoing natural selection to adapt to a lifestyle is the Bajau people, who traditionally live in houseboats in the waters of South East Asia and spend much of their lives diving to hunt fish or collect shellfish.

Ultrasound imaging has found that Bajau people have larger spleens than their neighbours – an adaptation which allows them to stay underwater for longer.

There are always selective pressures around us, even ones that we create ourselves.

As Dr Benjamin Hunt from the University of Birmingham puts it, “Our technological and cultural changes alter the strength and composition of the selection pressures within our environment, but selection pressures still exist.”

Evolution can’t be stopped

So, evolution can happen by different mechanisms like natural selection and genetic drift. As our environment is always changing, natural selection is always happening. And even if our environment was ‘just right’ for us, we would evolve anyway!

Dr Aylwyn Scally, an expert in evolution and genetics from the University of Cambridge, explains: “As long as human reproduction involves randomness and genetic mutation (and the laws of the Universe pretty much guarantee that this will always be the case at some level), there will continue to be differences from one generation to the next, meaning that the process of evolution can never be truly halted.”

Takeaway: Evolution means change in a population. That includes both easy-to-spot changes to adapt to an environment as well as more subtle, genetic changes.

Humans are still evolving, and that is unlikely to change in the future.

Article based on 12 expert answers to this question: Are humans still evolving?

This expert response was published in partnership with independent fact-checking platform Metafact.io.

Love the Fig (The New Yorker)

newyorker.com

Ben Crair, August 10, 2020

The produce section of the grocery store is a botanical disaster. Most people know that a tomato is technically a fruit, but so is an eggplant, a cucumber, and a spaghetti squash. A banana, which grows from a flower with a single ovary, is actually a berry, while a strawberry, which grows from a flower with several ovaries, isn’t a berry at all but an aggregate fruit. The most confusing classification, though, will start showing up on American shelves this month. Shoppers will find mission figs with the grapes, kiwis, and other fruit, but a clever botanist would sell them at the florist, with the fresh-cut roses. Although many people dismiss figs as a geriatric delicacy or the sticky stuff inside bad cookies, they are, in fact, something awesome: enclosed flowers that bloom modestly inward, unlike the flamboyant showoffs on other plants. Bite a fig in half and you’ll discover a core of tiny blossoms.

All kinds of critters, not only humans, frequent fig trees, but the plants owe their existence to what may be evolution’s most intimate partnership between two species. Because a fig is actually a ball of flowers, it requires pollination to reproduce, but, because the flowers are sealed, not just any bug can crawl inside. That task belongs to a minuscule insect known as the fig wasp, whose life cycle is intertwined with the fig’s. Mother wasps lay their eggs in an unripe fig. After their offspring hatch and mature, the males mate and then chew a tunnel to the surface, dying when their task is complete. The females follow and take flight, riding the winds until they smell another fig tree. (One species of wasp, in Africa, travels ten times farther than any other known pollinator.) When the insects discover the right specimen, they go inside and deposit the pollen from their birthplace. Then the females lay new eggs, and the cycle begins again. For the wasp mother, however, devotion to the fig plant soon turns tragic. A fig’s entranceway is booby-trapped to destroy her wings, so that she can never visit another plant. When you eat a dried fig, you’re probably chewing fig-wasp mummies, too.

The fig and the fig wasp are a superlative example of what biologists call codependent evolution. The plants and insects have been growing old together for more than sixty million years. Almost every species of fig plant—more than seven hundred and fifty in total—has its own species of wasp, although some commercial fig production favors varieties that do not require pollination. (They are grown from cuttings and produce fruit without any seeds.) But codependence hasn’t made the fig and the fig wasp weak, as it can with humans. The figs’ and fig wasps’ pollination system is extremely efficient compared with that of other plants, some of which just trust the wind to blow their pollen where it needs to go. And the figs’ specialized flowers, far from isolating them in an evolutionary niche, have allowed them to radiate throughout the natural world. Fig plants can be shrubs, vines, or trees. Strangler figs sprout in the branches of another tree, drop their roots to the forest floor, and slowly envelop their host. The branches of a large strangler fig can stretch over acres and produce a million figs in one flowering. Figs themselves can be brown, red, white, orange, yellow, or green. (Wild figs are not as sweet as the plump and purple mission figs you buy at the farmers’ market.) And their seeds sprout where other plants’ would flounder: rooftops, cliff sides, volcanic islands. The fig genus, Ficus, is the most varied one in the tropics. It also routinely shows up in the greenhouse and the garden.

The variety and adaptability of fig plants make them a favorite foodstuff among animals. In 2001, a team of researchers published a review of the scientific literature and found records of fig consumption for nearly thirteen hundred bird and mammal species. One of the researchers, Mike Shanahan—a rain-forest ecologist and the author of a forthcoming book about figs, “Gods, Wasps, and Stranglers”—had spent time studying Malaysian fig trees as a Ph.D. candidate, in 1997. He would sometimes lie beneath a huge strangler fig and record its visitors, returning repeatedly for several days. “I would typically see twenty-five to thirty different species,” Shanahan told me. “The animals would include lots of different squirrel species and some curious creatures called tree shrews. There would be some monkeys and a whole range of different bird species, from tiny little flowerpeckers up to the hornbills, which are the biggest fruit-eating birds in Asia.” There were also pigeons, fruit doves, fairy bluebirds, barbets, and parrots. As the biologist Daniel Janzen put it in “How to Be a Fig,” an article from 1979, “Who eats figs? Everybody.”

With good reason, too. Figs are high in calcium, easy to chew and digest, and, unlike plants that fruit seasonally, can be found year-round. This is the fig plant’s accommodation of the fig wasp. A fig wasp departs a ripe fig to find an unripe fig, which means that there must always be figs at different stages. As a result, an animal can usually fall back on a fig when a mango or a lychee is not in season. Sometimes figs are the only things between an animal and starvation. According to a 2003 study of Uganda’s Budongo Forest, for instance, figs are the sole source of fruit for chimpanzees at certain times of year. Our pre-human ancestors probably filled up on figs, too. The plants are what is known as a keystone species: yank them from the jungle and the whole ecosystem would collapse.

Figs’ popularity means they can play a central role in bringing deforested land back to life. The plants grow quickly in inhospitable places and, thanks to the endurance of the fig wasps, can survive at low densities. And the animals they attract will, to put it politely, deposit nearby the seeds of other fruits they’ve eaten, thereby introducing a healthy variety of new plants. Nigel Tucker, a restoration ecologist in Australia, has recommended that ten per cent of new plants in tropical-reforestation projects be fig seedlings. Rhett Harrison, a former fig biologist, told me that the ratio could be even higher. “My inclination is that we should be going to some of these places and just planting a few figs,” he said.

Fig trees are also sometimes the only trees left standing from former forests. In parts of India, for instance, they are considered holy, and farmers are reluctant to chop them down. “Diverse cultures developed taboos against felling fig trees,” Shanahan told me. “They said they were homes to gods and spirits, and made them places of prayer and symbols of their society.” You can’t really taste the fig’s spiritual aura in a Fig Newton, but it shines in the mythologies of world religions. Buddha found enlightenment under a fig tree, and the Egyptian pharaohs built wooden sarcophagi from Ficus sycomorus. An apple tree might have cost Adam and Eve their innocence, but a fig tree, whose leaves they used to cover their nudity, gave them back some dignity. If only they had preferred figs in the first place, we might all still live in Eden.

*This article has been revised to clarify the fact that not all fig plants require pollination to produce edible fruit.

Viruses have big impacts on ecology and evolution as well as human health (The Economist)

economist.com

Aug 20th 2020


I
The outsiders inside

HUMANS ARE lucky to live a hundred years. Oak trees may live a thousand; mayflies, in their adult form, a single day. But they are all alive in the same way. They are made up of cells which embody flows of energy and stores of information. Their metabolisms make use of that energy, be it from sunlight or food, to build new molecules and break down old ones, using mechanisms described in the genes they inherited and may, or may not, pass on.

It is this endlessly repeated, never quite perfect reproduction which explains why oak trees, humans, and every other plant, fungus or single-celled organism you have ever seen or felt the presence of are all alive in the same way. It is the most fundamental of all family resemblances. Go far enough up any creature’s family tree and you will find an ancestor that sits in your family tree, too. Travel further and you will find what scientists call the last universal common ancestor, LUCA. It was not the first living thing. But it was the one which set the template for the life that exists today.

And then there are viruses. In viruses the link between metabolism and genes that binds together all life to which you are related, from bacteria to blue whales, is broken. Viral genes have no cells, no bodies, no metabolism of their own. The tiny particles, “virions”, in which those genes come packaged—the dot-studded disks of coronaviruses, the sinister, sinuous windings of Ebola, the bacteriophages with their science-fiction landing-legs that prey on microbes—are entirely inanimate. An individual animal, or plant, embodies and maintains the restless metabolism that made it. A virion is just an arrangement of matter.

The virus is not the virion. The virus is a process, not a thing. It is truly alive only in the cells of others, a virtual organism running on borrowed hardware to produce more copies of its genome. Some bide their time, letting the cell they share the life of live on. Others immediately set about producing enough virions to split their hosts from stem to stern.

The virus has no plan or desire. The simplest purposes of the simplest life—to maintain the difference between what is inside the cell and what is outside, to move towards one chemical or away from another—are entirely beyond it. It copies itself in whatever way it does simply because it has copied itself that way before, in other cells, in other hosts.

That is why, asked whether viruses are alive, Eckard Wimmer, a chemist and biologist who works at the State University of New York, Stony Brook, offers a yes-and-no. Viruses, he says, “alternate between nonliving and living phases”. He should know. In 2002 he became the first person in the world to take an array of nonliving chemicals and build a virion from scratch—a virion which was then able to get itself reproduced by infecting cells.

The fact that viruses have only a tenuous claim to being alive, though, hardly reduces their impact on things which are indubitably so. No other biological entities are as ubiquitous, and few as consequential. The number of copies of their genes to be found on Earth is beyond astronomical. There are hundreds of billions of stars in the Milky Way galaxy and a couple of trillion galaxies in the observable universe. The virions in the surface waters of any smallish sea handily outnumber all the stars in all the skies that science could ever speak of.

Back on Earth, viruses kill more living things than any other type of predator. They shape the balance of species in ecosystems ranging from those of the open ocean to that of the human bowel. They spur evolution, driving natural selection and allowing the swapping of genes.

They may have been responsible for some of the most important events in the history of life, from the appearance of complex multicellular organisms to the emergence of DNA as a preferred genetic material. The legacy they have left in the human genome helps produce placentas and may shape the development of the brain. For scientists seeking to understand life’s origin, they offer a route into the past separate from the one mapped by humans, oak trees and their kin. For scientists wanting to reprogram cells and mend metabolisms they offer inspiration—and powerful tools.

II
A lifestyle for genes

THE IDEA of a last universal common ancestor provides a plausible and helpful, if incomplete, answer to where humans, oak trees and their ilk come from. There is no such answer for viruses. Being a virus is not something which provides you with a place in a vast, coherent family tree. It is more like a lifestyle—a way of being which different genes have discovered independently at different times. Some viral lineages seem to have begun quite recently. Others have roots that comfortably predate LUCA itself.

Disparate origins are matched by disparate architectures for information storage and retrieval. In eukaryotes—creatures, like humans, mushrooms and kelp, with complex cells—as in their simpler relatives, the bacteria and archaea, the genes that describe proteins are written in double-stranded DNA. When a particular protein is to be made, the DNA sequence of the relevant gene acts as a template for the creation of a complementary molecule made from another nucleic acid, RNA. This messenger RNA (mRNA) is what the cellular machinery tasked with translating genetic information into proteins uses in order to do so.

Because they, too, need to have proteins made to their specifications, viruses also need to produce mRNAs. But they are not restricted to using double-stranded DNA as a template. Viruses store their genes in a number of different ways, all of which require a different mechanism to produce mRNAs. In the early 1970s David Baltimore, one of the great figures of molecular biology, used these different approaches to divide the realm of viruses into seven separate classes.

In four of these seven classes the viruses store their genes not in DNA but in RNA. Those of Baltimore group three use double strands of RNA. In Baltimore groups four and five the RNA is single-stranded; in group four the genome can be used directly as an mRNA; in group five it is the template from which mRNA must be made. In group six—the retroviruses, which include HIV—the viral RNA is copied into DNA, which then provides a template for mRNAs.
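
The classification is easy to hold in the head as a lookup table. Below is a minimal Python sketch of the seven groups and the route each takes to mRNA; the example viruses are standard textbook assignments rather than a list drawn from this essay, and the function name is ours:

    # Baltimore classification: each group is defined by its genome type
    # and by the route it must take to produce mRNA.
    BALTIMORE_GROUPS = {
        1: ("dsDNA", "transcribed directly into mRNA", "herpesviruses"),
        2: ("ssDNA", "made double-stranded, then transcribed", "parvoviruses"),
        3: ("dsRNA", "copied into mRNA by a viral RdRp", "rotaviruses"),
        4: ("(+)ssRNA", "genome itself can serve as mRNA", "SARS-CoV-2, hepatitis C"),
        5: ("(-)ssRNA", "genome is the template an RdRp copies into mRNA", "Ebola, influenza"),
        6: ("ssRNA-RT", "reverse-transcribed into DNA, then transcribed", "HIV"),
        7: ("dsDNA-RT", "transcribed to RNA, with reverse transcription during replication", "hepatitis B"),
    }

    def mrna_route(group: int) -> str:
        genome, route, examples = BALTIMORE_GROUPS[group]
        return f"Group {group} ({genome}; e.g. {examples}): {route}"

    for g in (3, 4, 5, 6):  # the groups discussed in the text
        print(mrna_route(g))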

Because uninfected cells only ever make RNA on the basis of a DNA template, RNA-based viruses need distinctive molecular mechanisms those cells lack. Those mechanisms provide medicine with targets for antiviral attacks. Many drugs against HIV take aim at the system that makes DNA copies of RNA templates. Remdesivir (Veklury), a drug which stymies the mechanism that the simpler RNA viruses use to recreate their RNA genomes, was originally developed to treat hepatitis C (group four) and subsequently tried against the Ebola virus (group five). It is now being used against SARS-CoV-2 (group four), the covid-19 virus.

Studies of the gene for that RNA-copying mechanism, RdRp, reveal just how confusing virus genealogy can be. Some viruses in groups three, four and five seem, on the basis of their RdRp-gene sequence, more closely related to members of one of the other groups than they are to all the other members of their own group. This may mean that quite closely related viruses can differ in the way they store their genomes; it may mean that the viruses concerned have swapped their RdRp genes. When two viruses infect the same cell at the same time such swaps are more or less compulsory. They are, among other things, one of the mechanisms by which viruses native to one species become able to infect another.

How do genes take on the viral lifestyle in the first place? There are two plausible mechanisms. Previously free-living creatures could give up metabolising and become parasitic, using other creatures’ cells as their reproductive stage. Alternatively, genes that were allowed a certain amount of independence within one creature could have evolved the means to get into other creatures.

Living creatures contain various apparently independent bits of nucleic acid with an interest in reproducing themselves. The smallest, found exclusively in plants, are tiny rings of RNA called viroids, just a few hundred genetic letters long. Viroids replicate by hijacking a host enzyme that normally makes mRNAs. Once attached to a viroid ring, the enzyme whizzes round and round it, unable to stop, turning out a new copy of the viroid with each lap.

Viroids describe no proteins and do no good. Plasmids—somewhat larger loops of nucleic acid found in bacteria—do contain genes, and the proteins they describe can be useful to their hosts. Plasmids are sometimes, therefore, regarded as detached parts of a bacterium’s genome. But that detachment provides a degree of autonomy. Plasmids can migrate between bacterial cells, not always of the same species. When they do so they can take genetic traits such as antibiotic resistance from their old host to their new one.

Recently, some plasmids have been implicated in what looks like a progression to true virus-hood. A genetic analysis by Mart Krupovic of the Pasteur Institute suggests that the Circular Rep-Encoding Single-Stranded DNA (CRESS-DNA) viruses, which infect bacteria, evolved from plasmids. He thinks that a DNA copy of the genes that another virus uses to create its virions, copied into a plasmid by chance, provided it with a way out of the cell. The analysis strongly suggests that CRESS-DNA viruses, previously seen as a pretty closely related group, have arisen from plasmids this way on three different occasions.

Such jailbreaks have probably been going on since very early on in the history of life. As soon as they began to metabolise, the first proto-organisms would have constituted a niche in which other parasitic creatures could have lived. And biology abhors a vacuum. No niche goes unfilled if it is fillable.

It is widely believed that much of the evolutionary period between the origin of life and the advent of LUCA was spent in an “RNA world”—one in which that versatile substance both stored information, as DNA now does, and catalysed chemical reactions, as proteins now do. Set alongside the fact that some viruses use RNA as a storage medium today, this strongly suggests that the first to adopt the viral lifestyle did so too. Patrick Forterre, an evolutionary biologist at the Pasteur Institute with a particular interest in viruses (and the man who first popularised the term LUCA) thinks that the “RNA world” was not just rife with viruses. He also thinks they may have brought about its end.

The difference between DNA and RNA is not large: just a small change to one of the “letters” used to store genetic information and a minor modification to the backbone to which these letters are stuck. And DNA is a more stable molecule in which to store lots of information. But that is in part because DNA is inert. An RNA-world organism which rewrote its genes into DNA would cripple its metabolism, because to do so would be to lose the catalytic properties its RNA provided.

An RNA-world virus, having no metabolism of its own to undermine, would have had no such constraints if shifting to DNA offered an advantage. Dr Forterre suggests that this advantage may have lain in DNA’s imperviousness to attack. Host organisms today have all sorts of mechanisms for cutting up viral nucleic acids they don’t like the look of—mechanisms which biotechnologists have been borrowing since the 1970s, most recently in the form of tools based on a bacterial defence called CRISPR. There is no reason to imagine that the RNA-world predecessors of today’s cells did not have similar shears at their disposal. And a virus that made the leap to DNA would have been impervious to their blades.

Genes and the mechanisms they describe pass between viruses and hosts, as between viruses and viruses, all the time. Once some viruses had evolved ways of writing and copying DNA, their hosts would have been able to purloin them in order to make back-up copies of their RNA molecules. And so what began as a way of protecting viral genomes would have become the way life stores all its genes—except for those of some recalcitrant, contrary viruses.

III
The scythes of the seas

IT IS A general principle in biology that, although in terms of individual numbers herbivores outnumber carnivores, in terms of the number of species carnivores outnumber herbivores. Viruses, however, outnumber everything else in every way possible.

This makes sense. Though viruses can induce host behaviours that help them spread—such as coughing—an inert virion boasts no behaviour of its own that helps it stalk its prey. It infects only that which it comes into contact with. This is a clear invitation to flood the zone. In 1999 Roger Hendrix, a virologist, suggested that a good rule of thumb might be ten virions for every living individual creature (the overwhelming majority of which are single-celled bacteria and archaea). Estimates of the number of such creatures on the planet come out in the region of 10²⁹-10³⁰. If the whole Earth were broken up into pebbles, and each of those pebbles smashed into tens of thousands of specks of grit, you would still have fewer pieces of grit than the world has virions. Measurements, as opposed to estimates, produce numbers almost as arresting. A litre of seawater may contain more than 100bn virions; a kilogram of dried soil perhaps a trillion.
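
The pebbles-and-grit image can be checked with quick arithmetic. In the sketch below, the cell estimate and the rule of thumb come from the text and the Earth's volume is a standard figure; the one-cubic-centimetre pebble and the ten-thousand-specks figure are our own reading of "pebbles" and "tens of thousands":

    # Back-of-envelope check of the pebbles-and-grit comparison.
    EARTH_VOLUME_M3 = 1.08e21   # standard figure for the Earth's volume
    PEBBLE_VOLUME_M3 = 1e-6     # assumption: a 1 cm^3 pebble
    GRIT_PER_PEBBLE = 1e4       # "tens of thousands of specks"

    specks = EARTH_VOLUME_M3 / PEBBLE_VOLUME_M3 * GRIT_PER_PEBBLE

    cells = 1e30                # upper end of the 10^29-10^30 estimate
    virions = 10 * cells        # Hendrix's ten-virions-per-creature rule

    print(f"specks of grit: ~{specks:.0e}")   # ~1e31
    print(f"virions:        ~{virions:.0e}")  # ~1e31, the same ballpark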

Metagenomics, a part of biology that looks at all the nucleic acid in a given sample to get a sense of the range of life forms within it, reveals that these tiny throngs are highly diverse. A metagenomic analysis of two surveys of ocean life, the Tara Oceans and Malaspina missions, by Ahmed Zayed of Ohio State University, found evidence of 200,000 different species of virus. These diverse species play an enormous role in the ecology of the oceans.

On land, most of the photosynthesis which provides the biomass and energy needed for life takes place in plants. In the oceans, it is overwhelmingly the business of various sorts of bacteria and algae collectively known as phytoplankton. These creatures reproduce at a terrific rate, and viruses kill them at a terrific rate, too. According to work by Curtis Suttle of the University of British Columbia, bacterial phytoplankton typically last less than a week before being killed by viruses.

This increases the overall productivity of the oceans by helping bacteria recycle organic matter (it is easier for one cell to use the contents of another if a virus helpfully sets them free). It also goes some way towards explaining what the great mid-20th-century ecologist G. Evelyn Hutchinson called “the paradox of the plankton”. Given the limited nature of the resources that single-celled plankton need, you would expect a few species particularly well adapted to their use to dominate the ecosystem. Instead, the plankton display great variety. This may well be because whenever a particular form of plankton becomes dominant, its viruses expand with it, gnawing away at its comparative success.

It is also possible that this endless dance of death between viruses and microbes sets the stage for one of evolution’s great leaps forward. Many forms of single-celled plankton have molecular mechanisms that allow them to kill themselves. They are presumably used when one cell’s sacrifice allows its sister cells—which are genetically identical—to survive. One circumstance in which such sacrifice seems to make sense is when a cell is attacked by a virus. If the infected cell can kill itself quickly (a process called apoptosis) it can limit the number of virions the virus is able to make. This lessens the chances that other related cells nearby will die. Some bacteria have been shown to use this strategy; many other microbes are suspected of it.

There is another situation where self-sacrifice is becoming conduct for a cell: when it is part of a multicellular organism. As such organisms grow, cells that were once useful to them become redundant; they have to be got rid of. Eugene Koonin of America’s National Institutes of Health and his colleagues have explored the idea that virus-thwarting self-sacrifice and complexity-permitting self-sacrifice may be related, with the latter descended from the former. Dr Koonin’s model also suggests that the closer the cells are clustered together, the more likely this act of self-sacrifice is to have beneficial consequences.

For such profound propinquity, move from the free-flowing oceans to the more structured world of soil, where potential self-sacrificers can nestle next to each other. Its structure makes soil harder to sift for genes than water is. But last year Mary Firestone of the University of California, Berkeley, and her colleagues used metagenomics to count 3,884 new viral species in a patch of Californian grassland. That is undoubtedly an underestimate of the total diversity; their technique could see only viruses with RNA genomes, thus missing, among other things, most bacteriophages.

Metagenomics can also be applied to biological samples, such as bat guano, in which it picks up viruses from both the bats and their food. But for the most part the finding of animal viruses requires more specific sampling. Over the course of the 2010s PREDICT, an American-government project aimed at finding animal viruses, gathered over 160,000 animal and human tissue samples from 35 countries and discovered 949 novel viruses.

The people who put together PREDICT now have grander plans. They want a Global Virome Project to track down all the viruses native to the world’s 7,400 species of mammals and waterfowl—the reservoirs most likely to harbour viruses capable of making the leap into human beings. In accordance with the more-predator-species-than-prey rule they expect such an effort would find about 1.5m viruses, of which around 700,000 might be able to infect humans. A planning meeting in 2018 suggested that such an undertaking might take ten years and cost $4bn. It looked like a lot of money then. Today those arguing for a system that can provide advance warning of the next pandemic make it sound pretty cheap.

IV
Leaving their mark

THE TOLL which viruses have exacted throughout history suggests that they have left their mark on the human genome: things that kill people off in large numbers are powerful agents of natural selection. In 2016 David Enard, then at Stanford University and now at the University of Arizona, made a stab at showing just how much of the genome had been thus affected.

He and his colleagues started by identifying almost 10,000 proteins that seemed to be produced in all the mammals that had had their genomes sequenced up to that point. They then made a painstaking search of the scientific literature looking for proteins that had been shown to interact with viruses in some way or other. About 1,300 of the 10,000 turned up. About one in five of these proteins was connected to the immune system, and thus could be seen as having a professional interest in viral interaction. The others appeared to be proteins which the virus made use of in its attack on the host. The two cell-surface proteins that SARS-CoV-2 uses to make contact with its target cells and inveigle its way into them would fit into this category.

The researchers then compared the human versions of the genes for their 10,000 proteins with those in other mammals, and applied a statistical technique that distinguishes changes that have no real impact from the sort of changes which natural selection finds helpful and thus tries to keep. Genes for virus-associated proteins turned out to be evolutionary hotspots: 30% of all the adaptive change was seen in the genes for the 13% of the proteins which interacted with viruses. As quickly as viruses learn to recognise and subvert such proteins, hosts must learn to modify them.
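
Those two percentages imply a concentration of adaptive change that is easy to make explicit. A quick sketch, using only the figures quoted above:

    # Enrichment of adaptive change in virus-interacting proteins,
    # restating the proportions given in the text.
    share_of_proteins = 1300 / 10000   # ~13% of proteins interact with viruses
    share_of_adaptation = 0.30         # ...yet they host 30% of adaptive change

    enrichment = share_of_adaptation / share_of_proteins
    print(f"adaptive change is ~{enrichment:.1f}x concentrated "
          "in virus-interacting proteins")   # ~2.3x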

A couple of years later, working with Dmitri Petrov at Stanford, Dr Enard showed that modern humans have borrowed some of these evolutionary responses to viruses from their nearest relatives. Around 2-3% of the DNA in an average European genome has Neanderthal origins, a result of interbreeding 50,000 to 30,000 years ago. For these genes to have persisted they must be doing something useful—otherwise natural selection would have removed them. Dr Enard and Dr Petrov found that a disproportionate number described virus-interacting proteins; of the bequests humans received from their now vanished relatives, ways to stay ahead of viruses seem to have been among the most important.

Viruses do not just shape the human genome through natural selection, though. They also insert themselves into it. At least a twelfth of the DNA in the human genome is derived from viruses; by some measures the total could be as high as a quarter.

Retroviruses like HIV are called retro because they do things backwards. Where cellular organisms make their RNA from DNA templates, retroviruses do the reverse, making DNA copies of their RNA genomes. The host cell obligingly makes these copies into double-stranded DNA which can be stitched into its own genome. If this happens in a cell destined to give rise to eggs or sperm, the viral genes are passed from parent to offspring, and on down the generations. Such integrated viral sequences, known as endogenous retroviruses (ERVs), account for 8% of the human genome.

This is another example of the way the same viral trick can be discovered a number of times. Many bacteriophages are also able to stitch copies of their genome into their host’s DNA, staying dormant, or “temperate”, for generations. If the cell is doing well and reproducing regularly, this quiescence is a good way for the viral genes to make more copies of themselves. When a virus senses that its easy ride may be coming to an end, though—for example, if the cell it is in shows signs of stress—it will abandon ship. What was latent becomes “lytic” as the viral genes produce a sufficient number of virions to tear the host apart.

Though some of their genes are associated with cancers, in humans ERVs do not burst back into action in later generations. Instead they have proved useful resources of genetic novelty. In the most celebrated example, at least ten different mammalian lineages make use of a retroviral gene for one of their most distinctively mammalian activities: building a placenta.

The placenta is a unique organ because it requires cells from the mother and the fetus to work together in order to pass oxygen and sustenance in one direction and carbon dioxide and waste in the other. One way this intimacy is achieved safely is through the creation of a tissue in which the membranes between cells are broken down to form a continuous sheet of cellular material.

The protein that allows new cells to merge themselves with this layer, syncytin-1, was originally used by retroviruses to join the external membranes of their virions to the external membranes of cells, thus gaining entry for the viral proteins and nucleic acids. Not only have different sorts of mammals co-opted this membrane-merging trick—other creatures have made use of it, too. The mabuya, a long-tailed skink which, unusually for a lizard, nurtures its young within its body, employs a retroviral syncytin protein to produce a mammalian-looking placenta. The most recent shared ancestor of mabuyas and mammals died out 80m years before the first dinosaur saw the light of day, but both have found the same way to make use of the viral gene.

You put your line-1 in, you take your line-1 out

This is not the only way that animals make use of their ERVs. Evidence has begun to accumulate that genetic sequences derived from ERVs are quite frequently used to regulate the activity of genes of more conventional origin. In particular, RNA molecules transcribed from an ERV called HERV-K play a crucial role in providing the stem cells found in embryos with their “pluripotency”—the ability to create specialised daughter cells of various different types. Unfortunately, when expressed in adults HERV-K can also be responsible for cancers of the testes.

As well as containing lots of semi-decrepit retroviruses that can be stripped for parts, the human genome also holds a great many copies of a “retrotransposon” called LINE-1. This is a piece of DNA with a surprisingly virus-like way of life; it is thought by some biologists to have, like ERVs, a viral origin. In its full form, LINE-1 is a 6,000-letter sequence of DNA which describes a “reverse transcriptase” of the sort that retroviruses use to make DNA from their RNA genomes. When LINE-1 is transcribed into an mRNA and that mRNA subsequently translated to make proteins, the reverse transcriptase thus created immediately sets to work on the mRNA used to create it, using it as the template for a new piece of DNA which is then inserted back into the genome. That new piece of DNA is in principle identical to the piece that acted as the mRNA’s original template. The LINE-1 element has made a copy of itself.

In the 100m years or so that this has been going on in humans and the species from which they are descended the LINE-1 element has managed to pepper the genome with a staggering 500,000 copies of itself. All told, 17% of the human genome is taken up by these copies—twice as much as by the ERVs.
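
A little arithmetic on those figures shows both the pace of the process and why most copies must be incomplete. The only outside assumption below is the roughly 3.2bn-letter size of the human genome:

    # Implications of the LINE-1 numbers quoted above.
    years, copies = 100e6, 500_000
    print(f"~one new fixed copy every {years / copies:.0f} years")  # ~200

    genome_letters = 3.2e9                  # assumed human genome size
    line1_letters = 0.17 * genome_letters   # 17% of the genome is LINE-1
    print(f"average copy length: ~{line1_letters / copies:.0f} letters")
    # ~1,100 letters per copy, against 6,000 for a full-length element,
    # consistent with most copies being severely truncated (see below).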

Most of the copies are severely truncated and incapable of copying themselves further. But some still have the knack, and this capability may be being put to good use. Fred Gage and his colleagues at the Salk Institute for Biological Studies, in San Diego, argue that LINE-1 elements have an important role in the development of the brain. In 2005 Dr Gage discovered that in mouse embryos—specifically, in the brains of those embryos—about 3,000 LINE-1 elements are still able to operate as retrotransposons, putting new copies of themselves into the genome of a cell and thus of all its descendants.

Brains develop through proliferation followed by pruning. First, nerve cells multiply pell-mell; then the cell-suicide process that makes complex life possible prunes them back in a way that looks a lot like natural selection. Dr Gage suspects that the movement of LINE-1 transposons provides the variety in the cell population needed for this selection process. Choosing between cells with LINE-1 in different places, he thinks, could be a key part of the process from which the eventual neural architecture emerges. What is true in mice is, as he showed in 2009, true in humans, too. He is currently developing a technique for looking at the process in detail by comparing, post mortem, the genomes of different brain cells from single individuals to see if their LINE-1 patterns vary in the ways that his theory would predict.

V
Promised lands

HUMAN EVOLUTION may have used viral genes to make big-brained live-born life possible; but viral evolution has used them to kill off those big brains on a scale that is easily forgotten. Compare the toll to that of war. In the 20th century, the bloodiest in human history, somewhere between 100m and 200m people died as a result of warfare. The number killed by measles was somewhere in the same range; the number who died of influenza probably towards the top of it; and the number killed by smallpox—300m-500m—well beyond it. That is why the eradication of smallpox from the wild, achieved in 1979 by a globally co-ordinated set of vaccination campaigns, stands as one of the all-time-great humanitarian triumphs.

Other eradications should eventually follow. Even in their absence, vaccination has led to a steep decline in viral deaths. But viruses against which there is no vaccine, either because they are very new, like SARS-CoV-2, or peculiarly sneaky, like HIV, can still kill millions.

Reducing those tolls is a vital aim both for research and for public-health policy. Understandably, a far lower priority is put on the benefits that viruses can bring. This is mostly because they are as yet much less dramatic. They are also much less well understood.

The viruses most prevalent in the human body are not those which infect human cells. They are those which infect the bacteria that live on the body’s surfaces, internal and external. The average human “microbiome” harbours perhaps 100trn of these bacteria. And where there are bacteria, there are bacteriophages shaping their population.

The microbiome is vital for good health; when it goes wrong it can mess up a lot else. Gut bacteria seem to have a role in maintaining, and possibly also causing, obesity in the well-fed and, conversely, in tipping the poorly fed into a form of malnutrition called kwashiorkor. Ill-regulated gut bacteria have also been linked, if not always conclusively, with diabetes, heart disease, cancers, depression and autism. In light of all this, the question “who guards the bacterial guardians?” is starting to be asked.

The viruses that prey on the bacteria are an obvious answer. Because the health of their host’s host—the possessor of the gut they find themselves in—matters to these phages, they have an interest in keeping the microbiome balanced. Unbalanced microbiomes allow pathogens to get a foothold. This may explain a curious detail of a therapy now being used as a treatment of last resort against Clostridium difficile, a bacterium that causes life-threatening dysentery. The therapy in question uses a transfusion of faecal matter, with its attendant microbes, from a healthy individual to reboot the patient’s microbiome. Such transplants, it appears, are more likely to succeed if their phage population is particularly diverse.

Medicine is a very long way from being able to use phages to fine-tune the microbiome. But if a way of doing so is found, it will not in itself be a revolution. Attempts to use phages to promote human health go back to their discovery in 1917, by Félix d’Hérelle, a French microbiologist, though those early attempts at therapy were not looking to restore balance and harmony. On the basis that the enemy of my enemy is my friend, doctors simply treated bacterial infections with phages thought likely to kill the bacteria.

The arrival of antibiotics saw phage therapy abandoned in most places, though it persisted in the Soviet Union and its satellites. Various biotechnology companies think they may now be able to revive the tradition—and make it more effective. One option is to remove the bits of the viral genome that let phages settle down to a temperate life in a bacterial genome, leaving them no option but to keep on killing. Another is to write their genes in ways that avoid the defences with which bacteria slice up foreign DNA.

The hope is that phage therapy will become a backup in difficult cases, such as infection with antibiotic-resistant bugs. There have been a couple of well-publicised one-off successes outside phage therapy’s post-Soviet homelands. In 2016 Tom Patterson, a researcher at the University of California, San Diego, was successfully treated for an antibiotic-resistant bacterial infection with specially selected (but un-engineered) phages. In 2018 Graham Hatfull of the University of Pittsburgh used a mixture of phages, some engineered so as to be incapable of temperance, to treat a 16-year-old British girl who had a bad bacterial infection after a lung transplant. Clinical trials are now getting under way for phage treatments aimed at urinary-tract infections caused by Escherichia coli, Staphylococcus aureus infections that can lead to sepsis and Pseudomonas aeruginosa infections that cause complications in people who have cystic fibrosis.

Viruses which attack bacteria are not the only ones genetic engineers have their eyes on. Engineered viruses are of increasing interest to vaccine-makers, to cancer researchers and to those who want to treat diseases by either adding new genes to the genome or disabling faulty ones. If you want to get a gene into a specific type of cell, a virion that recognises something about such cells may often prove a good tool.

The vaccine used to contain the Ebola outbreak in the Democratic Republic of Congo over the past two years was made by engineering Indiana vesiculovirus, which infects humans but cannot reproduce in them, so that it expresses a protein found on the surface of the Ebola virus; thus primed, the immune system responds to Ebola much more effectively. The World Health Organisation’s current list of 29 covid-19 vaccines in clinical trials features six versions of other viruses engineered to look a bit like SARS-CoV-2. One is based on a strain of measles that has long been used as a vaccine against that disease.

Viruses engineered to engender immunity against pathogens, to kill cancer cells or to encourage the immune system to attack them, or to deliver needed genes to faulty cells all seem likely to find their way into health care. Other engineered viruses are more worrying. One way to understand how viruses spread and kill is to try and make particularly virulent ones. In 2005, for example, Terrence Tumpey of America’s Centres for Disease Control and Prevention and his colleagues tried to understand the deadliness of the influenza virus responsible for the pandemic of 1918-20 by taking a more benign strain, adding what seemed to be distinctive about the deadlier one and trying out the result on mice. It was every bit as deadly as the original, wholly natural version had been.

Because such “gain of function” research could, if ill-conceived or poorly implemented, do terrible damage, it requires careful monitoring. And although the use of engineered pathogens as weapons of war is of dubious utility—such weapons are hard to aim and hard to stand down, and it is not easy to know how much damage they have done—as well as being completely illegal and repugnant to almost all, such possibilities will and should remain a matter of global concern.

Information which, for billions of years, has only ever come into its own within infected cells can now be inspected on computer screens and rewritten at will. The power that brings is sobering. It marks a change in the history of both viruses and people—a change which is perhaps as important as any of those made by modern biology. It is constraining a small part of the viral world in a way which, so far, has been to people’s benefit. It is revealing that world’s further reaches in a way which cannot but engender awe. ■

This article appeared in the Essay section of the print edition under the headline “The outsiders inside”

Did Human Evolution Include a Semi-Aquatic Phase? (The Scientist)

A recent book outlines fossil evidence supporting the controversial hypothesis.

Peter Rhys-Evans
Apr 1, 2020

For the past 150 years, scientists and laypeople alike have accepted a “savanna” scenario of human evolution. The theory, primarily based on fossil evidence, suggests that because our ancestral ape family members were living in the trees of East African forests, and because we humans live on terra firma, our primate ancestors simply came down from the trees onto the grasslands and stood upright to see farther over the vegetation, increasing their efficiency as hunter-gatherers. In the late 19th century, anthropologists only had a few Neanderthal fossils to study, and science had very little knowledge of genetics and evolutionary changes. So this savanna theory of human evolution became ingrained in anthropological dogma and has remained the established explanation of early hominin evolution following the genetic split from our primate cousins 6 million to 7 million years ago.

But in 1960, a different twist on human evolution emerged. That year, marine biologist Sir Alister Hardy wrote an article in New Scientist suggesting a possible aquatic phase in our evolution, noting Homo sapiens’s differences from other primates and similarities to other aquatic and semi-aquatic mammals. In 1967, zoologist Desmond Morris published The Naked Ape, which explored different theories about why modern humans lost their fur. Morris mentioned Hardy’s “aquatic ape” hypothesis as an “ingenious” theory that sufficiently explained “why we are so nimble in the water today and why our closest living relatives, the chimpanzees, are so helpless and quickly drown.”

CRC Press, July 2019

Morris concluded, however, that “despite its most appealing indirect evidence, the aquatic theory lacks solid support.” Even if eventually the aquatic ape hypothesis turns out to be true, he continued, it need not completely rewrite the story of human evolution, but rather add to our species’ evolutionary arc a “salutary christening ceremony.”

In 1992, I published a paper describing a curious ear condition colloquially known as “surfer’s ear,” which I and other ear, nose, and throat doctors frequently see in clinics. Exostoses are small bony growths that form in the outer ear canal, but only in humans who swim and dive on a regular, almost daily basis. In modern humans, there is undisputed evidence of aural exostoses in people who swim and dive, with the size and extent being directly dependent on the frequency and length of exposure to water, as well as its temperature.

I predicted that if these exostoses were found in early hominin skulls, it would provide vital fossil evidence for frequent swimming and diving by our ancestors. Researchers have now found these features in 1 million– to 2 million–year-old hominin skulls. In a recent study on nearly two dozen Neanderthal skulls, about 47 percent had exostoses. There are many other references to contemporary, historical, and archaeological coastal and river communities with a significantly increased incidence of aural exostoses. In my latest book, The Waterside Ape, I propose that the presence of exostoses in the skulls of ancient human ancestors is a prime support for an aquatic phase of our evolution, which may explain our unique human phenotype.

Other Homo sapiens–specific features that may be tied to a semi-aquatic stage of human evolution include erect posture, loss of body hair, deposition of subcutaneous fat, a completely different heat-regulation system from other primates, and kidneys that function much like those of aquatic mammals. This combination of characteristics, which do not exist in any other terrestrial mammal, would have gradually arisen over several million years. The finding of the bipedal hominin named “Lucy,” dating to 3.5 million years ago, suggested that walking on two legs was the initial major evolutionary adaptation to a semi-aquatic habitat. By the time the Neanderthals appeared some 400,000 to 300,000 years ago, their semi-aquatic lifestyle—swimming, diving, and perhaps hunting for food on land and in the water—may have been firmly part of day-to-day life.

In my opinion, the accumulated fossil, anatomical, and physiological evidence about early hominin evolution points to our human ancestors learning to survive as semi-aquatic creatures in a changing East African environment. After transitioning to bipedalism, ancient hominins had both forelimbs free from aiding in walking, which may have allowed for increasing manual dexterity and skills. Perhaps a marine diet with lipoproteins that are essential for brain development fueled the unique intellectual advances and ecological dominance of Homo sapiens.

Peter Rhys-Evans works in private practice as an otolaryngologist in London at several hospitals including the Harley Street Clinic. He is the founder and chairman of Oracle Cancer Trust, the largest head and neck cancer charity in the UK. Read an excerpt from The Waterside Ape. Follow Rhys-Evans on Twitter @TheWatersideApe.

Human impact on nature ‘dates back millions of years’ (BBC)

Early human ancestors could have stolen food from other animals. Mauricio Antón

By Helen Briggs BBC News

20 January 2020

The impact of humans on nature has been far greater and longer-lasting than we could ever imagine, according to scientists.

Early human ancestors living millions of years ago may have triggered extinctions, even before our species evolved, a study suggests.

A decline in large mammals seen in Eastern Africa may have been due to early humans, researchers propose.

Extinction rates started to increase from around four million years ago.

This coincides with the period when ancient human populations were living in the area, as judged by fossil evidence.

“We are now negatively impacting the world and the species that live in it more than ever before. But this does not mean that we used to live in true harmony with nature in the past,” said study researcher, Dr Søren Faurby of the University of Gothenburg.

“We are extremely successful in monopolising resources today, and our results show that this may have also been the case with our ancestors.”

Getty Images. A lion feasts on the carcass of a rhinoceros in Kenya

The researchers looked at extinction rates of large and small carnivores and how this correlated with environmental changes such as rainfall and temperature.

They also looked at changes in the brain size of human ancestors such as Australopithecus and Ardipithecus.

They found that extinction rates in large carnivores correlated with increased brain size of human ancestors and with vegetation changes, but not with precipitation or temperature changes.

They found the best explanation for carnivore extinction in East Africa was that these animals were in direct competition for food with our ancestors.

They think human ancestors may have stolen freshly-killed prey from the likes of sabre-toothed cats, depriving them of food.

“Our results suggest that substantial anthropogenic influence on biodiversity started millions of years earlier than currently assumed,” the researchers reported in the journal Ecology Letters.

Co-researcher Alexandre Antonelli of the Royal Botanic Gardens, Kew, said the view that our ancestors had little impact on the animals around them is incorrect, as “the impact of our lineage on nature has been far greater and longer-lasting than we could ever imagine”.

A landmark report last year warned that as many as one million species of animals and plants are threatened with extinction in the coming decades.

A more recent study found that the growth of cities, the clearing of forests for farming and the soaring demand for fish had significantly altered nearly three-quarters of the land and more than two-thirds of the oceans.

Interdisciplinary approach yields new insights into human evolution (Vanderbilt University)

Vanderbilt biologist Nicole Creanza takes interdisciplinary approach to human evolution as guest editor of Royal Society journal

The evolution of human biology should be considered part and parcel with the evolution of humanity itself, proposes Nicole Creanza, assistant professor of biological sciences. She is the guest editor of a new themed issue of the Philosophical Transactions of the Royal Society B, the oldest scientific journal in the world, that focuses on an interdisciplinary approach to human evolution.

Stanford professor Marc Feldman and Stanford postdoc Oren Kolodny collaborated with Creanza on the special issue.

“Within the blink of an eye on a geological timescale, humans advanced from using basic stone tools to examining the rocks on Mars; however, our exact evolutionary path and the relative importance of genetic and cultural evolution remain a mystery,” said Creanza, who specializes in the application of computational and theoretical approaches to human and cultural evolution, particularly language development. “Our cultural capacities – to create new ideas, to communicate and learn from one another, and to form vast social networks – together make us uniquely human, but the origins, the mechanisms, and the evolutionary impact of these capacities remain unknown.”

The special issue brings together researchers in biology, anthropology, archaeology, economics, psychology, computer science and more to explore the cultural forces affecting human evolution from a wider perspective than is usually taken.

“Researchers have begun to recognize that understanding non-genetic inheritance, including culture, ecology, the microbiome, and regulation of gene expression, is fundamental to fully comprehending evolution,” said Creanza. “It is essential to understand the dynamics of cultural inheritance at different temporal and spatial scales, to uncover the underlying mechanisms that drive these dynamics, and to shed light on their implications for our current theory of evolution as well as for our interpretation and predictions regarding human behavior.”

In addition to an essay discussing the need for an interdisciplinary approach to human evolution, Creanza included an interdisciplinary study of her own, examining the origins of English’s contribution to Sranan, a creole that emerged in Suriname following an influx of indentured servants from England in the 17th century.

Creanza, along with linguists Andre Sherriah and Hubert Devonish of the University of the West Indies and psychologist Ewart Thomas from Stanford, sought to determine the geographic origins of the English speakers whose regional dialects formed the backbone of Sranan. Their work combined linguistic, historical and genetic approaches to determine that the English speakers who influenced Sranan the most originated largely from two counties on opposite sides of southern England: Bristol, in the west, and Essex, in the east.

“Thus, analyzing the features of modern-day languages might give us new information about events in human history that left few other traces,” Creanza said.

Language is learned in brain circuits that predate humans (Georgetown University)

GEORGETOWN UNIVERSITY MEDICAL CENTER

WASHINGTON — It has often been claimed that humans learn language using brain components that are specifically dedicated to this purpose. Now, new evidence strongly suggests that language is in fact learned in brain systems that are also used for many other purposes and even pre-existed humans, say researchers in PNAS (Early Edition online Jan. 29).

The research combines results from multiple studies involving a total of 665 participants. It shows that children learn their native language and adults learn foreign languages in evolutionarily ancient brain circuits that also are used for tasks as diverse as remembering a shopping list and learning to drive.

“Our conclusion that language is learned in such ancient general-purpose systems contrasts with the long-standing theory that language depends on innately-specified language modules found only in humans,” says the study’s senior investigator, Michael T. Ullman, PhD, professor of neuroscience at Georgetown University School of Medicine.

“These brain systems are also found in animals – for example, rats use them when they learn to navigate a maze,” says co-author Phillip Hamrick, PhD, of Kent State University. “Whatever changes these systems might have undergone to support language, the fact that they play an important role in this critical human ability is quite remarkable.”

The study has important implications not only for understanding the biology and evolution of language and how it is learned, but also for how language learning can be improved, both for people learning a foreign language and for those with language disorders such as autism, dyslexia, or aphasia (language problems caused by brain damage such as stroke).

The research statistically synthesized findings from 16 studies that examined language learning in two well-studied brain systems: declarative and procedural memory.

The results showed that how good we are at remembering the words of a language correlates with how good we are at learning in declarative memory, which we use to memorize shopping lists or to remember the bus driver’s face or what we ate for dinner last night.

Grammar abilities, which allow us to combine words into sentences according to the rules of a language, showed a different pattern. The grammar abilities of children acquiring their native language correlated most strongly with learning in procedural memory, which we use to learn tasks such as driving, riding a bicycle, or playing a musical instrument. In adults learning a foreign language, however, grammar correlated with declarative memory at earlier stages of language learning, but with procedural memory at later stages.

The correlations were large, and were found consistently across languages (e.g., English, French, Finnish, and Japanese) and tasks (e.g., reading, listening, and speaking tasks), suggesting that the links between language and the brain systems are robust and reliable.

The findings have broad research, educational, and clinical implications, says co-author Jarrad Lum, PhD, of Deakin University in Australia.

“Researchers still know very little about the genetic and biological bases of language learning, and the new findings may lead to advances in these areas,” says Ullman. “We know much more about the genetics and biology of the brain systems than about these same aspects of language learning. Since our results suggest that language learning depends on the brain systems, the genetics, biology, and learning mechanisms of these systems may very well also hold for language.”

For example, though researchers know little about which genes underlie language, numerous genes playing particular roles in the two brain systems have been identified. The findings from this new study suggest that these genes may also play similar roles in language. Along the same lines, the evolution of these brain systems, and how they came to underlie language, should shed light on the evolution of language.

Additionally, the findings may lead to approaches that could improve foreign language learning and language problems in disorders, Ullman says.

For example, various pharmacological agents (e.g., the drug memantine) and behavioral strategies (e.g., spacing out the presentation of information) have been shown to enhance learning or retention of information in the brain systems, he says. These approaches may thus also be used to facilitate language learning, including in disorders such as aphasia, dyslexia, and autism.

“We hope and believe that this study will lead to exciting advances in our understanding of language, and in how both second language learning and language problems can be improved,” Ullman concludes.

Human societies evolve along similar paths (University of Exeter)

Societies ranging from ancient Rome and the Inca empire to modern Britain and China have evolved along similar paths, a huge new study shows.

Despite their many differences, societies tend to become more complex in “highly predictable” ways, researchers said.

These processes of development – often happening in societies with no knowledge of each other – include the emergence of writing systems and “specialised” government workers such as soldiers, judges and bureaucrats.

The international research team, including researchers from the University of Exeter, created a new database of historical and archaeological information using data on 414 societies spanning the last 10,000 years. The database is larger and more systematic than anything that has gone before it.

“Societies evolve along a bumpy path – sometimes breaking apart – but the trend is towards larger, more complex arrangements,” said corresponding author Dr Thomas Currie, of the Human Behaviour and Cultural Evolution Group at the University of Exeter’s Penryn Campus in Cornwall.

“Researchers have long debated whether social complexity can be meaningfully compared across different parts of the world. Our research suggests that, despite surface differences, there are fundamental similarities in the way societies evolve.

“Although societies in places as distant as Mississippi and China evolved independently and followed their own trajectories, the structure of social organisation is broadly shared across all continents and historical eras.”

The measures of complexity examined by the researchers were divided into nine categories. These included:

  • Population size and territory
  • Number of control/decision levels in administrative, religious and military hierarchies
  • Information systems such as writing and record keeping
  • Literature on specialised topics such as history, philosophy and fiction
  • Economic development

The researchers found that these different features showed strong statistical relationships, meaning that variation in societies across space and time could be captured by a single measure of social complexity.

This measure can be thought of as “a composite measure of the various roles, institutions, and technologies that enable the coordination of large numbers of people to act in a politically unified manner”.
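One standard way to build such a composite measure is principal component analysis, which finds the single axis explaining the most shared variation across features. The sketch below is only an illustration: it uses invented numbers standing in for the Seshat-style features, not the actual databank, and shows one dominant component emerging from nine correlated variables.

```python
import numpy as np

# Toy society-by-feature matrix: rows are societies, columns are
# complexity characteristics (e.g., log population, hierarchy levels,
# writing, specialised literature, economic development). All values
# here are synthetic, not from the Seshat databank.
rng = np.random.default_rng(0)
latent = rng.normal(size=50)                      # one underlying "complexity" factor
X = np.outer(latent, rng.uniform(0.5, 1.5, 9))    # 9 features that all track it
X += rng.normal(scale=0.3, size=X.shape)          # measurement noise

# Principal component analysis: centre the data, eigendecompose the covariance.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

print(f"Variance explained by the first component: {explained[0]:.0%}")
# Each society's score on that component is its composite complexity measure.
scores = Xc @ eigvecs[:, order[0]]
```

In the published analysis, it is this kind of single dominant component, capturing most of the variation across all 414 societies, that justifies speaking of one dimension of social complexity.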

Dr Currie said learning lessons from human history could have practical uses.

“Understanding the ways in which societies evolve over time and in particular how humans are able to create large, cohesive groups is important when we think about state building and development,” he said.

“This study shows how the sciences and humanities, which have not always seen eye-to-eye, can actually work together effectively to uncover general rules that have shaped human history.”

The new database of historical and archaeological information is known as “Seshat: Global History Databank” and its construction was led by researchers from the University of Exeter, the University of Connecticut, the University of Oxford, Trinity College Dublin and the Evolution Institute. More than 70 expert historians and archaeologists have helped in the data collection process.

The paper, published in Proceedings of the National Academy of Sciences, is entitled: “Quantitative historical analysis uncovers a single dimension of complexity that structures global variation in human social organisation.”

Scientists Seek to Update Evolution (Quanta Magazine)

Recent discoveries have led some researchers to argue that the modern evolutionary synthesis needs to be amended. 

By Carl Zimmer. November 22, 2016

Douglas Futuyma, a biologist at Stony Brook University, defends the “Modern Synthesis” of evolution at the Royal Society earlier this month.

Kevin Laland looked out across the meeting room at a couple hundred people gathered for a conference on the future of evolutionary biology. A colleague sidled up next to him and asked how he thought things were going.

“I think it’s going quite well,” Laland said. “It hasn’t gone to fisticuffs yet.”

Laland is an evolutionary biologist who works at the University of St. Andrews in Scotland. On a chilly gray November day, he came down to London to co-host a meeting at the Royal Society called “New Trends in Evolutionary Biology.” A motley crew of biologists, anthropologists, doctors, computer scientists, and self-appointed visionaries packed the room. The Royal Society is housed in a stately building overlooking St. James’s Park. Today the only thing for Laland to see out of the tall meeting-room windows was scaffolding and gauzy tarps set up for renovation work. Inside, Laland hoped, another kind of renovation would be taking place.

In the mid-1900s, biologists updated Darwin’s theory of evolution with new insights from genetics and other fields. The result is often called the Modern Synthesis, and it has guided evolutionary biology for over 50 years. But in that time, scientists have learned a tremendous amount about how life works. They can sequence entire genomes. They can watch genes turn on and off in developing embryos. They can observe how animals and plants respond to changes in the environment.

As a result, Laland and a like-minded group of biologists argue that the Modern Synthesis needs an overhaul. It has to be recast as a new vision of evolution, which they’ve dubbed the Extended Evolutionary Synthesis. Other biologists have pushed back hard, saying there is little evidence that such a paradigm shift is warranted.

This meeting at the Royal Society was the first public conference where Laland and his colleagues could present their vision. But Laland had no interest in merely preaching to the converted, and so he and his fellow organizers also invited prominent evolutionary biologists who are skeptical about the Extended Evolutionary Synthesis.

Both sides offered their arguments and critiques in a civil way, but sometimes you could sense the tension in the room — the punctuations of tsk-tsks, eye-rolling, and partisan bursts of applause.

But no fisticuffs. At least not yet.

Making Evolution as We Know It

Every science passes through times of revolution and of business as usual. After Galileo and Newton dragged physics out of its ancient errors in the 1600s, it rolled forward from one modest advance to the next until the early 1900s. Then Einstein and other scientists established quantum physics, relativity and other new ways of understanding the universe. None of them claimed that Newton was wrong. But it turns out there’s much more to the universe than matter in motion.

Evolutionary biology has had revolutions of its own. The first, of course, was launched by Charles Darwin in 1859 with his book On the Origin of Species. Darwin wove together evidence from paleontology, embryology and other sciences to show that living things were related to one another by common descent. He also introduced a mechanism to drive that long-term change: natural selection. Each generation of a species was full of variations. Some variations helped organisms survive and reproduce, and those were passed down, thanks to heredity, to the next generation.

Darwin inspired biologists all over the world to study animals and plants in a new way, interpreting their biology as adaptations produced over many generations. But he succeeded in this despite having no idea what a gene was. It wasn’t until the 1930s that geneticists and evolutionary biologists came together and recast evolutionary theory. Heredity became the transmission of genes from generation to generation. Variations were due to mutations, which could be shuffled into new combinations. New species arose when populations built up mutations that made interbreeding impossible.

In 1942, the British biologist Julian Huxley described this emerging framework in a book called Evolution: The Modern Synthesis. Today, scientists still call it by that name. (Sometimes they refer to it instead as neo-Darwinism, although that’s actually a confusing misnomer. The term “neo-Darwinism” was coined in the late 1800s, to refer to biologists who were advancing Darwin’s ideas in Darwin’s own lifetime.)

The Modern Synthesis proved to be a powerful tool for asking questions about nature. Scientists used it to make a vast range of discoveries about the history of life, such as why some people are prone to genetic disorders like sickle-cell anemia and why pesticides sooner or later fail to keep farm pests in check. But starting not long after the formation of the Modern Synthesis, various biologists would complain from time to time that it was too rigid. It wasn’t until the past few years, however, that Laland and other researchers got organized and made a concerted effort to formulate an extended synthesis that might take its place.

The researchers don’t argue that the Modern Synthesis is wrong — just that it doesn’t capture the full richness of evolution. Organisms inherit more than just genes, for example: They can inherit other cellular molecules, as well as behaviors they learn and the environments altered by their ancestors. Laland and his colleagues also challenge the pre-eminent place that natural selection gets in explanations for how life got to be the way it is. Other processes can influence the course of evolution, too, from the rules of development to the environments in which organisms have to live.

“It’s not simply bolting more mechanisms on what we already have,” said Laland. “It requires you to think of causation in a different way.”

Adding to Darwin

Eva Jablonka, a biologist at Tel Aviv University, used her talk to explore the evidence for a form of heredity beyond genes.

Our cells use a number of special molecules to control which of their genes make proteins. In a process called methylation, for example, cells put caps on their DNA to keep certain genes shut down. When cells divide, they can reproduce the same caps and other controls on the new DNA. Certain signals from the environment can cause cells to change these so-called “epigenetic” controls, allowing organisms to adjust their behavior to new challenges.

Some studies indicate that — under certain circumstances — an epigenetic change in a parent may get passed down to its offspring. And those children may pass down this altered epigenetic profile to their children. This would be a kind of heredity beyond genes.

The evidence for this effect is strongest in plants. In one study, researchers were able to trace altered methylation patterns across 31 generations in a plant called Arabidopsis. And this sort of inheritance can make a meaningful difference in how an organism works. In another study, researchers found that inherited methylation patterns could change the flowering time of Arabidopsis, as well as the size of its roots. The variation that these patterns created was even bigger than what ordinary mutations caused.

After presenting evidence like this, Jablonka argued that epigenetic differences could determine which organisms survived long enough to reproduce. “Natural selection could work on this system,” she said.
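A toy simulation makes the logic concrete (this is an illustration of the idea, not anything presented at the meeting): an “epiallele” below is inherited and selected like a gene, but flips state at a rate far above typical DNA mutation rates.

```python
import random

# Toy model of selection acting on a heritable epigenetic mark.
# The 5% fitness advantage and 2% flip rate are assumptions for
# illustration, not measured values.
N, GENERATIONS = 1000, 100
FITNESS = {True: 1.05, False: 1.00}   # carriers of the mark get a 5% advantage
FLIP = 0.02                           # per-generation epigenetic instability

pop = [False] * N
pop[0] = True                         # the mark arises in a single individual

for _ in range(GENERATIONS):
    weights = [FITNESS[m] for m in pop]
    offspring = random.choices(pop, weights=weights, k=N)    # selection step
    pop = [not m if random.random() < FLIP else m            # marks flip on or off
           for m in offspring]

print(f"epiallele frequency after {GENERATIONS} generations: {sum(pop)/N:.2f}")
```

With the assumed advantage, the mark settles at a frequency well above the 50% that random flipping alone would produce; that gap is natural selection working on the epigenetic system, which is all Jablonka’s argument requires.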

While natural selection is an important force in evolution, the speakers at the meeting presented evidence for how it could be constrained, or biased in a particular direction. Gerd Müller, a University of Vienna biologist, offered an example from his own research on lizards. A number of species of lizards have evolved feet that have lost some toes. Some have only four toes, while others have just one, and some have lost their feet altogether.

The Modern Synthesis, Müller argued, leads scientists to look at these arrangements as simply the product of natural selection, which favors one variant over others because it has a survival advantage. But that approach doesn’t work if you ask what the advantage was for a particular species to lose the first toe and last toe in its foot, instead of some other pair of toes.

“The answer is, there is no real selective advantage,” said Müller.

The key to understanding why lizards lose particular toes is found in the way that lizard embryos develop toes in the first place. A bud sprouts off the side of the body, and then five digits emerge. But the toes always appear in the same sequence. And when lizards lose their toes through evolution, they lose them in the reverse order. Müller suspects this constraint exists because mutations cannot create every possible variation. Some combinations of toes are thus off-limits, and natural selection can never select them in the first place.

Development may constrain evolution. On the other hand, it also provides animals and plants with remarkable flexibility. Sonia Sultan, an evolutionary ecologist from Wesleyan University, offered a spectacular case in point during her talk, describing a plant she studies in the genus Polygonum that takes the common name “smartweed.”

The Modern Synthesis, Sultan said, would lead you to look at the adaptations in a smartweed plant as the fine-tuned product of natural selection. If plants grow in low sunlight, then natural selection will favor plants with genetic variants that let them thrive in that environment — for example, by growing broader leaves to catch more photons. Plants that grow in bright sunlight, on the other hand, will evolve adaptations that let them thrive in those different conditions.

“It’s a commitment to that view that we’re here to confront,” Sultan said.

If you raise genetically identical smartweed plants under different conditions, Sultan showed, you’ll end up with plants that may look like they belong to different species.

For one thing, smartweed plants adjust the size of their leaves to the amount of sunlight they get. In bright light, the plants grow narrow, thick leaves, but in low light, the leaves become broad and thin. In dry soil, the plants send roots down deep in search of water, while in flooded soil, they grow shallow, hairlike roots that stay near the surface.

Scientists at the meeting argued that this flexibility — known as plasticity — can itself help drive evolution. It allows plants to spread into a range of habitats, for example, where natural selection can then adapt their genes. And in another talk, Susan Antón, a paleoanthropologist at New York University, said that plasticity may play a significant role in human evolution that’s gone underappreciated till now. That’s because the Modern Synthesis has strongly influenced the study of human evolution for the past half century.

Paleoanthropologists tended to treat differences in fossils as the result of genetic differences. That allowed them to draw an evolutionary tree of humans and their extinct relatives. This approach has a lot to show for it, Antón acknowledged. By the 1980s, scientists had figured out that our early ancient relatives were short and small-brained up to about two million years ago. Then one lineage got tall and evolved big brains. That transition marked the origin of our genus, Homo.

But sometimes paleoanthropologists would find variations that were harder to make sense of. Two fossils might look in some ways like they should be in the same species but look too different in other respects. Scientists would usually dismiss those variations as being caused by the environment. “We wanted to get rid of all that stuff and get down to their essence,” Antón said.

But that stuff is now too abundant to ignore. Scientists have found a dizzying variety of humanlike fossils dating from 1.5 to 2.5 million years ago. Some are tall, and some are short. Some have big brains and some have small ones. They all have some features of Homo in their skeletons, but each has a confusing mix-and-match assortment of traits.

Antón thinks that the Extended Evolutionary Synthesis can help scientists make sense of this profound mystery. In particular, she thinks that her colleagues should take plasticity seriously as an explanation for the weird diversity of early Homo fossils.

To support this idea, Antón pointed out that living humans have their own kinds of plasticity. The quality of food a woman gets while she’s pregnant can influence the size and health of her baby, and those influences can last until adulthood. What’s more, the size of a woman — influenced in part by her own mother’s diet — can influence her own children. Biologists have found that women with longer legs tend to have larger children, for example.

Antón proposed that the weird variations in the fossil record might be even more dramatic examples of plasticity. All these fossils date to a period when Africa’s climate fell into wild swings. Droughts and abundant rains would have changed the food supply in different regions, perhaps causing early Homo to develop differently.

The Extended Evolutionary Synthesis may also help make sense of another chapter in our history: the dawn of agriculture. In Asia, Africa and the Americas, people domesticated crops and livestock. Melinda Zeder, an archaeologist at the Smithsonian Institution, gave a talk at the meeting about the long struggle to understand how this transformation unfolded.

Before people farmed, they foraged for food and hunted wild game. Zeder explained how many scientists treat the behavior of the foragers in a very Modern Synthesis way: as finely tuned by natural selection to deliver the biggest payoff for their effort to find food.

The trouble is that it’s hard to see how such a forager would ever switch to farming. “You don’t get the immediate gratification of grabbing some food and putting it in your mouth,” Zeder told me.

Some researchers suggested that the switch to agriculture might have occurred during a climate shift, when it got harder to find wild plants. But Zeder and other researchers have actually found no evidence of such a crisis when agriculture arose.

Zeder argues that there’s a better way of thinking about this transition. Humans are not passive zombies trying to survive in a fixed environment. They are creative thinkers who can change the environment itself. And in the process, they can steer evolution in a new direction.

Scientists call this process niche construction, and many species do it. The classic case is a beaver. It cuts down trees and makes a dam, creating a pond. In this new environment, some species of plants and animals will do better than others. And they will adapt to their environment in new ways. That’s true not just for the plants and animals that live around a beaver pond, but for the beaver itself.

When Zeder first learned about niche construction, she says, it was a revelation. “Little explosions were going off in my head,” she told me. The archaeological evidence she and others had gathered made sense as a record of how humans changed their own environment.

Early foragers show signs of having moved wild plants away from their native habitats to have them close at hand, for example. As they watered the plants and protected them from herbivores, the plants adapted to their new environment. Weedy species also moved in and became crops of their own. Certain animals adapted to the environment as well, becoming dogs, cats and other domesticated species.

Gradually, the environment changed from sparse patches of wild plants to dense farm fields. That environment didn’t just drive the evolution of the plants. It also began to drive the cultural evolution of the farmers. Instead of wandering as nomads, they settled down in villages so that they could work the land around them. Society became more stable because children received an ecological inheritance from their parents. And so civilization began.

Niche construction is just one of many concepts from the Extended Evolutionary Synthesis that can help make sense of domestication, Zeder said. During her talk, she presented slide after slide of predictions it provides, about everything from the movements of early foragers to the pace of plant evolution.

“It felt like an infomercial for the Extended Evolutionary Synthesis,” Zeder told me later with a laugh. “But wait! You can get steak knives!”

The Return of Natural Selection

Among the members of the audience was a biologist named David Shuker. After listening quietly for a day and a half, the University of St Andrews researcher had had enough. At the end of a talk, he shot up his hand.

The talk had been given by Denis Noble, a physiologist with a mop of white hair and a blue blazer. Noble, who has spent most of his career at Oxford, said he started out as a traditional biologist, seeing genes as the ultimate cause of everything in the body. But in recent years he had switched his thinking. He spoke of the genome not as a blueprint for life but as a sensitive organ, detecting stress and rearranging itself to cope with challenges. “I’ve been on a long journey to this view,” Noble said.

To illustrate this new view, Noble discussed an assortment of recent experiments. One of them was published last year by a team at the University of Reading. They did an experiment on bacteria that swim by spinning their long tails.

First, the scientists cut a gene out of the bacteria’s DNA that’s essential for building tails. The researchers then dropped these tailless bacteria into a petri dish with a meager supply of food. Before long, the bacteria ate all the food in their immediate surroundings. If they couldn’t move, they died. In less than four days in these dire conditions, the bacteria were swimming again. On close inspection, the team found they were growing new tails.

“This strategy is to produce rapid evolutionary genome change in response to the unfavorable environment,” Noble declared to the audience. “It’s a self-maintaining system that enables a particular characteristic to occur independent of the DNA.”

That didn’t sound right to Shuker, and he was determined to challenge Noble after the applause died down.

“Could you comment at all on the mechanism underlying that discovery?” Shuker asked.

Noble stammered in reply. “The mechanism in general terms, I can, yes…” he said, and then started talking about networks and regulation and a desperate search for a solution to a crisis. “You’d have to go back to the original paper,” he then said.

While Noble was struggling to respond, Shuker went back to the paper on an iPad. And now he read the abstract in a booming voice.

“‘Our results demonstrate that natural selection can rapidly rewire regulatory networks,’” Shuker said. He put down the iPad. “So it’s a perfect, beautiful example of rapid neo-Darwinian evolution,” he declared.

Shuker distilled the feelings of a lot of skeptics I talked to at the conference. The high-flying rhetoric about a paradigm shift was, for the most part, unwarranted, they said. Nor were these skeptics limited to the peanut gallery. Several of them gave talks of their own.

“I think I’m expected to represent the Jurassic view of evolution,” said Douglas Futuyma when he got up to the podium. Futuyma is a soft-spoken biologist at Stony Brook University in New York and the author of a leading textbook on evolution. In other words, he was the target of many complaints during the meeting that textbooks paid little heed to things like epigenetics and plasticity. In effect, Futuyma had been invited to tell his colleagues why those concepts were ignored.

“We must recognize that the core principles of the Modern Synthesis are strong and well-supported,” Futuyma declared. Not only that, he added, but the kinds of biology being discussed at the Royal Society weren’t actually all that new. The architects of the Modern Synthesis were already talking about them over 50 years ago. And there’s been a lot of research guided by the Modern Synthesis to make sense of them.

Take plasticity. The genetic variations in an animal or a plant govern the range of forms into which an organism can develop. Mutations can alter that range. And mathematical models of natural selection show how it can favor some kinds of plasticity over others, as the sketch below illustrates.
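A minimal version of such a model compares long-run fitness, which in a fluctuating environment is the geometric mean across generations, for a fixed phenotype and a costly plastic one. All parameters below are invented for illustration; this is a generic textbook-style model, not one from the meeting.

```python
import numpy as np

# Generic illustration: when does selection favour plasticity?
# Long-run growth in a fluctuating environment is governed by the
# geometric mean of fitness across generations.
rng = np.random.default_rng(1)
env = rng.choice([-1.0, 1.0], size=10_000)       # the optimum flips between two states

def fitness(phenotype, optimum):
    # Gaussian stabilising selection around the current optimum.
    return np.exp(-0.5 * (phenotype - optimum) ** 2)

def geometric_mean(w):
    return np.exp(np.log(w).mean())

w_fixed = fitness(0.0, env)                      # one compromise phenotype
w_plastic = fitness(0.8 * env, env) * 0.95       # tracks the optimum, at a 5% cost

print(f"fixed:   {geometric_mean(w_fixed):.3f}")   # ~0.61
print(f"plastic: {geometric_mean(w_plastic):.3f}") # ~0.93
```

Hold the environment constant and a fixed phenotype matched to it wins instead, since the plastic genotype still pays its cost; that is the sense in which selection favors some kinds of plasticity over others.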

If the Extended Evolutionary Synthesis was so superfluous, then why was it gaining enough attention to warrant a meeting at the Royal Society? Futuyma suggested that its appeal was emotional rather than scientific. It made life an active force rather than the passive vehicle of mutations.

“I think what we find emotionally or aesthetically more appealing is not the basis for science,” Futuyma said.

Still, he went out of his way to say that the kind of research described at the meeting could lead to some interesting insights about evolution. But those insights would only arise with some hard work that leads to hard data. “There have been enough essays and position papers,” he said.

Some members in the audience harangued Futuyma a bit. Other skeptical speakers sometimes got exasperated by arguments they felt didn’t make sense. But the meeting managed to reach its end on the third afternoon without fisticuffs.

“This is likely the first of many, many meetings,” Laland told me. In September, a consortium of scientists in Europe and the United States received $11 million in funding (including $8 million from the John Templeton Foundation) to run 22 studies on the Extended Evolutionary Synthesis.

Many of these studies will test predictions that have emerged from the synthesis in recent years. They will see, for example, if species that build their own environments — spider webs, wasp nests and so on — evolve into more species than ones that don’t. They will look at whether more plasticity allows species to adapt faster to new environments.

“It’s doing the research, which is what our critics are telling us to do,” said Laland. “Go find the evidence.”

Correction: An earlier version of this article misidentified the photograph of Andy Whiten as Gerd Müller.

This article was reprinted on TheAtlantic.com.

Large human brain evolved as a result of ‘sizing each other up’ (Science Daily)

Date:
August 12, 2016
Source:
Cardiff University
Summary:
Humans have evolved a disproportionately large brain as a result of sizing each other up in large cooperative social groups, researchers have proposed.

The brains of humans enlarged over time thanks to our sizing up the competition, say scientists. Credit: © danheighton / Fotolia

Humans have evolved a disproportionately large brain as a result of sizing each other up in large cooperative social groups, researchers have proposed.

A team led by computer scientists at Cardiff University suggest that the challenge of judging a person’s relative standing and deciding whether or not to cooperate with them has promoted the rapid expansion of human brain size over the last 2 million years.

In a study published in Scientific Reports, the team, which also includes leading evolutionary psychologist Professor Robin Dunbar from the University of Oxford, specifically found that evolution favors those who prefer to help out others who are at least as successful as themselves.

Lead author of the study Professor Roger Whitaker, from Cardiff University’s School of Computer Science and Informatics, said: “Our results suggest that the evolution of cooperation, which is key to a prosperous society, is intrinsically linked to the idea of social comparison — constantly sizing each other up and making decisions as to whether we want to help them or not.

“We’ve shown that over time, evolution favors strategies to help those who are at least as successful as themselves.”

In their study, the team used computer modelling to run hundreds of thousands of simulations, or ‘donation games’, to unravel the complexities of decision-making strategies for simplified humans and to establish why certain types of behaviour among individuals begin to strengthen over time.

In each round of the donation game, two simulated players were randomly selected from the population. The first player then made a decision on whether or not they wanted to donate to the other player, based on how they judged their reputation. If the player chose to donate, they incurred a cost and the receiver was given a benefit. Each player’s reputation was then updated in light of their action, and another game was initiated.
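One round of such a game takes only a few lines of code. The sketch below is a simplified stand-in for the study’s actual model: the bare “image score” reputation, the costs, and the population size are all illustrative assumptions, wired to the comparison rule the paper found to win, namely donating only to players doing at least as well as yourself.

```python
import random

# Minimal sketch of repeated donation games with a social-comparison rule.
# COST, BENEFIT, the image-score reputation, and N are illustrative
# assumptions, not the study's actual parameters.
N, ROUNDS = 100, 100_000
COST, BENEFIT = 1.0, 2.0

payoff = [0.0] * N
reputation = [0] * N                  # crude image score, updated after each decision

for _ in range(ROUNDS):
    donor, recipient = random.sample(range(N), 2)
    if reputation[recipient] >= reputation[donor]:   # help those at least as "successful"
        payoff[donor] -= COST                        # donating costs the donor...
        payoff[recipient] += BENEFIT                 # ...and benefits the recipient
        reputation[donor] += 1                       # generosity raises reputation
    else:
        reputation[donor] -= 1                       # refusing lowers it

print(f"mean payoff per player: {sum(payoff) / N:.1f}")
```

In the study’s full version, payoffs then feed an evolutionary step in which better-earning strategies leave more descendants; repeating that over many generations is what shows comparison-based donation rules spreading through the population.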

Compared to other species, including our closest relatives, chimpanzees, the human brain accounts for a far larger share of body weight. Humans also have the largest cerebral cortex of all mammals, relative to the size of their brains. This area houses the cerebral hemispheres, which are responsible for higher functions like memory, communication and thinking.

The research team propose that making relative judgements through helping others has been influential for human survival, and that the complexity of constantly assessing individuals has been a sufficiently difficult task to promote the expansion of the brain over many generations of human reproduction.

Professor Robin Dunbar, who previously proposed the social brain hypothesis, said: “According to the social brain hypothesis, the disproportionately large brain size in humans exists as a consequence of humans evolving in large and complex social groups.

“Our new research reinforces this hypothesis and offers an insight into the way cooperation and reward may have been instrumental in driving brain evolution, suggesting that the challenge of assessing others could have contributed to the large brain size in humans.”

According to the team, the research could also have future implications in engineering, specifically where intelligent and autonomous machines need to decide how generous they should be towards each other during one-off interactions.

“The models we use can be executed as short algorithms called heuristics, allowing devices to make quick decisions about their cooperative behaviour,” Professor Whitaker said.

“New autonomous technologies, such as distributed wireless networks or driverless cars, will need to self-manage their behaviour but at the same time cooperate with others in their environment.”


Journal Reference:

  1. Roger M. Whitaker, Gualtiero B. Colombo, Stuart M. Allen, Robin I. M. Dunbar. A Dominant Social Comparison Heuristic Unites Alternative Mechanisms for the Evolution of Indirect Reciprocity. Scientific Reports, 2016; 6: 31459. DOI: 10.1038/srep31459

The little creature that defies God (El País)

A marine organism shows why human beings are not at the top of evolution

MANUEL ANSEDE

Barcelona 13 JUN 2016 – 21:07 CEST

The biologists Ricard Albalat and Cristian Cañestro, with specimens of ‘Oikopleura’. JUAN BARBOSA

“Only chance can be interpreted as a message. That which happens out of necessity, that which is expected and repeats itself every day, is mute. Only chance speaks to us,” wrote Milan Kundera in The Unbearable Lightness of Being. And something speaks, or rather shouts, on a beach in Badalona, near Barcelona: the one dominated by the Pont del Petroli, or Petroleum Pier. Petroleum products were unloaded along this 250-meter pier, which juts into the Mediterranean, until the end of the 20th century. And at its foot has stood, since 1870, the Anís del Mono factory, maker of the liqueur whose label features a monkey with the face of Charles Darwin, a nod to the theory of evolution, which was stirring controversy at the time.

Today, the Pont del Petroli is a handsome lookout with a bronze statue dedicated to the Darwin-faced monkey. And, by one of those chances that speak, its regulars include a team of evolutionary biologists from the Department of Genetics at the University of Barcelona. The scientists walk out along the gangway over the sea and lower a bucket to catch a marine animal, Oikopleura dioica, barely three millimeters long yet equipped with a mouth, an anus, a brain and a heart. It looks insignificant, but, like Darwin, it shakes the discourse of religions, putting human beings in the place that belongs to them: alongside the rest of the animals.

“We have been badly influenced by religion into thinking that we were at the top of evolution. In truth, we are on the same level as the other animals,” says the biologist Cristian Cañestro. He and his colleague Ricard Albalat run one of only three scientific centers in the world devoted to the study of Oikopleura dioica. The other two are in Norway and Japan. The Spanish center is a small cold room, with hundreds of practically invisible specimens kept in containers of water, in a corner of the Faculty of Biology of the University of Barcelona.

The marine organism ‘Oikopleura dioica’ suggests that the loss of ancestral genes, shared with humans, may be the motor of evolution

“The view until now was that, as we evolved, we gained in complexity by acquiring genes. That was the thinking when the first genomes were sequenced: the fly, the worm and the human. But we have seen that it is not so. Most of our genes are also present in jellyfish. Our common ancestor already had them. It is not that we gained genes; it is that the others lost them. Genetic complexity is ancestral,” says Cañestro.

In 2006, the biologist was investigating the role of a vitamin A derivative, retinoic acid, in embryonic development. This substance tells the cells of an embryo what to do to become an adult body. Retinoic acid activates the genes needed, for example, to form the limbs, heart, eyes and ears of animals. Cañestro was studying this process in Oikopleura. And it left him open-mouthed.

A female ‘Oikopleura dioica’ full of eggs. CAÑESTRO & ALBALAT LAB

“Animals use a large number of genes to synthesize retinoic acid. I noticed that Oikopleura dioica was missing one of those genes. Then I saw that others were missing too. We did not find a single one,” he recalls. This three-millimeter animal builds its heart, inexplicably, without retinoic acid. “The day you see a car moving without wheels, your perception of wheels changes,” says Cañestro.

The last common ancestor of us and this tiny ocean dweller lived some 500 million years ago. Since then, Oikopleura has lost 30% of the genes that united us. And it has done so successfully. Wade into any beach in the world and there it will be, all around your body. In the battle of natural selection, the Oikopleura have won. Their density reaches 20,000 individuals per cubic meter of water in some marine ecosystems. They are losers, but only of genes.

Our last common ancestor lived 500 million years ago. Since then, ‘Oikopleura’ has lost 30% of the genes that united us

Albalat and Cañestro have just published, in the journal Nature Reviews Genetics, an article analyzing gene loss as a motor of evolution. Their text has drawn worldwide interest and was recommended by F1000Prime, an international service that flags the best papers in biology and medicine. The article opens with a phrase from the Roman emperor Marcus Aurelius, the Stoic philosopher: “Loss is nothing else but change, and change is Nature’s delight.”

The two biologists argue that gene loss may even have been essential to the origin of the human species. “Chimpanzees and humans share more than 98% of their genome. Perhaps we should look for the differences in the genes that were lost in different ways during the evolution of humans and the other primates. Some studies suggest that the loss of one gene made our jaw muscles smaller, which allowed the volume of our skull to increase,” says Albalat. Perhaps losing genes made us more intelligent than the rest of mortals.

Researchers in the laboratory of Cristian Cañestro and Ricard Albalat. UB

In 2012, a study by the American geneticist Daniel MacArthur showed that, on average, any healthy person has 20 genes that are switched off. And this apparently does not matter. Albalat and Cañestro, of the University of Barcelona’s Biodiversity Research Institute (IRBio), cite two well-studied examples. In some people, the genes encoding the proteins CCR5 and DUFFY have been knocked out by mutations. These are the proteins used, respectively, by HIV and by the parasite that causes malaria to enter cells. The loss of these genes makes humans resistant to those diseases.

In Cañestro and Albalat’s laboratory hangs a poster that mimics the one for Quentin Tarantino’s film Reservoir Dogs: the scientists and other members of their team appear dressed in white shirts and black ties. The montage is called Reservoir Oiks, an allusion to Oikopleura. The two biologists believe the marine organism will make it possible to pose, and answer, new questions about our shared instruction manual: the genome.

‘Oikopleura’ makes it possible to study which genes are essential: why some mutations are irrelevant while others have devastating effects on our health

The Oikopleura brain has about 100 neurons; the human brain, 86 billion. But we are far more alike than we seem at first sight. Between 60% and 80% of human gene families have a clear representative in the Oikopleura genome. “This animal lets us study which human genes are essential,” says Albalat. In other words: why some mutations are irrelevant while others have terrible effects on our health.

Living things possess a cellular system that repairs mutations arising in DNA. Oikopleura dioica has lost 16 of the 83 ancestral genes that regulate this process. That incapacity for self-repair could explain its extreme gene loss, according to the Nature Reviews Genetics article.

Cañestro’s eyes light up when he talks about these absences. Genes usually act in groups to carry out a function. If, of a known group of eight genes, seven are missing in Oikopleura because the function was lost, the persistence of the eighth gene can reveal a second, essential function that had gone unnoticed. That gene is like a road junction: one highway is dismantled, yet it survives because it is fundamental to another. “That second function was already present in the common ancestor and may be important in humans,” says Cañestro.

“There are no higher or lower animals. Our Lego pieces are basically the same, although with them we can build different things,” he says. Think about your place in the world the next time you dive into the sea. That white snow floating in the water, visible against the light, is Oikopleura droppings.

Ancestors of Modern Humans Interbred With Extinct Hominins, Study Finds (N.Y.Times)

Carl Zimmer

Skulls of Neanderthal man. Credit: European Pressphoto Agency

The ancestors of modern humans interbred with Neanderthals and another extinct line of humans known as the Denisovans at least four times in the course of prehistory, according to an analysis of global genomes published Thursday in the journal Science. 

The interbreeding may have given modern humans genes that bolstered immunity to pathogens, the authors concluded.

“This is yet another genetic nail in the coffin of our oversimplistic models of human evolution,” said Carles Lalueza-Fox, a research scientist at the Institute of Evolutionary Biology in Barcelona, Spain, who was not involved in the study.

The new study expands on a series of findings in recent years showing that the ancestors of modern humans once shared the planet with a surprising number of near relatives — lineages like the Neanderthals and Denisovans that became extinct tens of thousands of years ago.

Before disappearing, however, they interbred with our forebears on at least several occasions. Today, we carry DNA from these encounters.

The first clues to ancient interbreeding surfaced in 2010, when scientists discovered that some modern humans — mostly Europeans — carried DNA that matched material recovered from Neanderthal fossils.

Later studies showed that the forebears of modern humans first encountered Neanderthals after expanding out of Africa more than 50,000 years ago.

But the Neanderthals were not the only extinct humans that our own ancestors found. A finger bone discovered in a Siberian cave, called Denisova, yielded DNA from yet another group of humans.

Research later indicated that all three groups — modern humans, Neanderthals and Denisovans — shared a common ancestor who lived roughly 600,000 years ago. And, perhaps no surprise, some ancestors of modern humans also interbred with Denisovans.

Some of their DNA has survived in people in Melanesia, a region of the Pacific that includes New Guinea and the islands around it.

Those initial discoveries left major questions unanswered, such as how often our ancestors interbred with Neanderthals and Denisovans. Scientists have developed new ways to study the DNA of living people to tackle these mysteries.

Joshua M. Akey, a geneticist at the University of Washington, and his colleagues analyzed a database of 1,488 genomes from people around the world. The scientists added 35 genomes from people in New Britain and other Melanesian islands in an effort to learn more about Denisovans in particular.

The researchers found that all of the non-Africans in their study had Neanderthal DNA, while the Africans had very little or none. That finding supported previous studies.

But when Dr. Akey and his colleagues compared DNA from modern Europeans, East Asians and Melanesians, they found that each population carried its own distinctive mix of Neanderthal genes.

The best explanation for these patterns, the scientists concluded, was that the ancestors of modern humans acquired Neanderthal DNA on three occasions.

The first encounter happened when the common ancestor of all non-Africans interbred with Neanderthals.

The second occurred among the ancestors of East Asians and Europeans, after the ancestors of Melanesians split off. Later, the ancestors of East Asians — but not Europeans — interbred a third time with Neanderthals.

Earlier studies had hinted at the possibility that the forebears of modern humans had multiple encounters with Neanderthals, but hard data had been lacking.

“A lot of people have been arguing for that, but now they’re really providing the evidence for it,” said Rasmus Nielsen, a geneticist at the University of California, Berkeley, who was not involved in the new study.
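The study relied on genome-scanning statistics developed for this question, but the flavor of such analyses can be seen in the widely used ABBA-BABA test, or D-statistic. The sketch below runs it on invented genotype data (all rates and counts are made up for illustration, and real analyses use population samples rather than single genomes).

```python
import numpy as np

# Toy ABBA-BABA (D-statistic) test for archaic admixture. Sites are
# biallelic, coded 0 (ancestral) / 1 (derived); the outgroup defines
# the ancestral state. All frequencies here are invented.
rng = np.random.default_rng(2)
n_sites = 100_000

african = rng.binomial(1, 0.1, n_sites)
neandertal = rng.binomial(1, 0.1, n_sites)
# Simulate introgression: the non-African copies the Neanderthal
# allele at 3% of sites.
leak = rng.random(n_sites) < 0.03
non_african = np.where(leak, neandertal, rng.binomial(1, 0.1, n_sites))
outgroup = np.zeros(n_sites, dtype=int)          # e.g., chimpanzee

abba = np.sum((african == 0) & (non_african == 1) & (neandertal == 1) & (outgroup == 0))
baba = np.sum((african == 1) & (non_african == 0) & (neandertal == 1) & (outgroup == 0))
D = (abba - baba) / (abba + baba)
print(f"ABBA={abba}, BABA={baba}, D={D:.3f}")    # D > 0 signals gene flow
```

A significantly positive D means the non-African genome shares more derived alleles with the Neanderthal than the African genome does, which is the basic signature of archaic gene flow.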

The Melanesians took a different course. After a single interbreeding with Neanderthals, Dr. Akey found, their ancestors went on to interbreed just once with Denisovans as well.

Where that encounter could have taken place remains an enigma. The only place Denisovan remains have been found is Siberia, a long way from New Guinea.

It is possible that Denisovans ranged down to Southeast Asia, Dr. Akey said, crossing paths with modern humans who later settled in Melanesia.

Dr. Akey and his colleagues also identified some regions of Neanderthal and Denisovan DNA that became more common in modern humans as generations passed, suggesting that they provided some kind of a survival advantage.

Many of the regions contain immune system genes, Dr. Akey noted.

“As modern humans are spreading out across the world, they’re encountering pathogens they haven’t experienced before,” he said. Neanderthals and Denisovans may have had genes that were adapted to fight those enemies.

“Maybe they really helped us survive and thrive in these new environments,” he said.

Dr. Akey and his colleagues found that Neanderthal and Denisovan DNA was glaringly absent from four regions of the modern human genome.

That absence may signal that these stretches of the genome are instrumental in making modern humans unique. Intriguingly, one of those regions includes a gene called FOXP2, which is involved in speech.

Scientists suspect that Neanderthals and Denisovans were not the only extinct human lineages our ancestors interbred with.

PingHsun Hsieh, a biologist at the University of Arizona, and his colleagues reported last month that the genomes of African pygmies contained pieces of DNA that came from an unknown source within the last 30,000 years.

Dr. Akey and his colleagues are now following up with an analysis of African populations. “This potentially allows us to find new twigs on the human family tree,” he said.

Fossil analysis pushes back human split from other primates by 2 million years (Los Alamos National Laboratory)

16-FEB-2016

Nature paper places human evolution in Africa, not Eurasia

DOE/Los Alamos National Laboratory

Image: Team analysis of these 8-million-year-old Chororapithecus teeth fossils provided insights into the human-gorilla evolutionary split. Credit: Gen Suwa

LOS ALAMOS, N.M., February 16, 2016 – A paper in the latest issue of the journal Nature suggests a common ancestor of apes and humans, Chororapithecus abyssinicus, evolved in Africa, not Eurasia, two million years earlier than previously thought.

“Our new research supports early divergence: 10 million years ago for the human-gorilla split and 8 million years ago for our split from chimpanzees,” said Los Alamos National Laboratory geologist and senior team member Giday WoldeGabriel. “That’s at least 2 million years earlier than previous estimates, which were based on genetic science that lacked fossil evidence.”

“Our analysis of C. abyssinicus fossils reveals the ape to be only 8 million years old, younger than previously thought. This is the time period when human and African ape lines were thought to have split, but no fossils from this period had been found until now,” WoldeGabriel said.

Chimpanzees, gorillas, orangutans and humans compose the biological family Hominidae. Our knowledge of hominid evolution–that is, when and how humans evolved away from the great ape family tree–has significantly increased in recent years, aided by unearthed fossils from Ethiopia, including the C. abyssinicus, a species of great ape.

The renowned international team that discovered the extinct gorilla-like species C. abyssinicus (reported in the journal Nature in 2007) now describes new field observations and geological techniques that, the authors say, revise the age constraint on the human split from our ape brethren.

The authors’ new paper, “New geological and palaeontological age constraint for the gorilla-human lineage split,” was published this week in Nature. WoldeGabriel coauthored the paper and his role was to characterize the volcanic ash and provide chemistry for local and regional correlation of the ashes sandwiching the fossils from Ethiopia’s Chorora area, a region where copious volcanic eruptions and earthquakes entombed fossils recently uplifted via ground motion and erosion.

Filling Gaps in the Fossil Record

Most of the senior members of the Chorora research team also belong to the Middle Awash project team that has recovered the fossil remains of at least eight hominid species, including some of the earliest hominids, spanning nearly 6 million years.

In the 1990s, before this team excavated the gorilla-like C. abyssinicus, they discovered the nearly intact skeleton of the 4.4-million-year-old species Ardipithecus ramidus (nicknamed “Ardi”) and its relative, the roughly million-year-older species Ardipithecus kadabba. These Ardipithecus fossils were the earliest ancestors of humans after they diverged from the main ape lineage of the primate family tree, neither ape-like nor chimp-like, yet not human either. Notably, both were bipedal: they walked upright.

While the team was still investigating Ardi and Kadabba, they published their results about C. abyssinicus. From the collection of nine fossilized teeth from multiple C. abyssinicus individuals, the team surmised that these teeth were gorilla-like, adapted for a fibrous diet. Based on their research from the Chorora, Kadabba and Ardi finds, the team says the common ancestor of chimps and humans lived earlier than had been evidenced by genetic and molecular studies, which placed the split about 5 million years ago.

According to the paper, C. abyssinicus revealed answers about gorilla lineage but also provided fossil evidence that our common ancestor migrated from Africa, not Eurasia, where fossils were more prolific prior to this discovery of multiple skeletons. While some skeptics say that more fossil evidence is needed before they accept this team’s conclusions, many agree that the discovery of a fossil ape from this time period is important since only one other had been found.

Extensive Analysis Provides New Evidence

WoldeGabriel and the research team used a variety of methods to determine the age of the teeth they found in the Chorora Formation. They estimated the age of the volcanic rocks and sediments that encased the fossils with argon dating and paleomagnetic methods. The team investigated patterns of magnetic reversals – another way to estimate age, based on knowledge of each era’s magnetic orientation – and calibrated the sediments containing the fossils against the Geomagnetic Polarity Time Scale (GPTS).
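Argon dating rests on a single decay relationship: radiogenic argon-40 accumulates from potassium-40 at known rates, so the measured 40Ar*/40K ratio fixes the age. The sketch below shows the conventional K-Ar form of the age equation; the team actually used the related 40Ar/39Ar method, which is calibrated against a standard but rests on the same decay law, and the ratio below is invented to illustrate an 8-million-year result.

```python
import math

# K-Ar age equation with the conventional decay constants for 40K.
LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant, per year
LAMBDA_EC = 0.581e-10      # decay constant of the 40K -> 40Ar branch

def k_ar_age(ar40_over_k40):
    """Age in years from the radiogenic 40Ar*/40K ratio."""
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_over_k40
    )

# An illustrative ratio of ~4.66e-4 corresponds to roughly 8 million years:
print(f"{k_ar_age(4.66e-4) / 1e6:.1f} Myr")
```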

Through fieldwork, volcanic ash chemistry and geochronology, WoldeGabriel helped nail down the age of the fossils to approximately 8 million years old. Based on this new fossil evidence and analysis, the team suggests that the human branch of the tree (shared with chimpanzees) split away from gorillas about 10 million years ago–at least 2 million years earlier than previously claimed.

New appreciation for human microbiome leads to greater understanding of human health (Science Daily)

Date: February 14, 2016

Source: University of Oklahoma

Summary: Anthropologists are studying the ancient and modern human microbiome and the role it plays in human health and disease. By applying genomic and proteomic sequencing technologies to ancient human microbiomes, such as coprolites and dental calculus, as well as to contemporary microbiomes in traditional and industrialized societies, researchers are advancing the understanding of the evolutionary history of our microbial self and its impact on human health today.


University of Oklahoma anthropologists are studying the ancient and modern human microbiome and the role it plays in human health and disease. By applying genomic and proteomic sequencing technologies to ancient human microbiomes, such as coprolites and dental calculus, as well as to contemporary microbiomes in traditional and industrialized societies, OU researchers are advancing the understanding of the evolutionary history of our microbial self and its impact on human health today.

Christina Warinner, professor in the Department of Anthropology, OU College of Arts and Sciences, will present, “The Evolution and Ecology of Our Microbial Self,” during the American Association for the Advancement of Science panel on Evolutionary Biology Impacts on Medicine and Public Health, at 1:30 pm, Sunday, Feb. 14, 2016 in the Marriott Marshall Ballroom West, Washington, DC. Warinner will discuss how major events, such as the invention of agriculture and the advent of industrialization, have affected the human microbiome.

“We don’t have a complete picture of the microbiome,” Warinner said. “OU research indicates human behavior over the past 2000 years has impacted the gut microbiome. Microbial communities have become disturbed, but before we can improve our health, we have to understand our ancestral microbiome. We cannot make targeted or informed interventions until we know that. Ancient samples allow us to directly measure changes in the human microbiome at specific times and places in the past.”

Warinner and her colleague Cecil M. Lewis, Jr., co-direct OU’s Laboratories of Molecular Anthropology and Microbiome Research, where research focuses on reconstructing the ancestral human oral and gut microbiome, addressing questions of how the relationship between humans and microbes has changed through time and how our microbiomes influence health and disease in diverse populations, both today and in the past. Warinner and Lewis are leaders in the field of paleogenomics, and the OU laboratories house the largest ancient DNA laboratory in the United States.

Warinner is pioneering the study of ancient human microbiomes, and in 2014 she published the first detailed metagenomic and metaproteomic characterization of the ancient oral microbiome in the journal Nature Genetics. In 2015, she published a study on the identification of milk proteins in ancient dental calculus and the reconstruction of prehistoric European dairying practices. In the same year, she was part of an international team that published the first South American hunter-gatherer gut microbiome and identified Treponema as a key ancestral microbe missing in industrialized societies.

The Big Search to Find Out Where Dogs Come From (New York Times)

An ancient canine skull at the Royal Belgian Institute of Natural Sciences. Scientists are still debating exactly when and where the ancient human-canine bond originated. ANDREW TESTA FOR THE NEW YORK TIMES

By JAMES GORMAN

OXFORD, England — Before humans milked cows, herded goats or raised hogs, before they invented agriculture, or written language, before they had permanent homes, and most certainly before they had cats, they had dogs.

Or dogs had them, depending on how you view the human-canine arrangement. But scientists are still debating exactly when and where the ancient bond originated. And a large new study being run out of the University of Oxford here, with collaborators around the world, may soon provide some answers.

Scientists have come up with a broad picture of the origins of dogs. First off, researchers agree that they evolved from ancient wolves. Scientists once thought that some visionary hunter-gatherer nabbed a wolf puppy from its den one day and started raising tamer and tamer wolves, taking the first steps on the long road to leashes and flea collars. This is oversimplified, of course, but the essence of the idea is that people actively bred wolves to become dogs just the way they now breed dogs to be tiny or large, or to herd sheep.

The prevailing scientific opinion now, however, is that this origin story does not pass muster. Wolves are hard to tame, even as puppies, and many researchers find it much more plausible that dogs, in effect, invented themselves.

Arden Hulme-Beaman cutting a piece from an ancient skull for DNA testing at the Royal Belgian Institute of Natural Sciences in Brussels. ANDREW TESTA FOR THE NEW YORK TIMES

One reason for the conflicting theories, according to Greger Larson, a biologist in the archaeology department at the University of Oxford, is that dog genetics are a mess. In an interview at his office here in November, he noted that most dog breeds were invented in the 19th century during a period of dog obsession that he called “the giant whirlwind blender of the European crazy Victorian dog-breeding frenzy.”

That blender, as well as random breeding by dogs themselves, and interbreeding with wolves at different times over at least the last 15,000 years, created a “tomato soup” of dog genetics, for which the ingredients are very hard to identify, Dr. Larson said.

The way to find the recipe, Dr. Larson is convinced, is to create a large database of ancient DNA to add to the soup of modern canine genetics. And with a colleague, Keith Dobney at the University of Aberdeen, he has persuaded the Who’s Who of dog researchers to join a broad project, with about $2.5 million in funding from the Natural Environment Research Council in England and the European Research Council, to analyze ancient bones and their DNA.

Robert Wayne, an evolutionary biologist at U.C.L.A. who studies the origin of dogs and is part of the research, said, “There’s hardly a person working in canine genetics that’s not working on that project.”

A wolf on display at the Oxford Museum of Natural History. ANDREW TESTA FOR THE NEW YORK TIMES

That is something of a triumph, given the many competing theories in this field. “Almost every group has a different origination hypothesis,” he said.

But Dr. Larson has sold them all on the simple notion that the more data they have, the more cooperative the effort is, the better the answers are going to be. His personality has been crucial to promoting the team effort, said Dr. Wayne, who described Dr. Larson as “very outgoing, gregarious.” Also, Dr. Wayne added, “He has managed not to alienate anyone.”

Scientists at museums and universities who are part of the project are opening up their collections. So to gather data, Dr. Larson and his team at Oxford have traveled the world, collecting tiny samples of bone and measurements of teeth, jaws and occasionally nearly complete skulls from old and recent dogs, wolves and canids that could fall into either category. The collection phase is almost done, said Dr. Larson, who expects to end up with DNA from about 1,500 samples, and photographs and detailed measurements of several thousand.

Scientific papers will start to emerge this year from the work, some originating in Oxford, and some from other institutions, all the work of many collaborators.

Dr. Larson is gambling that the project will be able to determine whether the domestication process occurred closer to 15,000 or 30,000 years ago, and in what region it took place. That’s not quite the date, GPS location and name of the ancient hunter that some dog lovers might hope for.

But it would be a major achievement in the world of canine science, and a landmark in the analysis of ancient DNA to show evolution, migrations and descent, much as studies of ancient hominid DNA have shown how ancient humans populated the globe and interbred with Neanderthals.

And why care about the domestication of dogs, beyond the obsessive interest so many people have in their pets? The emergence of dogs may have been a watershed.

“Maybe dog domestication on some level kicks off this whole change in the way that humans are involved and responding to and interacting with their environment,” Dr. Larson said. “I don’t think that’s outlandish.”

Shepherding the Research

Dr. Larson is no stranger to widely varying points of view. He is an American, but recently became a British citizen as well. His parents are American and he visited the United States often as a child, but he was born in Bahrain and grew up in Turkey and Japan, places where his parents were teaching in schools on American military bases.

He graduated from Claremont McKenna College in California and received his Ph.D. at Oxford. In between college and graduate studies, he spent a year searching for the bed of an ancient river in Turkmenistan, and another couple of years setting up an environmental consulting office in Azerbaijan. He had an interest in science as an undergraduate, and some background from a college major in environment, economics and politics, but no set career plans. Instead, his career grew out of intense curiosity, a knack for making friends and a willingness to jump at an opportunity, like the time he managed to tag along on an archaeological dig.

He was staying in Ashgabat, Turkmenistan, and a local man who had helped him rent an old Soviet truck to explore the desert told him some Westerners were arriving to go on a dig, so he wangled his way onto one of the trucks.

“I think everybody there thought I was with somebody else,” Dr. Larson said.

By the time the group stopped to rest and someone asked him who he was, it was too late to question whether he really belonged. “I was a complete stowaway,” he said.

But he could move dirt and speak Russian, and he had some recently acquired expertise — in college drinking games — that he said was in great demand at night. By luck, he said, the researchers on the dig turned out to be “the great and the good of British neolithic archaeology.” One of them was Chris Gosden, the chairman of European Archaeology at Oxford, who later invited him to do a one-year master’s degree in archaeology at Oxford. That eventually led to a Ph.D. program after he spent some time in graduate school in the United States.

The current project began when he became fed up with the lack of ancient DNA evidence in papers about the origin of dogs. He called Dr. Dobney in 2011 and said, “We’re doing dogs.”

After receiving the grant from the council in England, he and Dr. Dobney organized a conference in Aberdeen, Scotland, to gather as many people involved in researching dog origins as they could. His pitch to the group was that despite their different points of view, everyone was interested in the best possible evidence, no matter where it led.

“If we have to eat crow, we eat crow,” he said. “It’s science.”

A 32,000-Year-Old Skull

Mietje Germonpré, a paleontologist at the Royal Belgian Institute of Natural Sciences, is one of the many scientists participating in the dog project. She was one of a number of authors on a 2013 paper in Science that identified a roughly 32,000-year-old skull from the Goyet cave in Belgium as an early dog. Dr. Wayne at U.C.L.A. was the senior author on the paper, and Olaf Thalmann of the University of Turku in Finland was the first author.

It is typical of Dr. Larson’s dog project that although he disagreed with the findings of the paper, arguing that the evidence just wasn’t there to call the Goyet skull a dog, all of the authors of the paper are working on the larger project with him.

In November in Brussels, holding the priceless fossil, Dr. Germonpré pointed out its wide cranium, crowded teeth and short snout — all indicators to her that it was not a wolf.

“To me, it’s a dog,” she said. Studies of mitochondrial DNA, passed down from females only, also indicated the skull was not a wolf, according to the 2013 paper.

Dr. Germonpré said she thinks dogs were domesticated some time before this animal died, and she leans toward the idea that humans intentionally bred them from wolves.

She held up another piece of evidence, a reconstruction of a 30,000-year-old canid skull found near Předmostí in the Czech Republic, with a bone in its mouth. She reported in 2014 that this was a dog, and she says the bone is part of the evidence that the animal was buried with care. “We think it was deliberately put there,” she said.

But she recognizes these claims are controversial and is willing, like the rest of the world of canine science, to risk damage to the fossils themselves to get more information on not just the mitochondrial DNA but also the nuclear DNA.

To minimize that risk, she talked with Ardern Hulme-Beaman, a postdoctoral researcher with the Oxford team, about where to cut into the skull. He was nearing the end of months of traveling to Russia, Turkey, the United States and all over Europe to take samples of canid jaws and skulls.

He and Allowen Evin, now with the National Center for Scientific Research in Montpellier, France, also took many photographs of each jaw and skull to do geometric morphometrics. Software processes detailed photographs from every angle into 3-D reconstructions that provide much more information on the shape of a bone than length and width measurements.
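The article does not name the software involved, but the standard core of landmark-based geometric morphometrics is Procrustes superimposition: location, size and orientation are stripped away so that only shape differences between two configurations of landmarks remain. A minimal sketch of that alignment step, in Python with NumPy and entirely hypothetical landmark coordinates, might look like this:

```python
import numpy as np

def procrustes_align(ref, target):
    """Superimpose one landmark configuration onto another.

    Each argument is an (n_landmarks, 3) array of 3-D coordinates.
    Returns the aligned copy of `target` plus the residual distance,
    a simple measure of pure shape difference.
    """
    # Remove location: center both configurations on their centroids.
    ref_c = ref - ref.mean(axis=0)
    tgt_c = target - target.mean(axis=0)

    # Remove scale: divide by centroid size (root sum of squared coords).
    ref_c = ref_c / np.linalg.norm(ref_c)
    tgt_c = tgt_c / np.linalg.norm(tgt_c)

    # Remove orientation: best-fit rotation from the SVD solution to
    # the orthogonal Procrustes problem.
    u, _, vt = np.linalg.svd(tgt_c.T @ ref_c)
    rotation = u @ vt
    if np.linalg.det(rotation) < 0:   # forbid reflections
        u[:, -1] *= -1
        rotation = u @ vt
    aligned = tgt_c @ rotation

    # Whatever disagreement remains is shape, not size or pose.
    return aligned, np.linalg.norm(aligned - ref_c)

# Hypothetical landmarks (e.g., points along snout and tooth row).
wolf = np.array([[0.0, 0.0, 0.0], [4.0, 0.0, 0.0],
                 [2.0, 1.5, 0.0], [2.0, 0.5, 1.0]])
dog = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0],
                [1.5, 1.6, 0.0], [1.5, 0.6, 0.9]])
_, shape_difference = procrustes_align(wolf, dog)
print(f"shape difference: {shape_difference:.3f}")
```

In practice researchers align many specimens at once (generalized Procrustes analysis) and feed the resulting shape coordinates into further statistical analysis, but the underlying idea is the same.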

Dr. Germonpré and Dr. Hulme-Beaman agreed on a spot in the interior of the skull to cut. In the laboratory, he used a small electric drill with a cutting blade to remove a chunk about the size of a piece of chopped walnut. An acrid, burning smell indicated that organic material was intact within the bone — a good sign for the potential retrieval of DNA.

Back in Oxford, researchers will attempt to use the most current techniques to get as much DNA as possible out of the sample. There is no stretch of code that says “wolf” or “dog,” any more than there is a single skull feature that defines a category. What geneticists try to establish is how different the DNA of one animal is from another. Adding ancient DNA gives many more points of reference over a long time span.
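The simplest version of that comparison is a pairwise distance: the fraction of aligned positions at which two sequences differ. Real analyses work on whole aligned genomes with model-based corrections, but a toy sketch in Python, with invented sequence fragments, conveys the idea:

```python
def p_distance(a, b):
    """Proportion of aligned sites at which two sequences differ,
    skipping positions where either sequence is a gap or ambiguous."""
    pairs = [(x, y) for x, y in zip(a, b) if x in "ACGT" and y in "ACGT"]
    return sum(x != y for x, y in pairs) / len(pairs) if pairs else float("nan")

# Invented fragments for illustration only.
references = {
    "modern_wolf": "ACGTACGTTGCAACGT",
    "modern_dog":  "ACGTACCTTGCAACTT",
}
ancient_sample = "ACGTACCTTGTAACTT"

for name, seq in references.items():
    print(name, round(p_distance(ancient_sample, seq), 3))
```

Each ancient specimen added to the database is one more reference point against which such distances can be measured, and, crucially, one with a known age.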

Dr. Larson hopes that he and his collaborators will be able to identify a section of DNA in some ancient wolves that was passed on to more doglike descendants and eventually to modern dogs. And he hopes they will be able to identify changes in the skulls or jaws of those wolves that show shifts to more doglike shapes, helping to narrow the origins of domestication.

The usual assumption about domestic animals is that the process of taming and breeding them happened once. But that’s not necessarily so. Dr. Larson and Dr. Dobney showed that pigs were domesticated twice, once in Anatolia and once in China. The same could be true of dogs.

Only the Beginning

Although the gathering of old bones is almost done, Dr. Larson is still negotiating with Chinese researchers for samples from that part of the world, which he says are necessary. But he hopes they will come.

If all goes well, said Dr. Larson, the project will publish a flagship paper from all of the participants describing their general findings. And over the next couple of years, researchers, all using the common data, will continue to publish separate findings.

Other large collaborative efforts are brewing, as well. Dr. Wayne, at U.C.L.A., said that a group in China was forming with the goal of sequencing 10,000 dog genomes. He and Dr. Larson are part of that group.

Last fall, Dr. Larson was becoming more excited with each new bit of data, but not yet ready to tip his hand about what conclusions the data may warrant, or how significant they will be.

But he is growing increasingly confident that they will find what they want, and come close to settling the thorny question of when and where the tearing power of a wolf jaw first gave way to the persuasive force of a nudge from a dog’s cold nose.

“I’m starting to drink my own Kool-Aid,” he said.

Ancient viral molecules essential for human development (Science Daily)

Date: November 23, 2015

Source: Stanford University Medical Center

Summary: Genetic material from ancient viral infections is critical to human development, according to researchers.


Rendering of a virus among blood cells. Credit: © ysfylmz / Fotolia

Genetic material from ancient viral infections is critical to human development, according to researchers at the Stanford University School of Medicine.

They’ve identified several noncoding RNA molecules of viral origin that are necessary for a fertilized human egg to acquire, early in development, the ability to become all the cells and tissues of the body. Blocking the production of these RNA molecules stops development in its tracks, they found.

The discovery comes on the heels of a Stanford study earlier this year showing that early human embryos are packed full of what appear to be viral particles arising from similar left-behind genetic material.

“We’re starting to accumulate evidence that these viral sequences, which originally may have threatened the survival of our species, were co-opted by our genomes for their own benefit,” said Vittorio Sebastiano, PhD, an assistant professor of obstetrics and gynecology. “In this manner, they may even have contributed species-specific characteristics and fundamental cell processes, even in humans.”

Sebastiano is a co-lead and co-senior author of the study, which will be published online Nov. 23 in Nature Genetics. Postdoctoral scholar Jens Durruthy-Durruthy, PhD, is the other lead author. The other senior author of the paper is Renee Reijo Pera, PhD, a former professor of obstetrics and gynecology at Stanford who is now on the faculty of Montana State University.

Sebastiano and his colleagues were interested in learning how cells become pluripotent, or able to become any tissue in the body. A human egg becomes pluripotent after fertilization, for example. And scientists have learned how to induce other, fully developed human cells to become pluripotent by exposing them to proteins known to be present in the very early human embryo. But the nitty-gritty molecular details of this transformative process are not well understood in either case.

An ancient infection

The researchers knew that a class of RNA molecules called long intergenic noncoding RNAs, or lincRNAs, has been implicated in many important biological processes, including the acquisition of pluripotency. These molecules are transcribed from DNA in the genome, but they don’t go on to make proteins. Instead they function as RNA molecules that affect the expression of other genes.

Sebastiano and Durruthy-Durruthy used recently developed RNA sequencing techniques to examine which lincRNAs are highly expressed in human embryonic stem cells. Previously, this type of analysis was stymied by the fact that many of the molecules contain highly similar, very repetitive regions that are difficult to sequence accurately.

They identified more than 2,000 previously unknown RNA sequences, and found that 146 are specifically expressed in embryonic stem cells. They homed in on the 23 most highly expressed sequences, which they termed HPAT1-23, for further study. Thirteen of these, they found, were made up almost entirely of genetic material left behind after an eons-ago infection by a virus called HERV-H.
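As a rough illustration of that winnowing step, the logic is: keep candidates expressed in embryonic stem cells but essentially silent elsewhere, then rank by expression and take the top of the list. The table, thresholds and names in this Python sketch are invented; the study’s actual pipeline involved RNA sequencing, normalization and statistical testing:

```python
import pandas as pd

# Hypothetical expression table: rows are candidate lincRNAs, columns
# hold mean normalized read counts per cell type.
expr = pd.DataFrame(
    {"embryonic_stem": [950, 12, 800, 3],
     "fibroblast":     [  5, 700,   8, 2],
     "neuron":         [  2, 650,   6, 1]},
    index=["cand_01", "cand_02", "cand_03", "cand_04"],
)

# Keep candidates strongly expressed in embryonic stem cells and
# essentially absent in the other cell types (thresholds illustrative).
other = expr.drop(columns="embryonic_stem")
specific = expr[(expr["embryonic_stem"] > 100) & (other.max(axis=1) < 20)]

# Rank the stem-cell-specific candidates by expression and keep the top
# of the list for follow-up, mirroring the step that yielded HPAT1-23.
print(specific.sort_values("embryonic_stem", ascending=False))
```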

HERV-H is what’s known as a retrovirus. These viruses spread by inserting their genetic material into the genome of an infected cell. In this way, the virus can use the cell’s protein-making machinery to generate viral proteins for assembly into a new viral particle. That particle then goes on to infect other cells. If the infected cell is a sperm or an egg, the retroviral sequence can also be passed to future generations.

HIV is one common retrovirus that currently causes disease in humans. But our genomes are also littered with sequences left behind from long-ago retroviral infections. Unlike HIV, which can go on to infect new cells, these retroviral sequences are thought to be relatively inert; millions of years of evolution and accumulated mutations mean that few maintain the capacity to give instructions for functional proteins.

After identifying HPAT1-23 in embryonic stem cells, Sebastiano and his colleagues studied their expression in human blastocysts — the hollow clumps of cells that arise from the fertilized egg in the first days of development. They found that HPAT2, HPAT3 and HPAT5 were expressed only in the inner cell mass of the blastocyst, which becomes the developing fetus. Blocking their expression in one cell of a two-cell embryo stopped the affected cell from contributing to the embryo’s inner cell mass. Further studies showed that the expression of the three genes is also required for efficient reprogramming of adult cells into induced pluripotent stem cells.

Sequences found only in primates

“This is the first time that these virally derived RNA molecules have been shown to be directly involved with and necessary for vital steps of human development,” Sebastiano said. “What’s really interesting is that these sequences are found only in primates, raising the possibility that their function may have contributed to unique characteristics that distinguish humans from other animals.”

The researchers are continuing their studies of all the HPAT molecules. They’ve learned that HPAT5 specifically affects pluripotency by interacting with and sequestering members of let-7, another family of RNAs involved in pluripotency.

“Previously retroviral elements were considered to be a class that all functioned in basically the same way,” said Durruthy-Durruthy. “Now we’re learning that they function as individual elements with very specific and important roles in our cells. It’s fascinating to imagine how, during the course of evolution, primates began to recycle these viral leftovers into something that’s beneficial and necessary to our development.”


Journal Reference:

  1. Jens Durruthy-Durruthy, Vittorio Sebastiano, Mark Wossidlo, Diana Cepeda, Jun Cui, Edward J Grow, Jonathan Davila, Moritz Mall, Wing H Wong, Joanna Wysocka, Kin Fai Au, Renee A Reijo Pera. The primate-specific noncoding RNA HPAT5 regulates pluripotency during human preimplantation development and nuclear reprogramming. Nature Genetics, 2015; DOI: 10.1038/ng.3449