Tag Archive: Evolution

Becoming a centaur (Aeon)

Rounding up wild horses on the edge of the Gobi desert in Mongolia, 1964. Photo by Philip Jones Griffiths/Magnum
The horse is a prey animal, the human a predator. Our shared trust and athleticism are a neurobiological miracle

Janet Jones – 14 January 2022

Horse-and-human teams perform complex manoeuvres in competitions of all sorts. Together, we can gallop up to obstacles standing 8 feet (2.4 metres) high, leave the ground, and fly blind – neither party able to see over the top until after the leap has been initiated. Adopting a flatter trajectory with greater speed, horse and human sail over broad jumps up to 27 feet (more than 8 metres) long. We run as one at speeds of 44 miles per hour (nearly 70 km/h), the fastest velocity any land mammal carrying a rider can achieve. In freestyle dressage events, we dance in place to the rhythm of music, trot sideways across the centre of an arena with huge leg-crossing steps, and canter in pirouettes with the horse’s front feet circling her hindquarters. Galloping again, the best horse-and-human teams can slide 65 feet (nearly 20 metres) to a halt while resting all their combined weight on the horse’s hind legs. Endurance races over extremely rugged terrain test horses and riders in journeys that traverse up to 500 miles (805 km) of high-risk adventure.

Charlotte Dujardin on Valegro, a world-record dressage freestyle at London Olympia, 2014: an example of high-precision brain-to-brain communication between horse and rider. Every step the horse takes is determined in conjunction with many invisible cues from his human rider, using a feedback loop between predator brain and prey brain. Note the horse’s beautiful physical condition and complete willingness to perform these extremely difficult manoeuvres.

No one disputes the athleticism fuelling these triumphs, but few people comprehend the mutual cross-species interaction that is required to accomplish them. The average horse weighs 1,200 pounds (more than 540 kg), makes instantaneous movements, and can become hysterical in a heartbeat. Even the strongest human is unable to force a horse to do anything she doesn’t want to do. Nor do good riders allow the use of force in training our magnificent animals. Instead, we hold ourselves to the higher standard of motivating horses to cooperate freely with us in achieving the goals of elite sports as well as mundane chores. Under these conditions, the horse trained with kindness, expertise and encouragement is a willing, equal participant in the action.

That action is rooted in embodied perception and the brain. In mounted teams, horses, with prey brains, and humans, with predator brains, share largely invisible signals via mutual body language. These signals are received and transmitted through peripheral nerves leading to each party’s spinal cord. Upon arrival in each brain, they are interpreted, and a learned response is generated. It, too, is transmitted through the spinal cord and nerves. This collaborative neural action forms a feedback loop, allowing communication from brain to brain in real time. Such conversations allow horse and human to achieve their immediate goals in athletic performance and everyday life. In a very real sense, each species’ mind is extended beyond its own skin into the mind of another, with physical interaction becoming a kind of neural dance.

Horses in nature display certain behaviours that tempt observers to wonder whether competitive manoeuvres truly require mutual communication with human riders. For example, the feral horse occasionally hops over a stream to reach good food or scrambles up a slope of granite to escape predators. These manoeuvres might be thought the precursors to jumping or rugged trail riding. If so, we might imagine that the performance horse’s extreme athletic feats are innate, with the rider merely a passenger steering from above. If that were the case, little requirement would exist for real-time communication between horse and human brains.

In fact, though, the feral hop is nothing like the trained leap over a competition jump, usually commenced from short distances at high speed. Today’s Grand Prix jump course comprises about 15 obstacles set at sharp angles to each other, each more than 5 feet high and more than 6 feet wide (1.5 x 1.8 metres). The horse-and-human team must complete this course in 80 or 90 seconds, a time allowance that makes for acute turns, diagonal flight paths and high-speed exits. Comparing the wilderness hop with the show jump is like associating a flintstone with a nuclear bomb. Horses and riders undergo many years of daily training to achieve this level of performance, and their brains share neural impulses throughout each experience.

These examples originate in elite levels of horse sport, but the same sort of interaction occurs in pastures, arenas and on simple trails all over the world. Any horse-and-human team can develop deep bonds of mutual trust, and learn to communicate using body language, knowledge and empathy.

Like it or not, we are the horse’s evolutionary enemy, yet they behave toward us as if inclined to become a friend

The critical component of the horse in nature, and of her ability to learn to interact so precisely with a human rider, is not her physical athleticism but her brain. The first precise magnetic resonance image of a horse’s brain appeared only in 2019, allowing veterinary neurologists far greater insight into the anatomy underlying equine mental function. As this new information is disseminated to horse trainers and riders for practical application, we see the beginnings of a revolution in brain-based horsemanship. Not only will this revolution drive competition to higher summits of success, and animal welfare to more humane levels of understanding, it will also motivate scientists to research the unique compatibility between prey and predator brains. Nowhere else in nature do we see such intense and intimate collaboration between two such disparate minds.

Three natural features of the equine brain are especially important when it comes to mind-melding with humans. First, the horse’s brain provides astounding touch detection. Receptor cells in the horse’s skin and muscles transduce – or convert – external pressure, temperature and body position to neural impulses that the horse’s brain can understand. They accomplish this with exquisite sensitivity: the average horse can detect less pressure against her skin than even a human fingertip can.

Second, horses in nature use body language as a primary medium of daily communication with each other. An alpha mare has only to flick an ear toward a subordinate to get him to move away from her food. A younger subordinate, untutored in the ear flick, receives stronger body language – two flattened ears and a bite that draws blood. The notion of animals in nature as kind, gentle creatures who never hurt each other is a myth.

Third, by nature, the equine brain is a learning machine. Untrammelled by the social and cognitive baggage that human brains carry, horses learn in a rapid, pure form that allows them to be taught the meanings of various human cues that shape equine behaviour in the moment. Taken together, the horse’s exceptional touch sensitivity, natural reliance on body language, and purity of learning form the tripod of support for brain-to-brain communication that is so critical in extreme performance.

One of the reasons for budding scientific fascination with neural horse-and-human communication is the horse’s status as a prey animal. Their brains and bodies evolved to survive pressures completely different from those that shaped our human physiologies. For example, horse eyes are set on either side of their head for a panoramic view of the world, and their horizontal pupils allow clear sight along the horizon but fuzzy vision above and below. Their eyes rotate to maintain clarity along the horizon when their heads lie sideways to reach grass in odd locations. Equine brains are also hardwired to stream commands directly from the perception of environmental danger to the motor cortex where instant evasion is carried out. All of these features evolved to allow the horse to survive predators.

Conversely, human brains evolved in part for the purpose of predation – hunting, chasing, planning… yes, even killing – with front-facing eyes, superb depth perception, and a prefrontal cortex for strategy and reason. Like it or not, we are the horse’s evolutionary enemy, yet they behave toward us as if inclined to become a friend.

The fact that horses and humans can communicate neurally without the external mediation of language or equipment is critical to our ability to initiate the cellular dance between brains. Saddles and bridles are used for comfort and safety, but bareback and bridleless competitions prove they aren’t necessary for highly trained brain-to-brain communication. Scientific efforts to communicate with predators such as dogs and apes have often been hobbled by the use of artificial media including human speech, sign language or symbolic lexigrams. By contrast, horses allow us to apply a medium of communication that is completely natural to their lives in the wild and in captivity.

The horse’s prey brain is designed to notice and evade predators. How ironic, and how riveting, then, that this prey brain is the only one today that shares neural communication with a predator brain. It offers humanity a rare view into a prey animal’s world, almost as if we were wolves riding elk or coyotes mind-melding with cottontail bunnies.

Highly trained horses and riders send and receive neural signals using subtle body language. For example, a rider can apply invisible pressure with her left inner calf muscle to move the horse laterally to the right. That pressure is felt on the horse’s side, in his skin and muscle, via proprioceptive receptor cells that detect body position and movement. Then the signal is transduced from mechanical pressure to electrochemical impulse, and conducted up peripheral nerves to the horse’s spinal cord. Finally, it reaches the somatosensory cortex, the region of the brain responsible for interpreting sensory information.

Riders can sometimes guess that an invisible object exists by detecting subtle equine reactions

This interpretation is dependent on the horse’s knowledge that a particular body signal – for example, inward pressure from a rider’s left calf – is associated with a specific equine behaviour. Horse trainers spend years teaching their mounts these associations. In the present example, the horse has learned that this particular amount of pressure, at this speed and location, under these circumstances, means ‘move sideways to the right’. If the horse is properly trained, his motor cortex causes exactly that movement to occur.

By means of our human motion and position sensors, the rider’s brain now senses that the horse has changed his path rightward. Depending on the manoeuvre our rider plans to complete, she will then execute invisible cues to extend or collect the horse’s stride as he approaches a jump that is now centred in his vision, plant his right hind leg and spin in a tight fast circle, push hard off his hindquarters to chase a cow, or any number of other movements. These cues are combined to form that mutual neural dance, occurring in real time, and dependent on natural body language alone.

The example of a horse moving a few steps rightward off the rider’s left leg is extremely simplistic. When you imagine a horse and rider clearing a puissance wall of nearly 8 feet (2.4 metres), think of the countless receptor cells transmitting bodily cues between both brains during approach, flight and exit. That is mutual brain-to-brain communication. Horse and human converse via body language to such an extreme degree that they are able to accomplish amazing acts of understanding and athleticism. Each of their minds has extended into the other’s, sending and receiving signals as if one united brain were controlling both bodies.

Franke Sloothaak on Optiebeurs Golo, a world-record puissance jump at Chaudfontaine in Belgium, 1991. This horse-and-human team displays the gentle encouragement that brain-to-brain communication requires. The horse is in perfect condition and health. The rider offers soft, light hands, and rides in perfect balance with the horse. He carries no whip, never uses his spurs, and employs the gentlest type of bit – whose full acceptance is evidenced by the horse’s foamy mouth and flexible neck. The horse is calm but attentive before and after the leap, showing complete willingness to approach the wall without a whiff of coercion. The first thing the rider does upon landing is pat his equine teammate. He strokes or pats the horse another eight times in the next 30 seconds, a splendid example of true horsemanship.

Analysis of brain-to-brain communication between horses and humans elicits several new ideas worthy of scientific notice. Because our minds interact so well using neural networks, horses and humans might learn to borrow neural signals from the party whose brain offers the highest function. For example, horses have a 340-degree range of view when holding their heads still, compared with a paltry 90-degree range in humans. Therefore, horses can see many objects that are invisible to their riders. Yet riders can sometimes guess that an invisible object exists by detecting subtle equine reactions.

Specifically, neural signals from the horse’s eyes carry the shape of an object to his brain. Those signals are transferred to the rider’s brain by a well-established route: equine receptor cells in the retina lead to equine detector cells in the visual cortex, which elicits an equine motor reaction that is then sensed by the rider’s human body. From there, the horse’s neural signals are transmitted up the rider’s spinal cord to the rider’s brain, and a perceptual communication loop is born. The rider’s brain can now respond neurally to something it is incapable of seeing, by borrowing the horse’s superior range of vision.

These brain-to-brain transfers are mutual, so the learning equine brain should also be able to borrow the rider’s vision, with its superior depth perception and focal acuity. This kind of neural interaction results in a horse-and-human team that can sense far more together than either party can detect alone. In effect, they share effort by assigning labour to the party whose skills are superior at a given task.

There is another type of skillset that requires a particularly nuanced cellular dance: sharing attention and focus. Equine vigilance allowed horses to survive 56 million years of evolution – they had to notice slight movements in tall grasses or risk becoming some predator’s dinner. Consequently, today it’s difficult to slip even a tiny change past a horse, especially a young or inexperienced animal who has not yet been taught to ignore certain sights, sounds and smells.

By contrast, humans are much better at concentration than vigilance. The predator brain does not need to notice and react instantly to every stimulus in the environment. In fact, it would be hampered by prey vigilance. While reading this essay, your brain sorts away the sound of traffic past your window, the touch of clothing against your skin, the sight of the masthead that says ‘Aeon’ at the top of this page. Ignoring these distractions allows you to focus on the content of this essay.

Horses and humans frequently share their respective attentional capacities during a performance. A puissance horse galloping toward an enormous wall cannot waste vigilance by noticing the faces of each person in the audience. Likewise, the rider cannot afford to miss a loose dog that runs into the arena outside her narrow range of vision and focus. Each party helps the other through their primary strengths.

Such sharing becomes automatic with practice. With innumerable neural contacts over time, the human brain learns to heed signals sent by the equine brain that say, in effect: ‘Hey, what’s that over there?’ Likewise, the equine brain learns to sense human neural signals that counter: ‘Let’s focus on this gigantic wall right here.’ Each party sends these messages by body language and receives them by body awareness through two spinal cords, then interprets them inside two brains, millisecond by millisecond.

The rider’s physical cues are transmitted by neural activation from the horse’s surface receptors to the horse’s brain

Finally, it is conceivable that horse and rider can learn to share features of executive function – the human brain’s ability to set goals, plan steps to achieve them, assess alternatives, make decisions and evaluate outcomes. Executive function occurs in the prefrontal cortex, an area that does not exist in the equine brain. Horses are excellent at learning, remembering and communicating – but they do not assess, decide, evaluate or judge as humans do.

Shying is a prominent equine behaviour that might be mediated by human executive function in well-trained mounts. When a horse of average size shies away from an unexpected stimulus, riders are sitting on top of 1,200 pounds of muscle that suddenly leaps sideways off all four feet and lands five yards away. It’s a frightening experience, and often results in falls that lead to injury or even death. The horse’s brain causes this reaction automatically by direct connection between his sensory and motor cortices.

Though this possibility must still be studied by rigorous science, brain-to-brain communication suggests that horses might learn to borrow small glimmers of executive function through neural interaction with the human’s prefrontal cortex. Suppose that a horse shies from an umbrella that suddenly opens. By breathing steadily, relaxing her muscles, and flexing her body in rhythm with the horse’s gait, the rider calms the animal using body language. Her physical cues are transmitted by neural activation from his surface receptors to his brain. He responds with body language in which his muscles relax, his head lowers, and his frightened eyes return to their normal size. The rider feels these changes with her body, which transmits the horse’s neural signals to the rider’s brain.

From this point, it’s only a very short step – but an important one – to the transmission and reception of neural signals between the rider’s prefrontal cortex (which evaluates the unexpected umbrella) and the horse’s brain (which instigates the leap away from that umbrella). In practice, to reduce shying, horse trainers teach their young charges to slow their reactions and seek human guidance.

Brain-to-brain communication between horses and riders is an intricate neural dance. These two species, one prey and one predator, are living temporarily in each other’s brains, sharing neural information back and forth in real time without linguistic or mechanical mediation. It is a partnership like no other. Together, a horse-and-human team experiences a richer perceptual and attentional understanding of the world than either member can achieve alone. And, ironically, this extended interspecies mind operates well not because the two brains are similar to each other, but because they are so different.

Janet Jones applies brain research to training horses and riders. She has a PhD from the University of California, Los Angeles, and for 23 years taught the neuroscience of perception, language, memory, and thought. She trained horses at a large stable early in her career, and later ran a successful horse-training business of her own. Her most recent book, Horse Brain, Human Brain (2020), is currently being translated into seven languages.

Edited by Pam Weintraub

The Six Legacies of Edward O. Wilson (This View of Life)

By David Sloan Wilson – Published On: January 5, 2022

Note: An abbreviated version of this article is published in Nautilus Magazine.

Edward O. Wilson, who passed away at the age of 92 on December 26, 2021, is widely recognized as a giant of the Arts and Sciences. I include the Arts because Wilson regarded the creative dimension of science as an artistic endeavor, worked toward unifying the Arts and Sciences, and wrote beautifully for the general public, earning two Pulitzer Prizes for nonfiction in addition to writing a novel.

Wilson’s stature is so great, and reflections on his legacy upon his death are so numerous, that another reflection might seem unnecessary. The purpose of my reflection, however, is to make a novel point: Wilson left at least six legacies, which need to be combined to fully realize his vision. Combining them requires first identifying them separately and then integrating them with each other.

The six legacies are:

1) His contributions to evolutionary biology.

2) His contributions to the conservation of biodiversity.

3) His contributions to a sociobiology that includes humans.

4) His contributions to the unification of knowledge.

5) His encouraging stance toward young scientists and other learners.

6) The new frontier of ecosystems that he was working on at the time of his death.

My relationship with Edward O. Wilson

Before turning to these legacies and their integration, I will briefly recount my own relationship with Ed. I am 20 years his junior, so he was already famous as a Harvard professor when I entered graduate school at Michigan State University in 1971. I first met him during the summer of that year. I was a student in an ecology course at the Marine Biological Laboratory in Woods Hole, Massachusetts. He was sitting in on the student project reports. After I reported my experiments on food size selection in zooplankton, Ed remarked “That’s new, isn’t it?” I was so proud to have impressed the great E.O. Wilson and contributed to the vast storehouse of scientific knowledge that I have remembered his comment ever since!

My graduate education was shaped in part by Ed’s influence on evolutionary biology, as I will elaborate below. My next personal interaction came near the end of my graduate career. I had constructed a mathematical model that provided support for the theory of group selection, which had been almost universally rejected by evolutionary biologists, as I will also elaborate below. Convinced of its importance, I wrote Ed asking if he would consider sponsoring it for the Proceedings of the National Academy of Sciences. Ed invited me to visit him at Harvard’s Museum of Comparative Zoology. As with my first encounter, I have a vivid memory of the visit, which began with a tour of his ant laboratory. Then he stood me in front of a blackboard, sat down in a chair, and said “you have 30 minutes until my next appointment.”

I talked like an auctioneer, filling the board with my equations. Ed was sufficiently intrigued to sponsor my article for PNAS after sending it out for review by two experts in theoretical biology. The article became my Ph.D. thesis, which is probably the shortest in the history of evolutionary science (four pages).

In the years that followed, I became one of the main advocates of group selection without directly crossing paths with Ed. I also took part in most of the other initiatives associated with Ed’s legacies without directly interacting with him. We were both involved in the formation of the Human Behavior and Evolution Society (HBES) and I hosted its third annual conference in 1993. On the theme of consilience, I started the first campus-wide program for teaching evolution across the curriculum and wrote one of the first book-length accounts of religion from an evolutionary perspective. It might seem strange that Ed and I shared so many interests without directly interacting, but just about everything associated with Ed’s legacies is in fact part of broad developments in the history of science involving many protagonists, a point to which I will return.

My next and by far most substantive interaction with Ed began at the 2006 annual conference of HBES. Ed was a plenary speaker and I was in the audience. Even though HBES members were in the avant-garde of studying human behavior from an evolutionary perspective, most of them were doctrinaire in their rejection of group selection. On his own, Ed had embraced group selection, converging on my own advocacy, and chose to break the news to the unsuspecting audience in his plenary. You could have heard a pin drop. Afterward, we found a corner of the lobby to talk alone.

“Did you like the grenade that I tossed in their midst?” Ed asked with a conspiratorial smile. On the spot, I suggested that we write a major article together, which became “Rethinking the Theoretical Foundation of Sociobiology”, published in the Quarterly Review of Biology in 2007. To reach a larger audience, we also wrote “Evolution for the Good of the Group”, which was published in the American Scientist in 2008. These were written by trading drafts and discussing them by email and phone. I still remember his voicemails, which sometimes went on for several minutes and were spoken in flawless extemporaneous prose.

At the end of our “Rethinking” article, we summarized our argument for group selection as the theoretical foundation of sociobiology by stealing from Rabbi Hillel, who was reputedly asked to explain the meaning of the Torah while standing on one foot and replied “What is hateful to you, do not do to your neighbor. Everything else is commentary.” Our one-foot version of sociobiology was: “Selfishness beats altruism within groups. Altruistic groups beat selfish groups. Everything else is commentary.” This meme has become widely known and Ed repeated it all the way up to his final publications and interviews.
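The one-foot summary is concrete enough to watch in action in a toy simulation. The sketch below is not the model from the QRB article; the group structure, benefit B and cost C are invented for illustration (with C greater than B/SIZE, so that altruism is genuinely costly to the individual):

```python
# Toy two-level selection model of the "one-foot" summary above.
# Not the model from the QRB article; all numbers are invented.
import random

B, C = 5.0, 0.5                    # shared benefit per altruist, private cost
GROUPS, SIZE, GENS = 30, 20, 200   # C > B/SIZE: altruism costs the individual

groups = [[random.random() < 0.5 for _ in range(SIZE)]
          for _ in range(GROUPS)]

def fitness(is_altruist, k):
    # every member shares the benefit created by the group's k altruists;
    # altruists additionally pay the private cost C
    base = 1.0 + B * k / SIZE
    return base - C if is_altruist else base

for _ in range(GENS):
    # (1) within groups, selfishness beats altruism: defectors
    #     out-reproduce altruists inside each group
    groups = [random.choices(g, weights=[fitness(a, sum(g)) for a in g],
                             k=SIZE) for g in groups]
    # (2) between groups, altruistic groups beat selfish groups:
    #     groups reproduce in proportion to their mean fitness
    means = [sum(fitness(a, sum(g)) for a in g) / SIZE for g in groups]
    groups = [list(g) for g in random.choices(groups, weights=means,
                                              k=GROUPS)]

freq = sum(map(sum, groups)) / (GROUPS * SIZE)
print(f"altruist frequency after {GENS} generations: {freq:.2f}")
```

Varying B and C makes the tug of war visible: when the between-group fitness differential outweighs the private cost, the altruist frequency climbs; reverse the balance and it collapses.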

After this intense collaboration, Ed and I went our separate ways to continue pursuing our largely overlapping interests. The last time I saw him was at a conference at MIT, which was close enough to his home that he could attend without arduous travel. In the few minutes that we spoke together, he told me excitedly about ecosystems as the next big topic that he planned to synthesize. He retained his youthful spirit of exploration right up to the end.

I have one more story about Ed to tell before turning to his six legacies. In 2014, the evolutionary psychologist Barry X. Kuhle recorded a series of interviews with pioneers of HBES, including both Ed and myself. Ed must have relished the opportunity to talk at a professional level with someone as well informed as Barry because his interview lasted two hours. I was president of the newly founded Evolution Institute and Editor in Chief of its online magazine This View of Life (TVOL), which was named after the final passage of Darwin’s On the Origin of Species (“There is grandeur in this view of life…”). I was eager to feature a print version of Barry’s interview with Ed on TVOL, so I offered to transcribe it myself. There is something about transcribing a recording, word by word, that burns it into your memory more than merely listening to the recording or reading the transcription. This experience adds to my knowledge of Ed and his legacies, along with his published work and my personal relationship with him.

The Six Legacies

History—including the history of science—is a complex systemic process involving many actors and environmental (including cultural) contingencies. Attention often becomes focused on a few key people, such as Albert Einstein, Sigmund Freud, and B.F. Skinner, which under-represents the contributions of many dozens of others. Iconic status is thrust upon a person as much as actively sought by the person. There seems to be a need to personify ideas as a form of simplification, among the general public and even, to a degree, among the experts.

A few evolutionary biologists such as Ed Wilson, Richard Dawkins, and the late Stephen Jay Gould have achieved this iconic status. Yes, they made outsized contributions as individuals, but they also represent something larger than themselves. I think that Ed would agree. In his book Sociobiology: The New Synthesis, for example, he was relying upon the work of many hundreds of scientists to support his claim that there can be a single theory of social behavior informed by evolution.

The word “catalyst” also bears examination. In chemistry, a catalyst is a substance that increases the rate of a chemical reaction without being used up in the process. A catalytic molecule works by holding other molecules in an orientation that allows them to bind to each other, then releasing them so the catalyst can repeat the operation. A person can play a catalytic role in cultural change in much the same way. As we will see, Ed was a catalyst par excellence. He made things happen that otherwise would have occurred much more slowly or not at all.

Against this background, calling Ed an “icon” and a “catalyst” honors the individual while also going beyond the individual to examine systemic trends in the history of science. It is in this spirit that I will review his six legacies.     

1) His contributions to evolutionary biology.

Here is how Ed described his contribution to evolutionary biology in his interview with Barry Kuhle:

We have to go back to the 1950s. In the 1950s, the molecular revolution had begun. It was clear that the golden age of modern biology was going to be molecular and would endure a long time. In fact, it did occupy the second half of the 20th century and beyond. We felt here at Harvard immediately the pressure to start giving up positions to molecular biology. The Dean of the faculty and the President at that time were entirely in accord. We—I say we, the organismic and evolutionary biologists here, comparative anatomists, comparative zoologists and so on—realized that we would not be given much additional space anymore, that we probably would not get many if any new positions for a long time. They would be reserved to build up Harvard’s strength in molecular and cellular biology. What this did was have a tremendous impact on me personally because I realized…that those of us, my generation of what we came to call evolutionary biologists and organismic biologists, were not going to get anywhere by complaining by any means but we were going to have to—and we should be tremendously excited to plan this—develop an equivalent to molecular biology on our own.

Ed then set about trying to modernize the biology of whole organisms, as part of a younger generation following the architects of the Modern Synthesis, which included names such as Ernst Mayr, Julian Huxley, and George Gaylord Simpson. This required finding and collaborating with people who had complementary expertise—especially the ability to build mathematical models of ecological and evolutionary processes. Names that Ed mentions as part of this younger generation include Robert MacArthur, Larry Slobodkin, and Richard Lewontin. These were some of the rock stars whose work I avidly read as a graduate student in the 1970s.

One of Ed’s most productive collaborations was with Robert MacArthur, an ecologist with mathematical training, leading to their landmark book The Theory of Island Biogeography, published by Princeton University Press in 1967 with Ed as the second author. What made the book so important was a theoretical framework that made sense of the great mass of natural history information on the distribution and abundance of species on islands—some of it collected by Ed for ant species around the world. The theory applied not only to actual islands but to all habitats that are island-like, such as mountains separated by valleys or patches of forest separated by deforested areas.  
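The heart of that framework fits in two lines. What follows is a minimal sketch of the textbook linear-rates version, a simplification rather than the book’s full treatment:

```latex
% Immigration falls and extinction rises as the island fills with species:
\[
  \lambda(S) = I\left(1 - \frac{S}{P}\right), \qquad \mu(S) = E\,\frac{S}{P}
\]
% Richness equilibrates where the two rates cross, \lambda(S^*) = \mu(S^*):
\[
  S^* = \frac{IP}{I + E}
\]
% P = mainland species pool, I = maximal immigration rate,
% E = maximal extinction rate. Near islands (high I) and large
% islands (low E) support more species at equilibrium.
```

The payoff of the model is exactly the one described above: a single equilibrium prediction that organizes masses of natural history data on island size, isolation and species counts.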

While Ed played a prominent role in modernizing whole organism biology, he was by no means alone. Also during my time as a graduate student, a Nobel prize was awarded to Konrad Lorenz, Niko Tinbergen, and Karl von Frisch for pioneering the study of animal behavior, and the geneticist Theodosius Dobzhansky titled an article for biology teachers “Nothing in biology makes sense except in the light of evolution”. Evolutionary theory was proving its explanatory scope and many people were taking part in the effort. What this meant to me as a graduate student was that I could choose any topic, begin asking intelligent questions based on evolutionary theory (often with the help of mathematical models), and then test my hypotheses on any appropriate organism. I didn’t need to become a taxonomic specialist and I could change topics at will. In short, I could become a polymath, based not on my personal attributes but on a theory that anyone can learn. This is the legacy of evolutionary biology, to which Ed made an outsized contribution.

2) His contributions to the conservation of biodiversity

First and foremost a naturalist and an expert on ant taxonomy, Ed was passionate about the conservation of biological diversity and made room for it alongside his scientific career. His book Biophilia argued that we are genetically adapted to be surrounded by nature, with mental and physical health consequences if we are not. This bold conjecture has been largely supported by research. For example, hospital patients recover faster if their room has a window or is decorated with foliage and flowers.

Ed collaborated with Thomas Lovejoy, who coincidentally passed away just a day earlier at the age of 80, to preserve the biodiversity of the Amazon. According to a remembrance in the New Yorker magazine, it was they who coined the term biological diversity, which became shortened to biodiversity. They even drew upon the theory of Island Biogeography by studying the effect of the size of forest reserves on species loss.

With his gift for marketing whole disciplines and initiatives, Ed coined the term “Half Earth” for the goal of preserving half of the earth for nature and the other half for humankind—not in separation, but in a way that is interdigitated, so that humans can live within nature and nature can flow along corridors. Anyone who values nature should want to continue this legacy but doing so requires changing the minds and hearts of people, along with their cultural practices, in the real world.

3) His contributions to a sociobiology that includes humans

Ed’s 1975 book, Sociobiology: The New Synthesis, was in the same mold as Darwin’s “there is grandeur in this view of life” and Dobzhansky’s “nothing in biology makes sense except in the light of evolution”. Ed’s claim was that evolutionary theory provides a single conceptual toolkit for studying the social behaviors of all creatures great and small. Thanks to Ed’s gift for identifying whole fields of inquiry and writing for non-specialists, Sociobiology combined the authority of an academic tome with the look and feel of a coffee table book, complete with over 200 illustrations by the artist Sarah Landry. Thanks to his stature and gift for promotion, its publication was noted on the front page of the New York Times.

It was the last chapter on human social behavior that landed Ed in trouble and a systemic view of the history of science is needed to understand why. For all its explanatory scope, the study of evolution was restricted to genetic evolution for most of the 20th century, as if the only way that offspring can resemble their parents is by sharing the same genes. This is patently false when stated directly since it ignores the cultural transmission of traits entirely, but it essentially describes what became known as the modern synthesis and was consolidated by the molecular biology revolution described by Ed in his interview with Barry Kuhle.

What became of the study of cultural evolution? It was ceded to other disciplines in the human social sciences and humanities. Each discipline developed into a sophisticated body of knowledge, but without reference, and sometimes in perceived opposition, to evolutionary theory. Nor did those disciplines become integrated with each other. Instead, they became an archipelago of knowledge with little communication among the islands. The lack of consilience for human-related knowledge stands in stark contrast with the consilience of biological knowledge, at least when it comes to genetic evolution.

Darwin’s theory is often said to have earned a bad reputation for itself in the human-related disciplines by providing a moral justification for inequality (Social Darwinism). The real history of Darwinism in relation to human affairs is more complex and interesting. Socialists such as Peter Kropotkin and progressive thinkers such as William James and John Dewey were inspired by Darwin along with “nature red in tooth and claw” types. The bottom line is that any powerful tool can also be used as a weapon and Darwin’s theory is no different than any other theory in this regard.[1]

Returning to the reception of Sociobiology, when critics accused Ed of genetic determinism, they were absolutely right. The entire field of evolutionary biology was gene-centric and Ed was no exception. Yet, critics from the human social sciences and humanities had no synthesis of their own.

Only after the publication of Sociobiology did evolutionary thinkers begin to take cultural evolution seriously. Ed was among them with books such as On Human Nature; Genes, Mind, and Culture (with Charles J. Lumsden); Promethean Fire (also with Lumsden); and The Social Conquest of Earth. Other major thinkers included Richard Dawkins and his concept of memes, Luigi Luca Cavalli-Sforza and Marcus Feldman (Cultural Transmission and Evolution), and Robert Boyd and Peter Richerson (Culture and the Evolutionary Process, Not By Genes Alone). The importance of symbolic thought began to occupy center stage with books such as The Symbolic Species by Terrence Deacon and Evolution in Four Dimensions by Eva Jablonka and Marion Lamb.

Today, Darwinian evolution is widely defined as any process that combines the three ingredients of variation, selection, and replication, no matter what the mechanism of replication. This definition is true to Darwin’s thought (since he knew nothing about genes) and can accommodate a plurality of inheritance mechanisms such as epigenetics (based on changes in gene expression rather than gene frequency), forms of social learning found in many species, and forms of symbolic thought that are distinctively human. While human cultural inheritance mechanisms evolved by genetic evolution, that doesn’t make them subordinate, as if genes hold cultures on a leash (one of Ed’s metaphors). On the contrary, as the faster evolutionary process, cultural evolution often takes the lead in adapting humans to their environments, with genetic evolution playing a following role (gene-culture co-evolution).

Part of the maturation of human cultural evolutionary theory is the recognition of group selection as an exceptionally strong force in human evolution—something else that Ed got right. According to Harvard evolutionary anthropologist Richard Wrangham in his book The Goodness Paradox, naked aggression is over 100 times more frequent in a chimpanzee community than in small-scale human communities. This is due largely to social control mechanisms in human communities that suppress bullying and other forms of disruptive self-serving behaviors so that cooperation becomes the primary social strategy (this is called a major evolutionary transition). Nearly everything distinctive about our species is a form of cooperation, including our ability to maintain an inventory of symbols with shared meaning that is transmitted across generations. Our capacity for symbolic thought became a full-blown inheritance system that operates alongside genetic inheritance (dual inheritance theory). Cultural evolution is a multilevel process, no less than genetic evolution, and the increasing scale of cooperation over the course of human history can be seen as a process of multilevel cultural evolution.

While the critique of genetic determinism was accurate for Sociobiology and evolutionary biology as a whole in 1975, this is no longer the case for the modern study of humans from an evolutionary perspective—which brings us to Ed’s next legacy.

4) His contributions to the unification of knowledge.

Something that can be said about Ed’s books is that they are all visionary—imagining whole new fields of inquiry—but vary in the degree to which Ed has made progress carrying out the vision. He made the most progress for ants and other social insects, of course, and Sociobiology reflected a thorough reading of the literature on animal social behaviors. A book such as Consilience, however, is long on vision and short on execution.

I do not intend this observation as a criticism. Ed had only 24 hours in a day, like the rest of us, and his visionary gaze is worthwhile even if the execution is left to others. In Consilience, the vision is “a conviction, far deeper than a mere working proposition, that the world is orderly and can be explained by a small number of natural laws” (p. 4). While this vision stretches back to antiquity and includes knowledge of the physical world in addition to the living world, there is something about evolutionary theory that fulfills the vision for the living world in an extraordinary way. Here is how Ed describes his first encounter with evolutionary theory in the opening pages of Consilience. He’s an 18-year-old kid newly arrived at the University of Alabama, with a passion for identifying plants and animals using field guides.

Then I discovered evolution. Suddenly—that is not too strong a word—I saw the world in a wholly new way. This epiphany I owed to my mentor Ralph Chermock, an intense, chain-smoking young assistant professor newly arrived in the provinces with a Ph.D. in entomology from Cornell University. After listening to me natter for a while about my lofty goal of classifying all the ants of Alabama, he handed me a copy of Ernst Mayr’s 1942 Systematics and the Origin of Species. Read it, he said, if you want to become a real biologist.

The thin volume in the plain blue cover was one of the New Synthesis works, uniting the nineteenth-century Darwinian theory of evolution and modern genetics. By giving a theoretical structure to natural history, it vastly expanded the Linnaean enterprise. A tumbler fell somewhere in my mind, and a door opened to a new world. I was enthralled, couldn’t stop thinking about the implications evolution has for classification and for the rest of biology. And for philosophy. And for just about everything. Static pattern slid into fluid process…A new enthusiasm surged through me. The animals and plants I loved so dearly reentered the stage as lead players in a grand drama. Natural history was validated as real science.

Coincidentally, Ernst Mayr’s Animal Species and Evolution was one of the first evolution books that I read as an undergraduate student. While it was not thin (811 pp!), I was similarly enthralled. Compare Ed’s epiphany with passages from Charles Darwin, such as “I can remember the very spot on the road…” and “he who understands the baboon would do more toward metaphysics than Locke”, which was scribbled in his notebook in 1838. There is something about the simplicity and generality of evolutionary theory that starts working at the very beginning, for Darwin as the originator and Ed Wilson as an unschooled kid. Now recall what I said about being a graduate student in the 1970s—that I could become a polymath, based not on my personal attributes but on a theory that anyone can learn. What this means is that by the 1970s, what Darwin and Ed glimpsed from the start was now proving itself for the length and breadth of the biological sciences. Every time an evolutionary biologist decides to switch to a new topic and/or organism—which happens all the time—consilience is being demonstrated in action.

The prospect that human-related knowledge can become unified in this way is both old and new. It was how Darwin thought and he originated group selection theory as much to explain human morality as “for the good of the group” traits in nonhuman species. But you can’t make sense of humanity without acknowledging its groupish nature and the importance of culturally transmitted symbolic meaning systems. As Emile Durkheim wisely put it: “Social life, then, in every aspect and throughout its history, is only possible thanks to a vast body of symbolism.” Only now are we in a position to synthesize human-related knowledge in the same way as biological knowledge, thanks to an expanded definition of Darwinism as any variation/selection/replication process. Ed’s vision in Consilience is right on and its fulfillment is now in progress.

5) His encouraging stance toward young scientists and other learners.

No remembrance of Ed would be complete without noting the way that he encouraged people to become scientists, to follow their hearts, and to cultivate a reverence for nature. Visit #eowilson on Twitter and you’ll find quotes such as these offered by those whose lives he touched.

“Adults . . . are prone to undervalue the mental growth that occurs during daydreaming and aimless wandering.” — The late great Edward O. Wilson

“Nature first, then theory. Love the organisms for themselves first, then strain for general explanations, and with good fortune discoveries will follow.”

“You are capable of more than you know. Choose a goal that seems right for you and strive to be the best, however hard the path. Aim high. Behave honorably. Prepare to be alone at times, and to endure failure. Persist! The world needs all you can give.”

“Nature holds the key to our aesthetic, intellectual, cognitive and even spiritual satisfaction.”

“There can be no purpose more enspiriting than to begin the age of restoration, reweaving the wondrous diversity of life that still surrounds us.”

“The evolutionary epic is the best myth we will ever have.”

“You teach me, I forget. You show me, I remember. You involve me, I understand.”

“Humanity is part of nature, a species that evolved among other species. The more closely we identify ourselves with the rest of life, the more quickly we will be able to discover the sources of human sensibility and acquire the knowledge on which an enduring ethic, a sense of preferred direction, can be built.”

Passages such as these spell the difference between science and a science-based worldview. By itself, science merely tells us what is. A worldview provides a sense of values and motivates action. A science-based worldview does this based on reverence for the natural world rather than for a supernatural agency. Ed is remembered at least as much for the science-based worldview that he offered as for his scientific discoveries.

6) Ecosystems as Ed’s final frontier

Ed’s next book was to be titled “Ecosystems and the Harmony of Nature”. I don’t know if it will be published posthumously but we can get a glimpse of what he had in mind from its title, a brief article on the E.O. Wilson Biodiversity Foundation website,[2] and a short lecture on YouTube.[3]

In the article, Ed is quoted as saying: “We know that ecosystems, which are really what we are trying to protect—not just single species but ensembles of species that have come together and have reached the ability—sometimes over thousands or even in some places millions of years—have formed ecosystems that equilibrate. And we don’t really know how equilibration comes about.” Ed also encourages young people to join “the coming development of a new biological science, one of the next big things, which is ecosystem studies.”

I must confess that I am puzzled by these statements since the study of whole ecosystems dates back to the beginning of the 20th century and has become increasingly integrated with evolutionary ecology over the last 50 years. It turns out that multilevel selection theory is essential for understanding the nature of ecosystems, no less than single-species societies. I will be fascinated to learn whether Ed had converged upon this conclusion.

To explain what I mean, a critical distinction needs to be made between two meanings of the term “complex adaptive system (CAS)”: A complex system that is adaptive as a system (CAS1), and a complex system composed of agents following their respective adaptive strategies (CAS2). A human society in the grip of civil war is an example of CAS2. It can be understood in terms of the conflicting interests of the warring factions, but it does not function well at the level of the whole society (CAS1) and no one would expect it to.

Many single-species societies in nature are like my human civil war example. Members of social groups are largely in conflict with each other and at most cooperate in specific contexts. We need look no further than chimpanzee communities for an example, where naked aggression is over 100 times more frequent than in small-scale human communities and the main context for community-wide cooperation is aggression against neighboring communities. Social strife in chimpanzee communities is stable—there is no reason to expect it to change, given the selection pressures that are operating—but that doesn’t make them harmonious or desirable from a human perspective.

Many multispecies ecosystems are also like this. For example, if you want to understand the nature of beaver ecosystems, ask the question “what’s in it for the beavers?” They are modifying the environment for their own benefit, flooding it to protect themselves from predators and eating the most palatable plants. Consequences for biodiversity and ecosystem processes such as nutrient cycling are collateral effects of beavers pursuing their interests. There is no reason to expect the whole ecosystem to be functionally organized and harmonious, any more than a chimpanzee community or a human society in the grip of civil war.

This is a hard lesson to learn about nature. We want it to be harmonious. Religious cosmologies often portray nature as harmonious (e.g., the Garden of Eden) except when disturbed by humans. The early study of ecosystems often treated them axiomatically as harmonious.  But Darwin’s theory of evolution tells a different story. It tells us that functional organization for any given system, at any given scale, requires a process of selection at that scale. That is the only way to achieve the status of CAS1 rather than merely CAS2, where functionally organized agents impose suffering on each other in the course of pursuing their respective adaptive strategies. That statement goes for human society, single-species animal societies, and multispecies ecosystems.   

Are there examples of whole ecosystems that have evolved into superorganisms? Yes! Microbiomes are an example. Every multicellular organism is not only a collection of cells bearing mostly identical genes but also an ecosystem composed of trillions of microbes comprising thousands of species. When the host organisms differentially survive and reproduce, this is due in part to variation in their microbiomes along with variation in their genes. Thanks to selection at this level, microbiomes have evolved to be largely mutualistic with their hosts. There is also potential for selection among microbes within each host, however, leading to the evolution of pathogenic strains. It all depends on the level of selection.

Nowadays, whole forests are being imagined as mutualistic networks, with trees connected into a network by mycorrhizal fungi. Is such a thing possible? Yes, but only if selection has operated at the scale of whole forests with sufficient strength to counteract selection at lower scales. Otherwise, forests become merely CAS2 systems, composed of species that interact at cross purposes, rather than CAS1 systems.

Above all, it is important to avoid confusing “harmony” with “equilibrium”. Ecologists have started to use the word “regime” to describe stable assemblages of species. This is a well-chosen word because it evokes what we already know about human political regimes. All political regimes have a degree of stability, or we wouldn’t call them regimes, but they span the range from despotic (benefitting a few elites at the expense of everyone else) to inclusive (sharing their benefits with all citizens). Some of the worst regimes are also depressingly the most stable. Using the language of complex systems theory, there are multiple local stable equilibria and positive change requires escaping the gravitational pull of one local equilibrium to enter another local equilibrium. This requires active management and will not necessarily happen by itself. The management of ecosystems must itself be a human cultural evolutionary process informed by multilevel selection theory.

Combining the legacies

In this remembrance of Ed Wilson, I have tried to honor the person while also placing him in the context of broad trends in the history of science. Without mentioning Ed, we can say that Darwin’s theory of evolution has an amazing explanatory scope, that this scope was largely restricted to the study of genetic evolution for most of the 20th century, but now is rapidly expanding to include all aspects of humanity in addition to the rest of life. As I put it in my own book This View of Life: Completing the Darwinian Revolution, Dobzhansky’s statement “nothing in biology makes sense except in the light of evolution” can be extended to include everything associated with the words “human”, “culture”, and “policy”.

Without mentioning Ed, we can also say that evolutionary theory is capable of functioning as a worldview in addition to a body of scientific knowledge. Science only tells us what is, whereas a worldview inspires us psychologically and moves us to action. Creating a worldview informed entirely by science, as opposed to supernatural belief, is part of the enlightenment project that led to humanism as a philosophical worldview and social movement. While humanists accept Darwin’s theory as a matter of course, the recent developments that I have recounted have not been incorporated into the humanist movement for the most part. Thus, humanism and what it stands for is due for a renaissance, along with a renaissance of basic scientific knowledge.

Some simple calculations will help to put Ed’s career into historical perspective. Starting from when he received his Ph.D. in 1955 to his death in 2021, his career lasted for 66 years. If we mark the beginning of evolutionary science with the publication of Darwin’s On the Origin of Species in 1859, then Ed was present for 40% of the history of evolutionary thought. If we mark the beginning of the scientific revolution with the publication of Copernicus’s On the Revolutions of the Heavenly Spheres in 1543, then Ed was present for 14% of the scientific revolution. Being 20 years Ed’s junior, I get 28% and 10%, respectively.
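For readers who like to check such figures, the arithmetic is easy to reproduce; a quick sketch:

```python
# Reproducing the back-of-envelope figures in the paragraph above
ed_career = 2021 - 1955                  # Ph.D. to death: 66 years
evolutionary_science = 2021 - 1859       # since On the Origin of Species
scientific_revolution = 2021 - 1543      # since Copernicus

print(round(100 * ed_career / evolutionary_science))   # 41 -> "about 40%"
print(round(100 * ed_career / scientific_revolution))  # 14 -> "14%"
```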

These numbers remind us that evolutionary science and the scientific revolution are still works in progress. If science in general and evolutionary science in particular have revolutionized the way we see and therefore act upon the world, then we can look forward to further improvements in the near future. This leads to a form of hope and optimism, even in the darkest of times, that is part of Ed’s legacy.

For me, the next frontier is not just ecosystems but becoming wise stewards of evolution in all its forms. Variation/selection/replication processes are taking place all around us at different time scales, including genetic evolution, cultural evolution, and intra-generational personal evolution. Without wise stewardship, these evolutionary processes result merely in CAS2—complex systems composed of agents following their respective adaptive strategies, often inflicting harm on each other and on the entire system over the long term. Work is required to transform CAS2 into CAS1—systems that are adaptive as whole systems. This work will be required for all forms of positive change—individual, cultural, and ecosystemic. The ability to see this clearly and to act upon it has only become available during the last few decades and is currently shared by only a tiny fraction of those who need to know about it. Catalysis is needed, so that positive evolution can take place in a matter of years rather than decades or not at all. The best way to honor Ed’s combined legacies is to join in this catalysis.

References:

[1] For more, see the TVOL special edition titled “Truth and Reconciliation for Social Darwinism”.

[2] https://eowilsonfoundation.org/inspiring-a-new-generation-to-fight-for-biodiversity/

[3] https://thefestivalofdiscovery.com/session/watch-now-e-o-wilson-ecosystems-and-the-harmony-of-nature/

Game theory and economics show how to steer evolution in a better direction (Science Daily)

Date: November 16, 2021

Source: PLOS

Summary: Human behavior drives the evolution of biological organisms in ways that can profoundly adversely impact human welfare. Understanding people’s incentives when they do so is essential to identify policies and other strategies to improve evolutionary outcomes. In a new study, researchers bring the tools of economics and game theory to evolution management.


Human behavior drives the evolution of biological organisms in ways that can profoundly adversely impact human welfare. Understanding people’s incentives when they do so is essential to identify policies and other strategies to improve evolutionary outcomes. In a new study published November 16th in the open access journal PLOS Biology, researchers led by Troy Day at Queen’s University and David McAdams at Duke University bring the tools of economics and game theory to evolution management.

From antibiotic-resistant bacteria that endanger our health to control-resistant crop pests that threaten to undermine global food production, we are now facing the harmful consequences of our failure to efficiently manage the evolution of the biological world. As Day explains, “By modelling the joint economic and evolutionary consequences of people’s actions we can determine how best to incentivize behavior that is evolutionarily desirable.”

The centerpiece of the new analysis is a simple mathematical formula that determines when physicians, farmers, and other “evolution managers” will have sufficient incentive to steward the biological resources that are under their control, trading off the short-term costs of stewardship against the long-term benefits of delaying adverse evolution.

For instance, when a patient arrives in an urgent-care facility, screening them to determine if they are colonized by a dangerous superbug is costly, but protects future patients by allowing superbug carriers to be isolated from others. Whether the facility itself gains from screening patients depends on how it weighs these costs and benefits.
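The paper’s actual formula is not reproduced in this summary, but the flavor of the trade-off can be sketched as a discounted cost-benefit threshold. The function and all the numbers below are illustrative assumptions, not Day and McAdams’ model:

```python
def screening_worthwhile(cost_per_patient, cost_per_infection, carrier_rate,
                         infections_averted, discount=0.9):
    """Screen when the expected (discounted) benefit of isolating carriers
    exceeds the up-front screening cost. Illustrative sketch only."""
    expected_benefit = carrier_rate * infections_averted * cost_per_infection * discount
    return expected_benefit > cost_per_patient

# Hypothetical urgent-care numbers: screening costs $40 per patient; 2% of
# patients carry a superbug; isolating a carrier averts 3 infections, each
# costing the facility $1,000 down the line.
print(screening_worthwhile(40, 1000, 0.02, 3))  # True: 0.02 * 3 * 1000 * 0.9 = 54 > 40
```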

The researchers take the mathematical model further by implementing game theory, which analyzes how individuals’ decisions are interconnected and can impact each other — such as physicians in the same facility whose patients can infect each other or corn farmers with neighboring fields. Their game-theoretic analysis identifies conditions under which outcomes can be improved through policies that change incentives or facilitate coordination.

“In the example of antibiotic-resistant bacteria, hospitals could go above and beyond to control the spread of superbugs through methods like community contact tracing,” McAdams says. “This would entail additional costs and, alone, a hospital would likely not have an incentive to do so. But if every hospital took this additional step, they might all collectively benefit from slowing the spread of these bacteria. Game theory gives you a systematic way to think through those possibilities and maximize overall welfare.”
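A toy two-player version of the hospitals example shows why coordination matters. With the hypothetical payoffs below, the benefit of slowed superbug spread accrues only if both hospitals invest, so neither wants to invest alone, yet both investing beats neither doing so:

```python
from itertools import product

COST, JOINT_BENEFIT = 5, 8  # hypothetical units of welfare

def payoff(me_invests, other_invests):
    """One hospital's payoff: pay the cost if investing; reap the benefit
    of slowed superbug spread only if both hospitals invest."""
    return (JOINT_BENEFIT if me_invests and other_invests else 0) - (COST if me_invests else 0)

for a, b in product([True, False], repeat=2):
    print(f"A invests={a!s:5}  B invests={b!s:5}  ->  A: {payoff(a, b):2}, B: {payoff(b, a):2}")

# If B invests, A prefers to invest (3 > 0); if B does not, A prefers not
# to (0 > -5). Both (invest, invest) and (don't, don't) are equilibria --
# a coordination problem that policy can tip toward the better outcome.
```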

“Evolutionary change in response to human interventions, such as the evolution of resistance in response to drug treatment or evolutionary change in response to harvesting, can have significant economic repercussions,” Day adds. “We determine the conditions under which it is economically beneficial to employ costly strategies that limit evolution and thereby preserve the value of biological resources for longer.”


Journal Reference:

  1. Troy Day, David A. Kennedy, Andrew F. Read, David McAdams. The economics of managing evolution. PLOS Biology, 2021; 19 (11): e3001409 DOI: 10.1371/journal.pbio.3001409

Out of Savannastan by Tim Flannery (New York Review of Books)

New York Review of Books, November 4, 2021

By Tim Flannery

Ancient Bones: Unearthing the Astonishing New Story of How We Became Human by Madelaine Böhme, Rüdiger Braun, and Florian Breier, translated from the German by Jane Billinghurst and with a foreword by David R. Begun. Greystone, 337 pp., $34.95

In 1863 the biologist T.H. Huxley proposed an African origin for humanity. Known as “Darwin’s bulldog” for his ferocious defense of Darwin’s evolutionary theory, he had been struck by the distribution in Africa of our nearest living relatives, the common chimpanzee and the gorilla. (The latter had first been described by Europeans just sixteen years earlier, in 1847.) Darwin himself, however, demurred. Aware of the discovery of fossils of apes in Europe dating to the Miocene Epoch (around 23 to 5 million years ago), he opined that “since so remote a period the Earth has certainly undergone many great revolutions, and there has been ample time for migration on the largest scale.”

It was the pioneering and indefatigable Leakey family who found evidence for Huxley’s narrowly supported hypothesis. Louis and Mary Leakey began their search for fossils of human ancestors in Olduvai Gorge, in what is now Tanzania, in the 1930s. Amid the dust, sweat, and inconvenience of remote field camps, they simultaneously dug for fossils and raised three boys, often finding nothing of significance for years at a time. Then, in 1959, Mary discovered a fossilized skull that made headlines around the world. Paranthropus boisei, as it became known, belonged to a male upright ape who had stood around five feet high, weighed 110 pounds, and lived 1.8 million years ago. With powerful teeth and a prominent crest atop his braincase to anchor prodigious chewing muscles, he was an archetypal “ape man.” I recall as a child staring awestruck at a painting of Paranthropus that combined the features of gorillas, chimps, and humans, and that powerfully cemented in my mind the idea that Africa had been humanity’s cradle.

A few months after this discovery, the Leakeys made a second, even more significant find—a jaw attributable to an early member of our own genus. Homo habilis, or “Handy Man,” was a toolmaker hailed as the oldest “true” human ever discovered. After that, the discoveries just kept coming. In 1974 an international team in Ethiopia led by the paleoanthropologist Donald Johanson unearthed the skeleton of the three-foot-tall bipedal ape Australopithecus afarensis, who became popularly known as Lucy. With a catchy name and providing powerful, easy-to-understand support for an African origin, Lucy soon became a household name. Four years later Mary Leakey found 3.6-million-year-old hominin footprints at Laetoli, Tanzania, providing the earliest evidence of bipedalism.

In 1984 a team led by Louis and Mary Leakey’s son Richard unearthed a skeleton of Homo erectus at Lake Turkana in northern Kenya that was 90 percent complete. It seemed as if these astonishing African fossils illustrated most of the important steps in the human evolutionary story. When, beginning in the 1980s, genetic evidence suggested that our species (Homo sapiens) originated in Africa, the case seemed settled: Huxley, rather than Darwin, had been right about our origins. Some researchers began elaborating an all-encompassing Out of Africa theory, which had three components: (1) our hominin lineage (which split from chimpanzees between 13 and 7 million years ago) arose in Africa; (2) our genus, Homo, arose in Africa about 2.3 million years ago; and (3) our species originated in Africa about 300,000 years ago.

But there were always a few dissenters who, like Darwin, felt that the significance of fossilized fragments from Europe and Asia had been overlooked. They pointed to a suspicious gap in the African fossil record between 12 and 6 million years ago, just when the human and chimpanzee lineages were diverging. And some worried that the Leakeys and others had found fossils only where they looked for them—in Africa. If equivalent effort were put in elsewhere, skeptics argued, important finds might be made.

These objections had long been ignored, but now, in her splendid and important new book Ancient Bones, Madelaine Böhme and her collaborators Rüdiger Braun and Florian Breier have taken them up. Scientifically rigorous and written with a clarity and candor that create a gripping tale, it presents a powerful challenge to proponents of the Out of Africa hypothesis. The book begins with a foreword by one of the earliest and most prominent objectors to the hypothesis, the University of Toronto professor David R. Begun. Begun believes that apes became extinct in Africa around 12 million years ago and that our earliest direct ancestors evolved in Europe, which is rich in ape fossils from 12 to 6 million years old. Böhme, a terrestrial paleoclimatologist and paleoanthropologist at the University of Tübingen, has excavated and researched many specimens of European apes herself, and her account of the history of Europe’s lost apes is imbued with the sweat, grime, and triumph that is the lot of the fieldworker, and carries great authority.

As Böhme illustrates, the evolution of the human lineage is complex. A crucial event occurred around 25 million years ago, when the apes and Old World monkeys originated from a common ancestor in East Africa. The monkeys flourished in Africa, but as time went on the apes dwindled, until around 16 million years ago some reached Europe, where they thrived. Climatic changes in Europe, including increased seasonality, seem to have favored their diversification, and twelve genera are now known from the European Miocene, varying from gibbon-like creatures that swung through the forest canopy to gorilla-sized, presumably terrestrial ramblers.

As oak and beech trees started to crowd out the tropical vegetation that had dominated Europe till then, the apes were forced to alter their diet. Depending on which part of Europe they lived in, they had to go for between two and four months without fresh leaves, fruits, or nuts. Around 15 million years ago, a genetic mutation occurred that resulted in their inability to produce uricase, the enzyme used by mammals to break down uric acid so that it can be excreted in urine. This mutation led to high levels of uric acid in the apes’ blood, allowing them to rapidly convert fructose into fat. And fat, stored in the liver and other tissues, is an energy reserve that made it possible for the apes to survive lean seasons.

I often curse this adaptation, for I’m a victim of that singularly painful condition, gout, which is caused by a buildup of uric acid in the blood. Were it not for the availability of uricase in pill form (thank God for modern medicine!), I’d be a bedridden old grouch by now. But gout is just one of the many “diseases of civilization” inflicted on us by this adaptation in our ape ancestors. Diabetes, obesity, high blood pressure, and heart disease are all related to some degree to the loss, in some long-extinct European ape, of the ability to remove uric acid from the blood.

One of Böhme’s most important fossil finds was made near Kaufbeuren, in southern Germany. There, while visiting a lignite pit, she examined small black lumps of what was supposedly coal, only to discover that they were ancient bones. The deposit was about to be mined and destroyed, and, with no alternative, Böhme asked that twenty-five tons of fossil-rich sediment be scooped up and dumped where paleontologists could sort through it without interrupting the quarrying. After two field seasons of arduous work, she recovered 15 percent of the skeleton of a single great ape, along with fragments from three others. Named Danuvius guggenmosi, the creature had lived 11.62 million years ago, in a subtropical environment. At just three feet tall and weighing around sixty-five pounds, Danuvius had big, powerful thumbs and toes and an elongated lower back that permitted an upright stance. Böhme quips that “from the waist up he looked like an ape and from the waist down he looked like an early hominin.” Danuvius is in fact one of the candidates for the last common ancestor of chimps and humans.

As the climate cooled later in the Miocene, savanna replaced forest in some parts of Europe, and this had a big impact on the continent’s apes. According to Böhme, a crucial piece of evidence indicating what happened was unearthed in June 1944, when besieged German soldiers dug a bunker near Athens. Bruno von Freyberg, a geology professor from Erlangen who was then serving in the German army, asked his workers to alert him to any fossils they encountered. Despite having lost an arm in World War I, Freyberg personally unearthed the finds, including the jaw of an ape, then sent his fossils to the Natural History Museum in Berlin for safekeeping. But the museum was bombed on February 3, 1945, and the priceless jawbone was severely damaged, losing most of its teeth.

In 1969 the great paleoanthropologist Gustav Heinrich Ralph von Koenigswald examined the damaged bone and named it Graecopithecus freybergi—Freyberg’s Greek ape. But it was so extensively mangled that other researchers concluded it was not identifiable, and so sought to suppress Koenigswald’s name. The jawbone might have been forgotten altogether but for Böhme, who tracked it down to a long-forgotten safe in a university department. When she had the jaw x-rayed, she saw that the roots of the teeth shared unique features with those of the subfamily Homininae, to which humans belong. She also redated the find, establishing that it was 7.175 million years old.

Her conclusion that the oldest human ancestor had lived in Greece around six to seven million years ago was so inconsistent with the dominant Out of Africa hypothesis that the paleoanthropological community largely reacted with stunned silence. But then, within months of Böhme’s analysis of Graecopithecus being published in 2017, a second, even more stunning and unexpected discovery was announced.

In 2002 the Polish paleontologist Gerard Gierliński had been vacationing with his girlfriend near Trachilos, Crete. On a slab of rock by the water he saw oblong marks that he recognized as fossilized footprints. But he didn’t follow up until 2010, when he mentioned them to a colleague; the two scientists hypothesized that the footprints might have been made by a bipedal ape. Analysis revealed that the feet that had left the tracks were small (between 4 and 8.5 inches long) and had five toes, a pronounced ball of the foot, and a big toe aligned with the other toes. The feet that left the prints undeniably resembled humans’ feet but lacked some features, such as an arch. Astonishingly, dating revealed that the prints were made more than six million years ago, when Crete was a long, southward-projecting peninsula of Europe.

I recall my own skepticism upon reading of this find: the discovery of six-million-year-old humanlike footprints on a Greek island seemed too outlandish. And evidently the paleoanthropological community felt similarly, for Gierliński and his colleagues had tried in vain for six and a half years to get their results published. According to Böhme, the manuscript was repeatedly rejected by anonymous reviewers whose reasoning was often difficult to decipher. But following the publication of Böhme’s reanalysis of Graecopithecus, Gierliński’s paper on the Trachilos footprints finally made it to press.

Böhme thinks that the tracks could have been left by Graecopithecus around the time upright apes migrated from Europe back to Africa, allowing them to repopulate a continent that they had been absent from for six million years. Whatever the case, there is no doubt that Graecopithecus and the Trachilos footprints present a strong challenge to the first part of the Out of Africa theory.

To most proponents of the Out of Africa theory, many of whom have invested lifetimes excavating sites in Africa, claims about human origins in Europe are heretical. A sense of just how high the stakes are can be gained from the controversy surrounding the discovery at the turn of the twenty-first century of the skull of Sahelanthropus tchadensis, a hominid species. The skull—which was found in the desert in Chad and studied by Professor Michel Brunet, then at the University of Poitiers—is thought to be six million years old and has been used to support the theory that the oldest human ancestor lived in North Africa six to seven million years ago. This finding has been widely accepted and celebrated: there is a street on the campus in Poitiers named for Brunet, and a parking garage named for Toumaï, as the skull is popularly known.

The skull is horribly fractured, and the area where it articulated with the spinal column is heavily damaged. The reconstruction by Brunet’s team made it appear that the skull sat atop the vertebral column, as it does in bipedal apes. But others disagreed, saying that the articulation was farther back, as in gorillas. Indeed, critics say, the skull has a number of gorilla-like features and may belong to an ancestral gorilla.

There matters might have remained, if not for the publication of a photograph of the skull as it was upon discovery. It lay in sand, surrounded by a scatter of other bones including a thighbone that was possibly part of the same individual as the Sahelanthropus skull. While Brunet was doing fieldwork, Aude Bergeret, a Ph.D. student who was studying the bones in her lab, concluded that the thighbone belonged to a great ape and that Sahelanthropus was not bipedal. According to Böhme, when Bergeret’s assertion became known, “the thighbone disappeared without a trace and the doctoral student lost her position at the university.”

In 2018 Bergeret and a colleague offered to give a presentation on the thighbone at the annual meeting of the Société d’Anthropologie de Paris, but they were refused. “Could it be,” Böhme asks, “that Michel Brunet, one of the icons of French science, Knight of the Légion d’honneur, recipient of the Ordre national du Mérite, did not want to be challenged?”

Questions about Sahelanthropus continue to pile up. Because the bones were found not in the sediments that preserved them but in sand drifts, it is unclear how old they are. And is Sahelanthropus an early gorilla or a member of the human lineage? The fossil record of gorillas is almost entirely unknown, so the discovery of an ancestral gorilla would be of huge significance. But it’s hard to imagine a street in a university being named for the discoverer of such a fossil.

The second part of the Out of Africa hypothesis states that the genus Homo evolved in Africa. Böhme strongly challenges this, arguing instead that our genus evolved in a great, now fragmented grassy woodland known as Savannastan, which covered parts of Europe, Asia, and Africa 2.6 million years ago. In support of the idea, she cites 1.8-million-year-old Homo skeletons from Georgia and, more intriguingly, a jaw and a few isolated teeth found in cave sediments in Longgupo Cave, in Wushan County in China’s Sichuan Province. The Chinese fossils were named as a new species, Homo wushanensis, by researchers in 1991, and according to Böhme the remains are between 2.6 and 2.48 million years old. As the oldest Homo habilis remains from Africa are only 2.3 million years old, the dating of the Chinese finds, if verified, would pose a direct challenge to part two of the Out of Africa hypothesis.

But interpretation of the fragmented remains of Homo wushanensis is complicated. In 2009 Russell Ciochon, an American researcher who described Homo wushanensis, declared that he had made a mistake. The jaw and some of the teeth did not belong to an early human, he said, but to one or more “mystery apes.”

His retraction was acclaimed by some as a welcome act of intellectual honesty in a field characterized by fierce rivalry. Yet it has hardly settled matters. Böhme, for example, notes that stone tools were also found in Longgupo Cave, suggesting the presence of early humans. Others have speculated that the tools (along with some of the teeth) may have found their way into the deposit from more recent sediments, but Böhme is not satisfied by this explanation. Instead, she asks of Ciochon’s retraction, “Why the spectacular retreat? Was it to avoid jeopardizing the Out of Africa … hypothesis?”

Böhme, it seems, is just as determined to defend her hypothesis as the Out of Africanistas are to defend theirs.

Ancient Bones makes clear that Graecopithecus and the Trachilos footprints provide convincing evidence that our earliest direct ancestors evolved in Europe, and that they were walking upright as early as six million years ago. But the book, I think, is overly confident in its challenge to the idea that the genus Homo arose in Africa. That’s because, while there are intriguing clues that Homo may have been present in Europe or Asia before the oldest African finds (which date to around 2.3 million years ago), the evidence is far from conclusive. And of course the third part of the Out of Africa hypothesis, that Homo sapiens evolved in Africa, remains unchallenged—though the recent discovery that all living people carry genes from other hominin lineages, such as Neanderthals and Denisovans, which appear to have evolved in Europe and Asia, respectively, adds an intriguing twist to the tale.

What Ancient Bones does make clear, however, is that we place far too much emphasis on rewarding the discovery of our ancestors. In science, a discovery that leads in an unexpected direction, or even to a dead end, is often as productive as a lucky find. If we could only get past the great egos that swell in the field of paleoanthropology and reward the search as much as we do the discovery! But that, perhaps, would require an objectivity and generosity that aren’t entirely human.

Study: Evolution now accepted by majority of Americans (EurekAlert!)

News Release 20-Aug-2021

Peer-Reviewed Publication

University of Michigan

The level of public acceptance of evolution in the United States is now solidly above the halfway mark, according to a new study based on a series of national public opinion surveys conducted over the last 35 years.

“From 1985 to 2010, there was a statistical dead heat between acceptance and rejection of evolution,” said lead researcher Jon D. Miller of the Institute for Social Research at the University of Michigan. “But acceptance then surged, becoming the majority position in 2016.”

Examining data over 35 years, the study consistently identified aspects of education—civic science literacy, taking college courses in science and having a college degree—as the strongest factors leading to the acceptance of evolution.

“Almost twice as many Americans held a college degree in 2018 as in 1988,” said co-author Mark Ackerman, a researcher at Michigan Engineering, the U-M School of Information and Michigan Medicine. “It’s hard to earn a college degree without acquiring at least a little respect for the success of science.”

The researchers analyzed a collection of biennial surveys from the National Science Board, several national surveys funded by units of the National Science Foundation, and a series focused on adult civic literacy funded by NASA. Beginning in 1985, these national samples of U.S. adults were asked to agree or disagree with this statement: “Human beings, as we know them today, developed from earlier species of animals.”

The series of surveys showed that Americans were evenly divided on the question of evolution from 1985 to 2007. According to a 2005 study of the acceptance of evolution in 34 developed nations, led by Miller, only Turkey, at 27%, scored lower than the United States. But over the last decade, until 2019, the percentage of American adults who agreed with this statement increased from 40% to 54%.

The current study consistently identified religious fundamentalism as the strongest factor leading to the rejection of evolution. While their numbers declined slightly in the last decade, approximately 30% of Americans continue to be religious fundamentalists as defined in the study. But even those who scored highest on the scale of religious fundamentalism shifted toward acceptance of evolution, rising from 8% in 1988 to 32% in 2019.

Miller predicted that religious fundamentalism would continue to impede the public acceptance of evolution. 

“Such beliefs are not only tenacious but also, increasingly, politicized,” he said, citing a widening gap between Republican and Democratic acceptance of evolution. 

As of 2019, 34% of conservative Republicans accepted evolution compared to 83% of liberal Democrats.

The study is published in the journal Public Understanding of Science.

Besides Miller and Ackerman, the authors are Eugenie Scott and Glenn Branch of the National Center for Science Education; Belén Laspra of the University of Oviedo in Spain; Carmelo Polino of the University of Oviedo and Centre Redes in Argentina; and Jordan Huffaker of U-M.

Study abstract: Public acceptance of evolution in the United States, 1985-2020 


Why we are the only human species on the planet (El País)

brasil.elpais.com

Nuño Domínguez, 4 Jul 2021 – 12:48 BRT

Three major discoveries made in recent days force us to rethink the origins of humanity


Three discoveries in recent days have just changed what we knew about the origin of humankind and of our own species, Homo sapiens. Perhaps, some experts say, we need to abandon that concept when referring to ourselves, because the new findings suggest we are a Frankenstein creature assembled from parts of other human species with which, not so long ago, we shared the planet, sex, and children.

The past week’s discoveries indicate that about 200,000 years ago there were up to eight different human species or groups. All belonged to the genus Homo, which includes us. The newcomers show an intriguing mixture of primitive traits (enormous brow ridges, a flattened skull) and modern ones. The “Dragon Man” from China had a cranial capacity as large as that of present-day humans, or even larger. The Nesher Ramla Homo, found in Israel, may be the population that gave rise to the Neanderthals and the Denisovans, who occupied Europe and Asia respectively, and with whom our species had repeated sexual encounters that produced mixed children, accepted into their respective tribes as one of their own.

We now know that, because of those crossings, everyone outside Africa carries 3% Neanderthal DNA, and that the inhabitants of Tibet carry genes passed on by the Denisovans that allow them to live at high altitude. Something far more unsettling emerged from genetic analysis of present-day populations in New Guinea: the Denisovans, a sister branch of the Neanderthals, may have survived until just 15,000 years ago, a very short span in evolutionary terms.

The third major discovery of recent days reads almost like detective work. Analysis of DNA preserved in the soil of Denisova Cave, in Siberia, turned up genetic material from the cave’s native humans, the Denisovans, as well as from Neanderthals and from sapiens, in periods so close together that they may even have overlapped. Three years ago the cave yielded the remains of the first known hybrid between human species: a girl whose mother was a Neanderthal and whose father was a Denisovan.

The paleoanthropologist Florent Detroit brought to science another of these new human species: Homo luzonensis, which lived on an island in the Philippines 67,000 years ago and shows a strange mixture of traits that may be the result of more than a million years of evolution in isolation. Its story somewhat resembles that of its contemporary Homo floresiensis, or “Flores Man,” a human roughly 1.5 meters tall who lived on an Indonesian island. It had a brain the size of a chimpanzee’s, but by the intelligence test paleoanthropologists use most, it was as advanced as sapiens, for its stone tools were just as sophisticated.

X-ray image of the Nesher Ramla ‘Homo’ jaw discovered in Israel. Ariel Pokhojaev

To these two island dwellers we can add Homo erectus, the first traveling Homo, which left Africa about two million years ago, conquered Asia, and lived there until at least 100,000 years ago. The eighth passenger in this story would be Homo daliensis, a fossil found in China that mixes erectus and sapiens traits, though it may end up being folded into the new Homo longi lineage.

“It does not surprise me that several human species were alive at the same time,” says Detroit. “If we consider the last geological period, which began 2.5 million years ago, there have always been different genera and species of hominids sharing the planet. The great exception is the present; never before had there been just one human species on Earth,” he acknowledges. Why are we, sapiens, the only survivors?

For Juan Luis Arsuaga, paleoanthropologist at the Atapuerca archaeological site in northern Spain, the answer is that “we are a hypersocial species, the only one capable of building bonds beyond kinship, unlike all other mammals.” “We share consensual fictions such as homeland, religion, language, football clubs; and we will sacrifice a great deal for them,” he notes. Not even the human species closest to us, the Neanderthals, who created ornaments, symbols, and art, behaved this way. Arsuaga sums it up: “The Neanderthals had no flag.” For reasons still unknown, that species went extinct about 40,000 years ago.

Sapiens were not “strictly superior” to their congeners, says Antonio Rosas, paleoanthropologist at the Spanish National Research Council. “We now know that we are the result of hybridizations with other species, and the set of traits we have was the perfect one for that moment,” he explains. A possible additional advantage is that sapiens groups were larger than Neanderthal ones, which means less inbreeding and healthier populations.

Detroit believes part of the explanation lies in the very essence of our species, sapiens, Latin for “wise.” “We have an enormous brain that we must feed, so we need many resources and therefore a great deal of territory,” he points out. “Homo sapiens underwent an enormous demographic expansion, and it is quite possible that the competition for territory was very hard on the other species,” he adds.

María Martinón-Torres, director of the National Research Center on Human Evolution, based in Burgos, believes the secret is “hyperadaptability.” “Ours is an invasive species, not necessarily ill-intentioned, but we are like the Attila’s horse of evolution,” she says. “Wherever we pass, with our way of life, biological diversity declines, including human diversity. We are one of the most impactful ecological forces on the planet, and that history, ours, began to take shape in the Pleistocene [the period that began 2.5 million years ago and ended about 10,000 years ago, by which time sapiens was already the only human species left on the planet],” she adds.

The discoveries of recent days once again expose a growing problem: scientists keep naming more and more human species. Does that make sense? For the Israeli paleoanthropologist Israel Hershkovitz, who discovered the Nesher Ramla Homo, it does not. “There are too many species,” he says. “The classical definition says that two different species cannot have fertile offspring. DNA tells us that sapiens, Neanderthals, and Denisovans did, so they should be considered the same species,” he points out.

“If we are sapiens, then those species that are our ancestors through interbreeding are sapiens too,” adds João Zilhão, professor at the Catalan Institution for Research and Advanced Studies at the University of Barcelona.

This question divides the specialists. “Hybridization is very common in living species, especially in the plant world,” recalls José María Bermúdez de Castro, co-director of research at Atapuerca. “The concept of species can be nuanced, but I don’t think we can abandon it, because it is very useful for understanding one another,” he stresses.

Excavations at the Nesher Ramla archaeological site. Zaidner

Many nuances come into play here. The evident difference between sapiens and Neanderthals is not the same thing as the species identity of Homo luzonensis, known only from a few bones and teeth, or of the Denisovans, for whom most of the information comes from DNA extracted from tiny fossils.

“Curiously, despite the frequent crossings, both sapiens and Neanderthals remained perfectly recognizable and distinguishable species to the end,” Martinón-Torres stresses. “The traits of the late Neanderthals are more marked than those of earlier ones, rather than having been erased by interbreeding. There were biological, and perhaps also cultural, exchanges, but neither species stopped being itself: distinctive, recognizable in its biology, its appearance, its specific adaptations, its ecological niche throughout its evolutionary history. I believe this is the best example that hybridization does not necessarily collide with the concept of species,” she concludes. Her colleague Hershkovitz warns that the debate will continue: “We are excavating three other caves in Israel where we have found human fossils that will give us a new perspective on human evolution.”

UMaine researchers: Culture drives human evolution more than genetics (EurekAlert!)

News Release 2-Jun-2021

University of Maine

Research News

In a new study, University of Maine researchers found that culture helps humans adapt to their environment and overcome challenges better and faster than genetics.

After conducting an extensive review of the literature and evidence of long-term human evolution, scientists Tim Waring and Zach Wood concluded that humans are experiencing a “special evolutionary transition” in which the importance of culture, such as learned knowledge, practices and skills, is surpassing the value of genes as the primary driver of human evolution.

Culture is an under-appreciated factor in human evolution, Waring says. Like genes, culture helps people adjust to their environment and meet the challenges of survival and reproduction. Culture, however, does so more effectively than genes because the transfer of knowledge is faster and more flexible than the inheritance of genes, according to Waring and Wood.

Culture is a stronger mechanism of adaptation for a couple of reasons, Waring says. It’s faster: gene transfer occurs only once a generation, while cultural practices can be rapidly learned and frequently updated. Culture is also more flexible than genes: gene transfer is rigid and limited to the genetic information of two parents, while cultural transmission is based on flexible human learning and effectively unlimited with the ability to make use of information from peers and experts far beyond parents. As a result, cultural evolution is a stronger type of adaptation than old genetics.
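The rate argument can be made concrete with a toy comparison. In the sketch below (illustrative assumptions only), both tracks adjust a trait toward an environmental optimum with the same per-update learning rate, but the “cultural” track updates twenty times per generation while the “genetic” track updates once:

```python
def adapt(trait, optimum, rate, updates):
    """Move a trait toward an optimum, a fixed fraction per update."""
    for _ in range(updates):
        trait += rate * (optimum - trait)
    return trait

optimum = 10.0
genetic = adapt(0.0, optimum, rate=0.05, updates=1)    # one update per generation
cultural = adapt(0.0, optimum, rate=0.05, updates=20)  # frequent learning within a generation
print(f"After one generation: genetic={genetic:.2f}, cultural={cultural:.2f}")
# genetic=0.50, cultural=6.42: the cultural track closes most of the gap
```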

Waring, an associate professor of social-ecological systems modeling, and Wood, a postdoctoral research associate with the School of Biology and Ecology, have just published their findings in a literature review in the Proceedings of the Royal Society B, the flagship biological research journal of The Royal Society in London.

“This research explains why humans are such a unique species. We evolve both genetically and culturally over time, but we are slowly becoming ever more cultural and ever less genetic,” Waring says.

Culture has influenced how humans survive and evolve for millennia. According to Waring and Wood, the combination of both culture and genes has fueled several key adaptations in humans such as reduced aggression, cooperative inclinations, collaborative abilities and the capacity for social learning. Increasingly, the researchers suggest, human adaptations are steered by culture, and require genes to accommodate.

Waring and Wood say culture is also special in one important way: it is strongly group-oriented. Factors like conformity, social identity and shared norms and institutions — factors that have no genetic equivalent — make cultural evolution very group-oriented, according to researchers. Therefore, competition between culturally organized groups propels adaptations such as new cooperative norms and social systems that help groups survive better together.

According to researchers, “culturally organized groups appear to solve adaptive problems more readily than individuals, through the compounding value of social learning and cultural transmission in groups.” Cultural adaptations may also occur faster in larger groups than in small ones.

With groups primarily driving culture and culture now fueling human evolution more than genetics, Waring and Wood found that evolution itself has become more group-oriented.

“In the very long term, we suggest that humans are evolving from individual genetic organisms to cultural groups which function as superorganisms, similar to ant colonies and beehives,” Waring says. “The ‘society as organism’ metaphor is not so metaphorical after all. This insight can help society better understand how individuals can fit into a well-organized and mutually beneficial system. Take the coronavirus pandemic, for example. An effective national epidemic response program is truly a national immune system, and we can therefore learn directly from how immune systems work to improve our COVID response.”

###

Waring is a member of the Cultural Evolution Society, an international research network that studies the evolution of culture in all species. He applies cultural evolution to the study of sustainability in social-ecological systems and cooperation in organizational evolution.

Wood works in the UMaine Evolutionary Applications Laboratory managed by Michael Kinnison, a professor of evolutionary applications. His research focuses on eco-evolutionary dynamics, particularly rapid evolution during trophic cascades.

Neanderthals carb loaded, helping grow their big brains (Science)

sciencemag.org

By Ann Gibbons, May 10, 2021, 3:00 PM


A reconstruction of Neanderthal mealtime. Mauricio Anton/Science Source

Here’s another blow to the popular image of Neanderthals as brutish meat eaters: A new study of bacteria collected from Neanderthal teeth shows that our close cousins ate so many roots, nuts, or other starchy foods that they dramatically altered the type of bacteria in their mouths. The finding suggests our ancestors had adapted to eating lots of starch by at least 600,000 years ago—about the same time as they needed more sugars to fuel a big expansion of their brains.

The study is “groundbreaking,” says Harvard University evolutionary biologist Rachel Carmody, who was not part of the research. The work suggests the ancestors of both humans and Neanderthals were cooking lots of starchy foods at least 600,000 years ago. And they had already adapted to eating more starchy plants long before the invention of agriculture 10,000 years ago, she says.

The brains of our ancestors doubled in size between 2 million and 700,000 years ago. Researchers have long credited better stone tools and cooperative hunting: As early humans got better at killing animals and processing meat, they ate a higher quality diet, which gave them more energy more rapidly to fuel the growth of their hungrier brains.

Still, researchers have puzzled over how meat did the job. “For human ancestors to efficiently grow a bigger brain, they needed energy dense foods containing glucose”—a type of sugar—says molecular archaeologist Christina Warinner of Harvard and the Max Planck Institute for the Science of Human History. “Meat is not a good source of glucose.”

Researchers analyzed the bacterial DNA preserved in dental plaque of fossilized teeth, such as this one from a prehistoric human. Werner Siemens Foundation/Felix Wey

The starchy plants gathered by many living hunter-gatherers are an excellent source of glucose, however. To figure out whether oral bacteria track changes in diet or the environment, Warinner, Max Planck graduate student James Fellows Yates, and a large international team looked at the oral bacteria stuck to the teeth of Neanderthals, preagricultural modern humans who lived more than 10,000 years ago, chimps, gorillas, and howler monkeys. The researchers analyzed billions of DNA fragments from long-dead bacteria still preserved on the teeth of 124 individuals. One was a Neanderthal who lived 100,000 years ago at Pešturina Cave in Serbia, which produced the oldest oral microbiome genome reconstructed to date.

The communities of bacteria in the mouths of preagricultural humans and Neanderthals strongly resembled each other, the team reports today in the Proceedings of the National Academy of Sciences. In particular, humans and Neanderthals harbored an unusual group of Streptococcus bacteria in their mouths. These microbes had a special ability to bind to an abundant enzyme in human saliva called amylase, which frees sugars from starchy foods. The presence of the strep bacteria that consume sugar on the teeth of Neanderthals and ancient modern humans, but not chimps, shows they were eating more starchy foods, the researchers conclude.

Finding the streptococci on the teeth of both ancient humans and Neanderthals also suggests they inherited these microbes from their common ancestor, who lived more than 600,000 years ago. Although earlier studies found evidence that Neanderthals ate grasses and tubers and cooked barley, the new study indicates they ate so much starch that it dramatically altered the composition of their oral microbiomes.

“This pushes the importance of starch in the diet further back in time,” to when human brains were still expanding, Warinner says. Because the amylase enzyme is much more efficient at digesting cooked rather than raw starch, the finding also suggests cooking, too, was common by 600,000 years ago, Carmody says. Researchers have debated whether cooking became common when the big brain began to expand almost 2 million years ago or it spread later, during a second surge of growth.

The study offers a new way to detect major shifts in diet, says geneticist Ran Blekhman of the University of Minnesota, Twin Cities. In the case of Neanderthals, it reveals how much they depended on plants.

“We sometimes have given short shrift to the plant components of the diet,” says anthropological geneticist Anne Stone of Arizona State University, Tempe. “As we know from modern hunter-gatherers, it’s often the gathering that ends up providing a substantial portion of the calories.”

Humans were apex predators for two million years (EurekAlert!)

News Release 5-Apr-2021

What did our ancestors eat during the stone age? Mostly meat

Tel-Aviv University

IMAGE: Human Brain. Credit: Dr. Miki Ben Dor

Researchers at Tel Aviv University were able to reconstruct the nutrition of stone age humans. In a paper published in the Yearbook of the American Physical Anthropology Association, Dr. Miki Ben-Dor and Prof. Ran Barkai of the Jacob M. Alkov Department of Archaeology at Tel Aviv University, together with Raphael Sirtoli of Portugal, show that humans were an apex predator for about two million years. Only the extinction of larger animals (megafauna) in various parts of the world, and the decline of animal food sources toward the end of the stone age, led humans to gradually increase the vegetable element in their nutrition, until finally they had no choice but to domesticate both plants and animals – and became farmers.

“So far, attempts to reconstruct the diet of stone-age humans were mostly based on comparisons to 20th century hunter-gatherer societies,” explains Dr. Ben-Dor. “This comparison is futile, however, because two million years ago hunter-gatherer societies could hunt and consume elephants and other large animals – while today’s hunter gatherers do not have access to such bounty. The entire ecosystem has changed, and conditions cannot be compared. We decided to use other methods to reconstruct the diet of stone-age humans: to examine the memory preserved in our own bodies, our metabolism, genetics and physical build. Human behavior changes rapidly, but evolution is slow. The body remembers.”

In a process unprecedented in its extent, Dr. Ben-Dor and his colleagues collected about 25 lines of evidence from about 400 scientific papers from different scientific disciplines, dealing with the focal question: Were stone-age humans specialized carnivores or were they generalist omnivores? Most evidence was found in research on current biology, namely genetics, metabolism, physiology and morphology.

“One prominent example is the acidity of the human stomach,” says Dr. Ben-Dor. “The acidity in our stomach is high when compared to omnivores and even to other predators. Producing and maintaining strong acidity require large amounts of energy, and its existence is evidence for consuming animal products. Strong acidity provides protection from harmful bacteria found in meat, and prehistoric humans, hunting large animals whose meat sufficed for days or even weeks, often consumed old meat containing large quantities of bacteria, and thus needed to maintain a high level of acidity. Another indication of being predators is the structure of the fat cells in our bodies. In the bodies of omnivores, fat is stored in a relatively small number of large fat cells, while in predators, including humans, it’s the other way around: we have a much larger number of smaller fat cells. Significant evidence for the evolution of humans as predators has also been found in our genome. For example, geneticists have concluded that ‘areas of the human genome were closed off to enable a fat-rich diet, while in chimpanzees, areas of the genome were opened to enable a sugar-rich diet.’”

Evidence from human biology was supplemented by archaeological evidence. For instance, research on stable isotopes in the bones of prehistoric humans, as well as hunting practices unique to humans, show that humans specialized in hunting large and medium-sized animals with high fat content. Comparing humans to large social predators of today, all of whom hunt large animals and obtain more than 70% of their energy from animal sources, reinforced the conclusion that humans specialized in hunting large animals and were in fact hypercarnivores.

“Hunting large animals is not an afternoon hobby,” says Dr. Ben-Dor. “It requires a great deal of knowledge, and lions and hyenas attain these abilities after long years of learning. Clearly, the remains of large animals found in countless archaeological sites are the result of humans’ high expertise as hunters of large animals. Many researchers who study the extinction of the large animals agree that hunting by humans played a major role in this extinction – and there is no better proof of humans’ specialization in hunting large animals. Most probably, like in current-day predators, hunting itself was a focal human activity throughout most of human evolution. Other archaeological evidence – like the fact that specialized tools for obtaining and processing vegetable foods only appeared in the later stages of human evolution – also supports the centrality of large animals in the human diet, throughout most of human history.”

The multidisciplinary reconstruction conducted by TAU researchers for almost a decade proposes a complete change of paradigm in the understanding of human evolution. Contrary to the widespread hypothesis that humans owe their evolution and survival to their dietary flexibility, which allowed them to combine the hunting of animals with vegetable foods, the picture emerging here is of humans evolving mostly as predators of large animals.

“Archaeological evidence does not overlook the fact that stone-age humans also consumed plants,” adds Dr. Ben-Dor. “But according to the findings of this study plants only became a major component of the human diet toward the end of the era.”

Evidence of genetic changes and the appearance of unique stone tools for processing plants led the researchers to conclude that, starting about 85,000 years ago in Africa, and about 40,000 years ago in Europe and Asia, a gradual rise occurred in the consumption of plant foods as well as dietary diversity – in accordance with varying ecological conditions. This rise was accompanied by an increase in the local uniqueness of the stone tool culture, which is similar to the diversity of material cultures in 20th-century hunter-gatherer societies. In contrast, during the two million years when, according to the researchers, humans were apex predators, long periods of similarity and continuity were observed in stone tools, regardless of local ecological conditions.

“Our study addresses a very great current controversy – both scientific and non-scientific,” says Prof. Barkai. “For many people today, the Paleolithic diet is a critical issue, not only with regard to the past, but also concerning the present and future. It is hard to convince a devout vegetarian that his/her ancestors were not vegetarians, and people tend to confuse personal beliefs with scientific reality. Our study is both multidisciplinary and interdisciplinary. We propose a picture that is unprecedented in its inclusiveness and breadth, which clearly shows that humans were initially apex predators, who specialized in hunting large animals. As Darwin discovered, the adaptation of species to obtaining and digesting their food is the main source of evolutionary changes, and thus the claim that humans were apex predators throughout most of their development may provide a broad basis for fundamental insights on the biological and cultural evolution of humans.”

Can Evolution Explain All Dark Animal Behaviors? (Discover)

discovermagazine.com

Many actions that would be considered heinous among humans — cannibalism, eating offspring, torture and rape — have been observed in the animal kingdom. Most (but not all) eyebrow-raising behaviors among animals have an evolutionary underpinning.

By Tim Brinkhof, March 9, 2021 3:00 PM

(Credit: Sharon Morris/Shutterstock)

“In sober truth,” wrote the British philosopher John Stuart Mill, “nearly all the things which men are hanged or imprisoned for doing to one another, are nature’s everyday performances.” While it is true that rape, torture and murder are more commonplace in the animal kingdom than they are in human civilization, our fellow creatures almost always seem to have some kind of evolutionary justification for their actions — one that we Homo sapiens lack.

Cats, for instance, are known to toy with small birds and rodents before finally killing them. Although it is easy to conclude that this makes the popular pet a born sadist, some zoologists have proposed that exhausting prey is the safest way of catching them. Similarly, it’s tempting to describe the way African lions and bottlenose dolphins –– large, social mammals –– commit infanticide (the killing of young offspring) as possibly psychopathic. Interestingly, experts suspect that these creatures are in fact doing themselves a favor; by killing offspring, adult males are making their female partners available to mate again.

These behaviors, which initially may seem symptomatic of some sinister psychological defect, turn out to be nothing more than different examples of the kind of selfishness that evolution is full of. Well played, Mother Nature.

But what if harming others is of no benefit to the assailant? In the human world, senseless destruction features on virtually every evening news program. In the animal world, where the laws of nature –– so we’ve been taught –– don’t allow for moral crises, it’s a different story. By all accounts, such undermining behavior shouldn’t be able to occur. Yet it does, and it’s as puzzling to biologists as the existence of somebody like Ted Bundy or Adolf Hitler has been to theodicists –– those who follow a philosophy of religion that ponders why God permits evil.

Cains and Abels

According to Charles Darwin’s theory of evolution, genes that increase an organism’s ability to survive are passed down, while those that don’t are not. Although Darwin remains an important reference point for how humans interpret the natural world, he is not infallible. During the 1960s, biologist W.D. Hamilton proposed that On the Origin of Species failed to account for the persistence of traits that didn’t directly benefit the animal in question.

The first of these two patterns –– altruism –– was amalgamated into Darwin’s theory of evolution when researchers uncovered its evolutionary benefits. One would think that creatures are hardwired to avoid self-sacrifice, but this is not the case. The common vampire bat shares its food with roostmates whose hunt ended in failure. Recently, Antarctic plunder fish have been found to guard the nests of others if they are left unprotected. In both of these cases, altruistic behavior is put on display when the indirect benefit to relatives of the animal in question outweighs the direct cost incurred by that animal.

In Search of Spite

The second animal behavior –– spite –– continues to be difficult to make sense of. For humans, its concept is a familiar yet elusive one, perhaps understood best through the Biblical story of Cain and Abel or the writings of Fyodor Dostoevsky. Although a number of prominent evolutionary biologists –– from Frans de Waal to members of the West Group at the University of Oxford’s Department of Zoology –– have made entire careers out of studying the overlap between animal and human behavior, even they warn against the stubborn tendency to anthropomorphize nonhuman subjects.

As Edward O. Wilson put it in his study, “The Insect Societies,” spite refers to any “behavior that gains nothing or may even diminish the fitness of the individual performing the act, but is definitely harmful to the fitness of another.” Wilson’s definition, which is generally accepted by biologists, allows researchers to study its occurrence in an objective, non-anthropomorphized manner. It initially drew academic attention to species of fish and birds that destroyed the eggs (hatched or unhatched) of rival nests, all at no apparent benefit to them.

Emphasis on “apparent,” though, because –– as those lions and dolphins demonstrated earlier –– certain actions and consequences aren’t always what we think they are. In their research, biologists Andy Gardner and Stuart West maintain that many of the animal behaviors which were once thought spiteful are now understood as selfish. Not in the direct sense of the word (approaching another nest often leads to brutal clashes with its guardian), but an indirect one: With fewer generational competitors, the murderer’s own offspring are more likely to thrive.

For a specific action to be considered true spite, a few more conditions have to be met. The cost incurred by the party acting out the behavior must be “smaller than the product of the negative benefit to the recipient and negative relatedness of the recipient to the actor,” Gardner and West wrote in Current Biology. In other words, a creature can be considered spiteful only if the harm it inflicts on others, weighted by how unrelated those others are, outweighs the cost it pays to inflict that harm. So far, true spite has only been observed rarely in the animal kingdom, and mostly occurs among smaller creatures.
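Gardner and West’s condition is a version of Hamilton’s rule, usually written in the literature as rb − c > 0, where c is the cost to the actor, b the benefit to the recipient, and r their relatedness. A minimal sketch, with hypothetical numbers:

```python
def hamilton_favors(r, b, c):
    """Hamilton's rule: a social trait is favored when r*b - c > 0."""
    return r * b - c > 0

# Altruism: pay a cost to benefit a relative (r > 0, b > 0).
print(hamilton_favors(r=0.5, b=3.0, c=1.0))    # True: 0.5*3.0 - 1.0 > 0

# Spite: pay a cost to harm a recipient (b < 0); this can still be favored
# when relatedness is negative (r < 0), i.e. the recipient is *less* related
# to the actor than the average competitor is.
print(hamilton_favors(r=-0.5, b=-3.0, c=1.0))  # True: (-0.5)*(-3.0) - 1.0 > 0
```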

The larvae of polyembryonic parasitoid wasps, which hatch from eggs that are laid on top of caterpillar eggs, occasionally develop into adults that are not just infertile but have a habit of eating other larvae. From an evolutionary perspective, developing into this infertile form is not a smart move for the wasp because it cannot pass on its genes to the next generation. Nor does it help the creature’s relatives survive, as they are then at risk of being eaten.

That doesn’t mean spite is relegated to the world of insects. It also pops up among monkeys, where it tends to manifest in more recognizable forms. In a 2016 study, Harvard University psychology researchers Kristin Leimgruber and Alexandra Rosati separated chimpanzees and capuchins from the rest of the group during feeding time and gave them the option to take away everyone’s food. While the chimps only ever denied food to those who violated their group’s social norms, the capuchins often acted simply out of spite. As Leimgruber explains: “Our study provides the first evidence of a non-human primate choosing to punish others simply because they have more. This sort of ‘if I can’t have it, no one can’ response is consistent with psychological spite, a behavior previously believed unique to humans.”

Beyond the Dark Tetrad

Of course, spite isn’t the only type of complex and curiously human behavior for which the principles of evolution have not produced an easily discoverable (or digestible) answer. Just as confounding are the four components of the Dark Tetrad — a model for categorizing malevolent behaviors, assembled by personality psychologist Delroy Paulhus. The framework’s traits include narcissism, Machiavellianism, psychopathy and everyday sadism.

Traces of all four have been found inside the animal kingdom. The intertribal warfare among chimpanzees is, first and foremost, a means of controlling resources. At the same time, many appear to actively enjoy partaking in hyperviolent patrols. Elsewhere, primate researchers who have made advances in the assessment of great ape psychology suggest the existence of psychopathic personality types. As for Machiavellianism, the willingness to hurt relatives in order to protect oneself has been observed in both rhesus macaques and Nile tilapia.

Although the reasons for certain types of animal behavior are still debated, the nature of these discussions tends to be markedly different from discourse around, say, the motivations of serial killers. And often, researchers have a solid understanding of the motivations and feelings of their own study subjects but not of those outside their purview. Regardless of whether the academic community is talking about humans or animals, however, the underlying conviction guiding the conversation — that every action, no matter how upsetting or inexplicable, must have a logical explanation — is one and the same.

Israeli Archaeologists Present Groundbreaking Universal Theory of Human Evolution (Haaretz)

Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai proffer novel hypothesis, showing how the greed of Homo erectus set us careening down an anomalous evolutionary path

Ruth Schuster, Feb. 25, 2021

Why the human brain evolved as it did has never been plausibly explained. Apparently, not since the first life-form billions of years ago did a single species gain dominance over all others – until we came along. Now, in a groundbreaking paper, two Israeli researchers propose that our anomalous evolution was propelled by the very mass extinctions we helped cause. Or: as we sawed off the culinary branches from which we swung, we had to get ever more inventive in order to survive.

As ambling, slow-to-reproduce large animals diminished and gradually went extinct, we were forced to resort to smaller, nimbler animals that flee as a strategy to escape predation. To catch them, we had to get smarter, nimbler and faster, according to the universal theory of human evolution proposed by researchers Miki Ben-Dor and Prof. Ran Barkai of Tel Aviv University, in a paper published in the journal Quaternary.

In fact, the great African megafauna began to decline about 4.6 million years ago. But our story begins with Homo habilis, which lived about 2.6 million years ago and apparently used crude stone tools to help it eat flesh, and with Homo erectus, which thronged Africa and expanded to Eurasia about 2 million years ago. The thing is, erectus wasn’t an omnivore: it was a carnivore, Ben-Dor explains to Haaretz.

“Eighty percent of mammals are omnivores but still specialize in a narrow food range. If anything, it seems Homo erectus was a hyper-carnivore,” he observes.

And in the last couple of million years, our brains grew threefold, to a maximum cranial capacity of about 1,500 cubic centimeters (cc), a size achieved about 300,000 years ago. We also gradually but consistently ramped up in technology and culture – until the Neolithic revolution and the advent of the sedentary lifestyle, when our brains shrank to about 1,400-1,300cc, but more on that anomaly later.

The hypothesis suggested by Ben-Dor and Barkai – that we ate our way to our present physical, cultural and ecological state – is an original unifying explanation for the behavioral, physiological and cultural evolution of the human species.

Out of chaos

Evolution is chaotic. Charles Darwin came up with the theory of the survival of the fittest, and nobody has a better suggestion yet, but mutations aren’t “planned.” Bodies aren’t “designed,” if we leave genetic engineering out of it. The point is, evolution isn’t linear but chaotic, and that should theoretically apply to humans too.

Hence, it is strange that certain changes in the course of millions of years of human history, including the expansion of our brain, tool manufacture techniques and use of fire, for example, were uncharacteristically progressive, say Ben-Dor and Barkai.

“Uncharacteristically progressive” means that certain traits such as brain size, or cultural developments such as fire usage, evolved in one direction over a long time, in the direction of escalation. That isn’t what chaos is expected to produce over vast spans of time, Barkai explains to Haaretz: it is bizarre. Very few parameters behave like that.

So their discovery of a correlation between the contraction of the average weight of African animals, the extinction of megafauna and the growth of the human brain is intriguing.

From mammoth marrow to joint of rat

To be clear, just this month a new paper posited that the late Quaternary extinction of megafauna, in the last few tens of thousands of years, wasn’t entirely the fault of humanity. In North America specifically, it was due primarily to climate change, with the late-arriving humans apparently providing the coup de grâce to some species.

In the Old World, however, a human role is clearer. African megafauna apparently began to decline 4.6 million years ago, but during the Pleistocene (2.6 million to 11,600 years ago) the size of African animals trended sharply down, in what the authors term an abrupt reversal from a continuous growth trend of 65 million years (i.e., since the dinosaurs almost died out).

When Homo erectus the carnivore began to roam Africa around 2 million years ago, land mammals averaged nearly 500 kilograms. Barkai’s team and others have demonstrated that hominins ate elephants and other large animals when they could. In fact, Africa originally had six elephant species (today there are two: the bush elephant and the forest elephant). By the end of the Pleistocene, by which time all hominins other than modern humans were also extinct, the average weight of African animals had shrunk by more than 90 percent.

And during the Pleistocene, as the African animals shrank, the Homo genus grew taller and more gracile, and our stone tool technology improved (which in no way diminished our affection for archaic implements like the hand ax or chopper, both of which remained in use for more than a million years, even as more sophisticated technologies were developed).

If we started some 3.3 million years ago with large, crude stone hammers that may have been used to bang big animals on the head or break bones to get at the marrow, over the epochs we invented the spear for killing at a distance. By about 80,000 years ago, the bow and arrow was making its appearance, better suited to bringing down small fry such as birds and small deer. Over a million years ago, we began to use fire, and later achieved better control of it, meaning the ability to ignite it at will. Later we domesticated the dog from the wolf, and it would help us hunt smaller, fleet animals.

Why did the earliest humans hunt large animals anyway? Wouldn’t a peeved elephant be more dangerous than a rat? Arguably, but catching one elephant is easier than catching a large number of rats. And megafauna had more fat.

A modern human can derive only up to about 50 percent of calories from lean meat (protein): past a certain point, our livers can’t process more protein. We need energy from carbs or fat, and before the development of agriculture about 10,000 years ago, a key source of those calories had to be animal fat.

Big animals have a lot of fat. Small animals don’t. In Africa and Europe, and in Israel too, the researchers found a significant decline in the prevalence of animals weighing over 200 kilograms correlated with an increase in the volume of the human brain. Thus, Ben-Dor and Barkai deduce that the declining availability of large prey seems to have been a key element in the natural selection from Homo erectus onward. Catching one elephant is more efficient than catching 1,000 rabbits, but if we must catch 1,000 rabbits, improved cunning, planning and tools are in order.
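A back-of-envelope sketch makes the fat arithmetic concrete. All the numbers below are invented for illustration and do not come from Ben-Dor and Barkai’s paper:

```python
# Hypothetical numbers illustrating the "fat ceiling" argument above;
# none of these figures are from the paper itself.

DAILY_KCAL = 2500                      # assumed daily energy need
PROTEIN_CEILING = 0.5                  # at most ~50% of calories from protein
fat_kcal_needed = DAILY_KCAL * (1 - PROTEIN_CEILING)

RABBIT_FAT_KCAL = 300                  # hypothetical fat calories per rabbit
ELEPHANT_FAT_KCAL = 500_000            # hypothetical fat calories per elephant

# Lean prey: several animals per person per day, just to cover the fat gap.
print(f"rabbits per day for fat: {fat_kcal_needed / RABBIT_FAT_KCAL:.1f}")
# Megafauna: one kill covers hundreds of person-days of fat.
print(f"person-days of fat per elephant: {ELEPHANT_FAT_KCAL / fat_kcal_needed:.0f}")
```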

Say it with fat

Our changing hunting habits would have had cultural impacts too, Ben-Dor and Barkai posit. “Cultural evolution in archaeology usually refers to objects, such as stone tools,” Ben-Dor tells Haaretz. But cultural evolution also refers to learned behavior, such as our choice of which animals to hunt, and how.

Thus, they posit, our hunting conundrum may also have been a key element in the emergence of that enigmatic human characteristic: complex language. When language began, and with which ancestor of Homo sapiens, if any before us, is hotly debated.

Ben-Dor, an economist by training prior to obtaining a Ph.D. in archaeology, believes it began early. “We just need to follow the money. When speaking of evolution, one must follow the energy. Language is energetically costly. Speaking requires devotion of part of the brain, which is costly. Our brain consumes huge amounts of energy. It’s an investment, and language has to produce enough benefit to make it worthwhile. What did language bring us? It had to be more energetically efficient hunting.”

Domestication of the dog also requires resources and, therefore, also had to bring sufficient compensation in the form of more efficient hunting of smaller animals, he points out. That may help explain the fact that Neolithic humans not only embraced the dog but ate it too, going by archaeological evidence of butchered dogs.

At the end of the day, wherever we went, humans devastated the local ecologies, given enough time.

There is a lot of thinking about the Neolithic agricultural revolution. Some think grain farming was driven by the desire to make beer. Given residue analysis indicating that beer has been around for over 10,000 years, that theory isn’t as far-fetched as one might think. Ben-Dor and Barkai suggest that once we could grow our own food and husband herbivores, with the megafauna almost entirely gone, hunting them became too energy-costly. So we had to use our large brains to develop agriculture.

And as the hunter-gathering lifestyle gave way to permanent settlement, our brain size decreased.

Note, Ben-Dor adds, that the brains of wolves, which have to hunt to survive, are larger than those of the domesticated wolf, i.e., the dog. We did promise more on that. That was it. Also: the chimpanzee brain has remained stable for 7 million years, since the split with the Homo line, Barkai points out.

“Why does any of this matter?” Ben-Dor asks. “People think humans reached this condition because it was ‘meant to be.’ But in the Earth’s 4.5 billion years, there have been billions of species. They rose and fell. What’s the probability that we would take over the world? It’s an accident of nature. It never happened before that one species achieved dominance over all, and now it’s all over. How did that happen? This is the answer: A non-carnivore entered the niche of carnivore, and ate out its niche. We can’t eat that much protein: we need fat too. Because we needed the fat, we began with the big animals. We hunted the prime adult animals which have more fat than the kiddies and the old. We wiped out the prime adults who were crucial to survival of species. Because of our need for fat, we wiped out the animals we depended on. And this required us to keep getting smarter and smarter, and thus we took over the world.”

Study suggests environmental factors had a role in the evolution of human tolerance (Eureka Alert)

News Release 3-Feb-2021

Study suggests environmental factors had a role in the evolution of human tolerance and friendliness

University of York

Environmental pressures may have led humans to become more tolerant and friendly towards each other as the need to share food and raw materials became mutually beneficial, a new study suggests.

This behaviour was not an inevitable natural progression, but subject to ecological pressures, the University of York study concludes.

Humans have a remarkable capacity to care about people well outside their own kin or local group. Whilst most other animals tend to be defensive towards those in other groups, our natural tolerance allows us to collaborate today on a global scale, as seen with trade or international relief efforts to provide aid for natural disasters.

Using computer simulations of many thousands of individuals gathering resources for their group and interacting with individuals from other groups, the research team attempted to establish what key evolutionary pressures may have prompted human intergroup tolerance.
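The release does not describe the model’s internals, but the logic can be sketched in a few lines of toy code. Everything below — group counts, the harvest distribution, the all-or-nothing pooling rule — is an invented illustration, not the York team’s simulation:

```python
import random

# Toy model: N_GROUPS groups each gather a noisy, perishable harvest
# (stores are capped), eat a fixed amount per round, and go extinct if
# their store runs dry. "Sharers" pool surviving groups' stores evenly
# each round; "hoarders" keep to themselves.

NEED, CAP, ROUNDS, N_GROUPS = 10.0, 30.0, 200, 20

def survivors(sharing: bool, harshness: float, seed: int = 42) -> int:
    rng = random.Random(seed)
    store = [CAP] * N_GROUPS
    alive = [True] * N_GROUPS
    for _ in range(ROUNDS):
        for i in range(N_GROUPS):
            if alive[i]:
                # Mean harvest slightly exceeds need; variance grows
                # with environmental harshness.
                store[i] = min(CAP, store[i] + rng.gauss(1.2 * NEED, NEED * harshness))
        if sharing:
            live = [i for i in range(N_GROUPS) if alive[i]]
            if live:
                pooled = sum(store[i] for i in live) / len(live)
                for i in live:
                    store[i] = pooled
        for i in range(N_GROUPS):
            if alive[i]:
                store[i] -= NEED
                alive[i] = store[i] >= 0   # a dry store means extinction
    return sum(alive)

for harshness in (0.2, 1.0):   # benign vs. highly variable environment
    print(f"harshness {harshness}: sharing {survivors(True, harshness)}/20,"
          f" hoarding {survivors(False, harshness)}/20")
```

In a benign environment both strategies do fine; in a highly variable one, pooling smooths out local busts and the sharing groups persist — the qualitative pattern the study reports.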

The study suggests this tolerance may have emerged when humans began to leave Africa, during a period of increasingly harsh and variable environments.

The study was concerned with the period 300,000 to 30,000 years ago, when archaeological evidence indicates greater mobility and more frequent interactions between different groups. In particular, this is a time in which raw materials moved over much longer distances and between groups.

The researchers found that populations which shared resources were more successful and more likely to survive harsh environments, in which extinctions occur, than populations which did not share across borders.

However, in resource-rich environments sharing was less advantageous, and in extremely harsh environments populations were too small for sharing to be feasible.

Penny Spikins, Professor in the Archaeology of Human Origins at the University of York, said: “That our study demonstrates the importance of tolerance to human success is perhaps surprising, especially when we often think of prehistory as a time of competition, however we have seen that in situations where people with surplus share across borders with those in need everyone benefits in the long term.”

Dr Jennifer C. French, lecturer in Palaeolithic Archaeology at the University of Liverpool added: “Our study’s findings also have important implications for wider debates about the increases in examples of innovation and greater rates of cultural evolution that occurred during this period.

“They help to explain previously enigmatic changes in the archaeological record between 300,000 and 30,000 years ago.”


The study is published in the Journal of Archaeological Method and Theory.

Are Humans Still Evolving? Scientists Weigh In (Science Alert)

sciencealert.com

Eva Hamrud, Metafact – 20 Sept. 2020


As a species, humans have populated almost every corner of the earth. We have developed technologies and cultures which shape the world we live in.

The idea of ‘natural selection’ or ‘survival of the fittest’ seems to make sense in Stone Age times when we were fighting over scraps of meat, but does it still apply now?

We asked 12 experts whether humans are still evolving. The expert consensus is unanimously ‘yes’; however, scientists say we might have the wrong idea of what evolution actually is.

Evolution is not the same as natural selection

Evolution is often used interchangeably with the phrases ‘survival of the fittest’ or ‘natural selection’. Actually, these are not quite the same thing.

‘Evolution’ simply means the gradual change of a population over time.

‘Natural selection’ is a mechanism by which evolution can occur. Our Stone Age ancestors who were faster runners avoided being trampled by mammoths and were more likely to have children. That is ‘natural selection’.

Over time, the human population became faster at running. That’s evolution.

Evolution can happen without natural selection

That makes sense for Stone Age humans, but what about nowadays? We don’t need to outrun mammoths, we have medicines for when we’re sick and we can go to the shops to get food.

Natural selection needs a ‘selection pressure’ (e.g. dangerous trampling mammoths), so if we don’t have these anymore, does this mean we stop evolving?

Even with no selection pressures, experts say evolution still occurs by other mechanisms.

Professor Stanley Ambrose, an anthropologist from the University of Illinois, explains that “any change in the proportions of genes or gene variants over time is also considered evolution. The variants may be functionally equivalent, so evolution does not automatically equate with ‘improvement’”.

Whilst some genes can be affected by natural selection (e.g. genes that help us run faster), other changes in our DNA might have no obvious effect on us. ‘Neutral’ variations can also spread through a population by a different mechanism called ‘genetic drift’.

Genetic drift works by chance: some individuals might be unlucky and die for reasons which have nothing to do with their genes. Their unique gene variations will not be passed on to the next generation, and so the population will change.

Genetic drift doesn’t need any selection pressures, and it is still happening today.
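A standard way to see drift in action is the textbook Wright-Fisher model, sketched here in a few lines (this is generic population-genetics code, not anything from the article or the experts quoted):

```python
import random

# Wright-Fisher drift: two functionally identical gene variants, A and
# B, start at 50/50. Each generation's gene copies are drawn at random
# from the previous generation's frequencies -- no selection pressure
# anywhere -- yet chance alone eventually fixes one variant.

def drift_to_fixation(pop_size: int = 100, freq_a: float = 0.5, seed: int = 0):
    rng = random.Random(seed)
    generations = 0
    while 0.0 < freq_a < 1.0:
        # Each of the pop_size gene copies in the next generation is a
        # random draw weighted by the current frequency of variant A.
        count_a = sum(rng.random() < freq_a for _ in range(pop_size))
        freq_a = count_a / pop_size
        generations += 1
    return generations, freq_a

gens, final_freq = drift_to_fixation()
winner = "A" if final_freq == 1.0 else "B"
print(f"variant {winner} drifted to fixation after {gens} generations")
```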

Natural selection is still happening in humans

As much as we have made things easier for ourselves, there are still selection pressures around us, which mean that natural selection is still happening.

Like all mammals, humans lose the ability to digest milk when they stop breastfeeding. This is because we stop making an enzyme called lactase. In some countries, the population has acquired ‘lactase persistence’, meaning that people make lactase throughout their lives.

In European countries we can thank one specific gene variant, called ‘-13910*T’, for our lactase persistence. By studying this variant in modern and ancient DNA samples, researchers suggest that it became common after humans started domesticating and milking animals.

This is an example of natural selection where we have actually made the selection pressure ourselves – we started drinking milk, so we evolved to digest it!

Another example of humans undergoing natural selection to adapt to a lifestyle is the Bajau people, who traditionally live in houseboats in the waters of South East Asia and spend much of their lives diving to hunt fish or collect shellfish.

Ultrasound imaging has found that Bajau people have larger spleens than their neighbours – an adaptation which allows them to stay underwater for longer.

There are always selective pressures around us, even ones that we create ourselves.

As Dr Benjamin Hunt from the University of Birmingham puts it, “Our technological and cultural changes alter the strength and composition of the selection pressures within our environment, but selection pressures still exist.”

Evolution can’t be stopped

So, evolution can happen by different mechanisms like natural selection and genetic drift. As our environment is always changing, natural selection is always happening. And even if our environment was ‘just right’ for us, we would evolve anyway!

Dr Aylwyn Scally, an expert in evolution and genetics from the University of Cambridge, explains: “As long as human reproduction involves randomness and genetic mutation (and the laws of the Universe pretty much guarantee that this will always be the case at some level), there will continue to be differences from one generation to the next, meaning that the process of evolution can never be truly halted.”

Takeaway: Evolution means change in a population. That includes both easy-to-spot changes to adapt to an environment as well as more subtle, genetic changes.

Humans are still evolving, and that is unlikely to change in the future.

Article based on 12 expert answers to this question: Are humans still evolving?

This expert response was published in partnership with independent fact-checking platform Metafact.io.

Love the Fig (The New Yorker)

newyorker.com

Ben Crair, August 10, 2020

The produce section of the grocery store is a botanical disaster. Most people know that a tomato is technically a fruit, but so is an eggplant, a cucumber, and a spaghetti squash. A banana, which grows from a flower with a single ovary, is actually a berry, while a strawberry, which grows from a flower with several ovaries, isn’t a berry at all but an aggregate fruit. The most confusing classification, though, will start showing up on American shelves this month. Shoppers will find mission figs with the grapes, kiwis, and other fruit, but a clever botanist would sell them at the florist, with the fresh-cut roses. Although many people dismiss figs as a geriatric delicacy or the sticky stuff inside bad cookies, they are, in fact, something awesome: enclosed flowers that bloom modestly inward, unlike the flamboyant showoffs on other plants. Bite a fig in half and you’ll discover a core of tiny blossoms.

All kinds of critters, not only humans, frequent fig trees, but the plants owe their existence to what may be evolution’s most intimate partnership between two species. Because a fig is actually a ball of flowers, it requires pollination to reproduce, but, because the flowers are sealed, not just any bug can crawl inside.* That task belongs to a minuscule insect known as the fig wasp, whose life cycle is intertwined with the fig’s. Mother wasps lay their eggs in an unripe fig. After their offspring hatch and mature, the males mate and then chew a tunnel to the surface, dying when their task is complete. The females follow and take flight, riding the winds until they smell another fig tree. (One species of wasp, in Africa, travels ten times farther than any other known pollinator.) When the insects discover the right specimen, they go inside and deposit the pollen from their birthplace. Then the females lay new eggs, and the cycle begins again. For the wasp mother, however, devotion to the fig plant soon turns tragic. A fig’s entranceway is booby-trapped to destroy her wings, so that she can never visit another plant. When you eat a dried fig, you’re probably chewing fig-wasp mummies, too.

The fig and the fig wasp are a superlative example of what biologists call codependent evolution. The plants and insects have been growing old together for more than sixty million years. Almost every species of fig plant—more than seven hundred and fifty in total—has its own species of wasp, although some commercial fig production favors varieties that do not require pollination. (They are grown from cuttings and produce fruit without any seeds.) But codependence hasn’t made the fig and the fig wasp weak, like it can with humans. The figs and fig wasps’ pollination system is extremely efficient compared with that of other plants, some of which just trust the wind to blow their pollen where it needs to go. And the figs’ specialized flowers, far from isolating them in an evolutionary niche, have allowed them to radiate throughout the natural world. Fig plants can be shrubs, vines, or trees. Strangler figs sprout in the branches of another tree, drop their roots to the forest floor, and slowly envelop their host. The branches of a large strangler fig can stretch over acres and produce a million figs in one flowering. Figs themselves can be brown, red, white, orange, yellow, or green. (Wild figs are not as sweet as the plump and purple mission figs you buy at the farmers’ market.) And their seeds sprout where other plants’ would flounder: rooftops, cliff sides, volcanic islands. The fig genus, Ficus, is the most varied one in the tropics. It also routinely shows up in the greenhouse and the garden.

The variety and adaptability of fig plants make them a favorite foodstuff among animals. In 2001, a team of researchers published a review of the scientific literature and found records of fig consumption for nearly thirteen hundred bird and mammal species. One of the researchers, Mike Shanahan—a rain-forest ecologist and the author of a forthcoming book about figs, “Gods, Wasps, and Stranglers”—had spent time studying Malaysian fig trees as a Ph.D. candidate, in 1997. He would sometimes lie beneath a huge strangler fig and record its visitors, returning repeatedly for several days. “I would typically see twenty-five to thirty different species,” Shanahan told me. “The animals would include lots of different squirrel species and some curious creatures called tree shrews. There would be some monkeys and a whole range of different bird species, from tiny little flowerpeckers up to the hornbills, which are the biggest fruit-eating birds in Asia.” There were also pigeons, fruit doves, fairy bluebirds, barbets, and parrots. As the biologist Daniel Janzen put it in “How to Be a Fig,” an article from 1979, “Who eats figs? Everybody.”

With good reason, too. Figs are high in calcium, easy to chew and digest, and, unlike plants that fruit seasonally, can be found year-round. This is the fig plant’s accommodation of the fig wasp. A fig wasp departs a ripe fig to find an unripe fig, which means that there must always be figs at different stages. As a result, an animal can usually fall back on a fig when a mango or a lychee is not in season. Sometimes figs are the only things between an animal and starvation. According to a 2003 study of Uganda’s Budongo Forest, for instance, figs are the sole source of fruit for chimpanzees at certain times of year. Our pre-human ancestors probably filled up on figs, too. The plants are what is known as a keystone species: yank them from the jungle and the whole ecosystem would collapse.

Figs’ popularity means they can play a central role in bringing deforested land back to life. The plants grow quickly in inhospitable places and, thanks to the endurance of the fig wasps, can survive at low densities. And the animals they attract will, to put it politely, deposit nearby the seeds of other fruits they’ve eaten, thereby introducing a healthy variety of new plants. Nigel Tucker, a restoration ecologist in Australia, has recommended that ten per cent of new plants in tropical-reforestation projects be fig seedlings. Rhett Harrison, a former fig biologist, told me that the ratio could be even higher. “My inclination is that we should be going to some of these places and just planting a few figs,” he said.

Fig trees are also sometimes the only trees left standing from former forests. In parts of India, for instance, they are considered holy, and farmers are reluctant to chop them down. “Diverse cultures developed taboos against felling fig trees,” Shanahan told me. “They said they were homes to gods and spirits, and made them places of prayer and symbols of their society.” You can’t really taste the fig’s spiritual aura in a Fig Newton, but it shines in the mythologies of world religions. Buddha found enlightenment under a fig tree, and the Egyptian pharaohs built wooden sarcophagi from Ficus sycomorus. An apple tree might have cost Adam and Eve their innocence, but a fig tree, whose leaves they used to cover their nudity, gave them back some dignity. If only they had preferred figs in the first place, we might all still live in Eden.

*This article has been revised to clarify the fact that not all fig plants require pollination to produce edible fruit.

Viruses have big impacts on ecology and evolution as well as human health (The Economist)

economist.com

Aug 20th 2020


I
The outsiders inside

HUMANS ARE lucky to live a hundred years. Oak trees may live a thousand; mayflies, in their adult form, a single day. But they are all alive in the same way. They are made up of cells which embody flows of energy and stores of information. Their metabolisms make use of that energy, be it from sunlight or food, to build new molecules and break down old ones, using mechanisms described in the genes they inherited and may, or may not, pass on.

It is this endlessly repeated, never quite perfect reproduction which explains why oak trees, humans, and every other plant, fungus or single-celled organism you have ever seen or felt the presence of are all alive in the same way. It is the most fundamental of all family resemblances. Go far enough up any creature’s family tree and you will find an ancestor that sits in your family tree, too. Travel further and you will find what scientists call the last universal common ancestor, LUCA. It was not the first living thing. But it was the one which set the template for the life that exists today.

And then there are viruses. In viruses the link between metabolism and genes that binds together all life to which you are related, from bacteria to blue whales, is broken. Viral genes have no cells, no bodies, no metabolism of their own. The tiny particles, “virions”, in which those genes come packaged—the dot-studded disks of coronaviruses, the sinister, sinuous windings of Ebola, the bacteriophages with their science-fiction landing-legs that prey on microbes—are entirely inanimate. An individual animal, or plant, embodies and maintains the restless metabolism that made it. A virion is just an arrangement of matter.

The virus is not the virion. The virus is a process, not a thing. It is truly alive only in the cells of others, a virtual organism running on borrowed hardware to produce more copies of its genome. Some bide their time, letting the cell whose life they share live on. Others immediately set about producing enough virions to split their hosts from stem to stern.

The virus has no plan or desire. The simplest purposes of the simplest life—to maintain the difference between what is inside the cell and what is outside, to move towards one chemical or away from another—are entirely beyond it. It copies itself in whatever way it does simply because it has copied itself that way before, in other cells, in other hosts.

That is why, asked whether viruses are alive, Eckard Wimmer, a chemist and biologist who works at the State University of New York, Stony Brook, offers a yes-and-no. Viruses, he says, “alternate between nonliving and living phases”. He should know. In 2002 he became the first person in the world to take an array of nonliving chemicals and build a virion from scratch—a virion which was then able to get itself reproduced by infecting cells.

The fact that viruses have only a tenuous claim to being alive, though, hardly reduces their impact on things which are indubitably so. No other biological entities are as ubiquitous, and few as consequential. The number of copies of their genes to be found on Earth is beyond astronomical. There are hundreds of billions of stars in the Milky Way galaxy and a couple of trillion galaxies in the observable universe. The virions in the surface waters of any smallish sea handily outnumber all the stars in all the skies that science could ever speak of.

Back on Earth, viruses kill more living things than any other type of predator. They shape the balance of species in ecosystems ranging from those of the open ocean to that of the human bowel. They spur evolution, driving natural selection and allowing the swapping of genes.

They may have been responsible for some of the most important events in the history of life, from the appearance of complex multicellular organisms to the emergence of DNA as a preferred genetic material. The legacy they have left in the human genome helps produce placentas and may shape the development of the brain. For scientists seeking to understand life’s origin, they offer a route into the past separate from the one mapped by humans, oak trees and their kin. For scientists wanting to reprogram cells and mend metabolisms they offer inspiration—and powerful tools.

II
A lifestyle for genes

THE IDEA of a last universal common ancestor provides a plausible and helpful, if incomplete, answer to where humans, oak trees and their ilk come from. There is no such answer for viruses. Being a virus is not something which provides you with a place in a vast, coherent family tree. It is more like a lifestyle—a way of being which different genes have discovered independently at different times. Some viral lineages seem to have begun quite recently. Others have roots that comfortably predate LUCA itself.

Disparate origins are matched by disparate architectures for information storage and retrieval. In eukaryotes—creatures, like humans, mushrooms and kelp, with complex cells—as in their simpler relatives, the bacteria and archaea, the genes that describe proteins are written in double-stranded DNA. When a particular protein is to be made, the DNA sequence of the relevant gene acts as a template for the creation of a complementary molecule made from another nucleic acid, RNA. This messenger RNA (mRNA) is what the cellular machinery tasked with translating genetic information into proteins uses in order to do so.

Because they, too, need to have proteins made to their specifications, viruses also need to produce mRNAs. But they are not restricted to using double-stranded DNA as a template. Viruses store their genes in a number of different ways, all of which require a different mechanism to produce mRNAs. In the early 1970s David Baltimore, one of the great figures of molecular biology, used these different approaches to divide the realm of viruses into seven separate classes (see diagram).

In four of these seven classes the viruses store their genes not in DNA but in RNA. Those of Baltimore group three use double strands of RNA. In Baltimore groups four and five the RNA is single-stranded; in group four the genome can be used directly as an mRNA; in group five it is the template from which mRNA must be made. In group six—the retroviruses, which include HIV—the viral RNA is copied into DNA, which then provides a template for mRNAs.
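The article’s accompanying diagram is not reproduced here; as a rough stand-in, the seven groups can be written as a small lookup table (group seven, not discussed in the text above, covers viruses such as hepatitis B):

```python
# The seven Baltimore groups: genome type and route to mRNA.
BALTIMORE_GROUPS = {
    1: ("double-stranded DNA", "transcribed to mRNA like cellular genes"),
    2: ("single-stranded DNA", "copied to double-stranded DNA, then transcribed"),
    3: ("double-stranded RNA", "mRNA copied from the RNA duplex"),
    4: ("(+) single-stranded RNA", "genome itself can serve as mRNA"),
    5: ("(-) single-stranded RNA", "mRNA copied from the genome template"),
    6: ("single-stranded RNA, retro", "RNA copied to DNA, then transcribed"),
    7: ("double-stranded DNA, retro", "replicates through an RNA intermediate"),
}

for group, (genome, route) in sorted(BALTIMORE_GROUPS.items()):
    print(f"Group {group}: {genome} -- {route}")
```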

Because uninfected cells only ever make RNA on the basis of a DNA template, RNA-based viruses need distinctive molecular mechanisms those cells lack. Those mechanisms provide medicine with targets for antiviral attacks. Many drugs against HIV take aim at the system that makes DNA copies of RNA templates. Remdesivir (Veklury), a drug which stymies the mechanism that the simpler RNA viruses use to recreate their RNA genomes, was originally developed to treat hepatitis C (group four) and subsequently tried against the Ebola virus (group five). It is now being used against SARS-CoV-2 (group four), the covid-19 virus.

Studies of the gene for that RNA-copying mechanism, RdRp, reveal just how confusing virus genealogy can be. Some viruses in groups three, four and five seem, on the basis of their RdRp-gene sequence, more closely related to members of one of the other groups than they are to all the other members of their own group. This may mean that quite closely related viruses can differ in the way they store their genomes; it may mean that the viruses concerned have swapped their RdRp genes. When two viruses infect the same cell at the same time such swaps are more or less compulsory. They are, among other things, one of the mechanisms by which viruses native to one species become able to infect another.

How do genes take on the viral lifestyle in the first place? There are two plausible mechanisms. Previously free-living creatures could give up metabolising and become parasitic, using other creatures’ cells as their reproductive stage. Alternatively, genes that were allowed a certain amount of independence within one creature could have evolved the means to get into other creatures.

Living creatures contain various apparently independent bits of nucleic acid with an interest in reproducing themselves. The smallest, found exclusively in plants, are tiny rings of RNA called viroids, just a few hundred genetic letters long. Viroids replicate by hijacking a host enzyme that normally makes mRNAs. Once attached to a viroid ring, the enzyme whizzes round and round it, unable to stop, turning out a new copy of the viroid with each lap.

Viroids describe no proteins and do no good. Plasmids—somewhat larger loops of nucleic acid found in bacteria—do contain genes, and the proteins they describe can be useful to their hosts. Plasmids are sometimes, therefore, regarded as detached parts of a bacterium’s genome. But that detachment provides a degree of autonomy. Plasmids can migrate between bacterial cells, not always of the same species. When they do so they can take genetic traits such as antibiotic resistance from their old host to their new one.

Recently, some plasmids have been implicated in what looks like a progression to true virus-hood. A genetic analysis by Mart Krupovic of the Pasteur Institute suggests that the Circular Rep-Encoding Single-Strand-DNA (CRESSDNA) viruses, which infect bacteria, evolved from plasmids. He thinks that a DNA copy of the genes that another virus uses to create its virions, copied into a plasmid by chance, provided it with a way out of the cell. The analysis strongly suggests that CRESSDNA viruses, previously seen as a pretty closely related group, have arisen from plasmids this way on three different occasions.

Such jailbreaks have probably been going on since very early on in the history of life. As soon as they began to metabolise, the first proto-organisms would have constituted a niche in which other parasitic creatures could have lived. And biology abhors a vacuum. No niche goes unfilled if it is fillable.

It is widely believed that much of the evolutionary period between the origin of life and the advent of LUCA was spent in an “RNA world”—one in which that versatile substance both stored information, as DNA now does, and catalysed chemical reactions, as proteins now do. Set alongside the fact that some viruses use RNA as a storage medium today, this strongly suggests that the first to adopt the viral lifestyle did so too. Patrick Forterre, an evolutionary biologist at the Pasteur Institute with a particular interest in viruses (and the man who first popularised the term LUCA) thinks that the “RNA world” was not just rife with viruses. He also thinks they may have brought about its end.

The difference between DNA and RNA is not large: just a small change to one of the “letters” used to store genetic information and a minor modification to the backbone to which these letters are stuck. And DNA is a more stable molecule in which to store lots of information. But that is in part because DNA is inert. An RNA-world organism which rewrote its genes into DNA would cripple its metabolism, because to do so would be to lose the catalytic properties its RNA provided.

An RNA-world virus, having no metabolism of its own to undermine, would have had no such constraints if shifting to DNA offered an advantage. Dr Forterre suggests that this advantage may have lain in DNA’s imperviousness to attack. Host organisms today have all sorts of mechanisms for cutting up viral nucleic acids they don’t like the look of—mechanisms which biotechnologists have been borrowing since the 1970s, most recently in the form of tools based on a bacterial defence called CRISPR. There is no reason to imagine that the RNA-world predecessors of today’s cells did not have similar shears at their disposal. And a virus that made the leap to DNA would have been impervious to their blades.

Genes and the mechanisms they describe pass between viruses and hosts, as between viruses and viruses, all the time. Once some viruses had evolved ways of writing and copying DNA, their hosts would have been able to purloin them in order to make back-up copies of their RNA molecules. And so what began as a way of protecting viral genomes would have become the way life stores all its genes—except for those of some recalcitrant, contrary viruses.

III
The scythes of the seas

IT IS A general principle in biology that, although in terms of individual numbers herbivores outnumber carnivores, in terms of the number of species carnivores outnumber herbivores. Viruses, however, outnumber everything else in every way possible.

This makes sense. Though viruses can induce host behaviours that help them spread—such as coughing—an inert virion boasts no behaviour of its own that helps it stalk its prey. It infects only that which it comes into contact with. This is a clear invitation to flood the zone. In 1999 Roger Hendrix, a virologist, suggested that a good rule of thumb might be ten virions for every living individual creature (the overwhelming majority of which are single-celled bacteria and archaea). Estimates of the number of such creatures on the planet come out in the region of 10²⁹-10³⁰. If the whole Earth were broken up into pebbles, and each of those pebbles smashed into tens of thousands of specks of grit, you would still have fewer pieces of grit than the world has virions. Measurements, as opposed to estimates, produce numbers almost as arresting. A litre of seawater may contain more than 100bn virions; a kilogram of dried soil perhaps a trillion.

Metagenomics, a part of biology that looks at all the nucleic acid in a given sample to get a sense of the range of life forms within it, reveals that these tiny throngs are highly diverse. A metagenomic analysis of two surveys of ocean life, the Tara Oceans and Malaspina missions, by Ahmed Zayed of Ohio State University, found evidence of 200,000 different species of virus. These diverse species play an enormous role in the ecology of the oceans.

On land, most of the photosynthesis which provides the biomass and energy needed for life takes place in plants. In the oceans, it is overwhelmingly the business of various sorts of bacteria and algae collectively known as phytoplankton. These creatures reproduce at a terrific rate, and viruses kill them at a terrific rate, too. According to work by Curtis Suttle of the University of British Columbia, bacterial phytoplankton typically last less than a week before being killed by viruses.

This increases the overall productivity of the oceans by helping bacteria recycle organic matter (it is easier for one cell to use the contents of another if a virus helpfully lets them free). It also goes some way towards explaining what the great mid-20th-century ecologist G. Evelyn Hutchinson called “the paradox of the plankton”. Given the limited nature of the resources that single-celled plankton need, you would expect a few species particularly well adapted to their use to dominate the ecosystem. Instead, the plankton display great variety. This may well be because whenever a particular form of plankton becomes dominant, its viruses expand with it, gnawing away at its comparative success.

It is also possible that this endless dance of death between viruses and microbes sets the stage for one of evolution’s great leaps forward. Many forms of single-celled plankton have molecular mechanisms that allow them to kill themselves. They are presumably used when one cell’s sacrifice allows its sister cells—which are genetically identical—to survive. One circumstance in which such sacrifice seems to make sense is when a cell is attacked by a virus. If the infected cell can kill itself quickly (a process called apoptosis) it can limit the number of virions the virus is able to make. This lessens the chances that other related cells nearby will die. Some bacteria have been shown to use this strategy; many other microbes are suspected of it.

There is another situation where self-sacrifice is becoming conduct for a cell: when it is part of a multicellular organism. As such organisms grow, cells that were once useful to them become redundant; they have to be got rid of. Eugene Koonin of America’s National Institutes of Health and his colleagues have explored the idea that virus-thwarting self-sacrifice and complexity-permitting self-sacrifice may be related, with the latter descended from the former. Dr Koonin’s model also suggests that the closer the cells are clustered together, the more likely this act of self-sacrifice is to have beneficial consequences.

For such profound propinquity, move from the free-flowing oceans to the more structured world of soil, where potential self-sacrificers can nestle next to each other. Its structure makes soil harder to sift for genes than water is. But last year Mary Firestone of the University of California, Berkeley, and her colleagues used metagenomics to count 3,884 new viral species in a patch of Californian grassland. That is undoubtedly an underestimate of the total diversity; their technique could see only viruses with RNA genomes, thus missing, among other things, most bacteriophages.

Metagenomics can also be applied to biological samples, such as bat guano, in which it picks up viruses from both the bats and their food. But for the most part the finding of animal viruses requires more specific sampling. Over the course of the 2010s PREDICT, an American-government project aimed at finding animal viruses, gathered over 160,000 animal and human tissue samples from 35 countries and discovered 949 novel viruses.

The people who put together PREDICT now have grander plans. They want a Global Virome Project to track down all the viruses native to the world’s 7,400 species of mammals and waterfowl—the reservoirs most likely to harbour viruses capable of making the leap into human beings. In accordance with the more-predator-species-than-prey rule they expect such an effort would find about 1.5m viruses, of which around 700,000 might be able to infect humans. A planning meeting in 2018 suggested that such an undertaking might take ten years and cost $4bn. It looked like a lot of money then. Today those arguing for a system that can provide advance warning of the next pandemic make it sound pretty cheap.

IV
Leaving their mark

THE TOLL which viruses have exacted throughout history suggests that they have left their mark on the human genome: things that kill people off in large numbers are powerful agents of natural selection. In 2016 David Enard, then at Stanford University and now at the University of Arizona, made a stab at showing just how much of the genome had been thus affected.

He and his colleagues started by identifying almost 10,000 proteins that seemed to be produced in all the mammals that had had their genomes sequenced up to that point. They then made a painstaking search of the scientific literature looking for proteins that had been shown to interact with viruses in some way or other. About 1,300 of the 10,000 turned up. About one in five of these proteins was connected to the immune system, and thus could be seen as having a professional interest in viral interaction. The others appeared to be proteins which the virus made use of in its attack on the host. The two cell-surface proteins that SARS-CoV-2 uses to make contact with its target cells and inveigle its way into them would fit into this category.

The researchers then compared the human versions of the genes for their 10,000 proteins with those in other mammals, and applied a statistical technique that distinguishes changes that have no real impact from the sort of changes which natural selection finds helpful and thus tries to keep. Genes for virus-associated proteins turned out to be evolutionary hotspots: 30% of all the adaptive change was seen in the genes for the 13% of the proteins which interacted with viruses—more than twice the share expected had adaptation been spread evenly. As quickly as viruses learn to recognise and subvert such proteins, hosts must learn to modify them.

A couple of years later, working with Dmitri Petrov at Stanford, Dr Enard showed that modern humans have borrowed some of these evolutionary responses to viruses from their nearest relatives. Around 2-3% of the DNA in an average European genome has Neanderthal origins, a result of interbreeding 50,000 to 30,000 years ago. For these genes to have persisted they must be doing something useful—otherwise natural selection would have removed them. Dr Enard and Dr Petrov found that a disproportionate number described virus-interacting proteins; of the bequests humans received from their now vanished relatives, ways to stay ahead of viruses seem to have been among the most important.

Viruses do not just shape the human genome through natural selection, though. They also insert themselves into it. At least a twelfth of the DNA in the human genome is derived from viruses; by some measures the total could be as high as a quarter.

Retroviruses like HIV are called retro because they do things backwards. Where cellular organisms make their RNA from DNA templates, retroviruses do the reverse, making DNA copies of their RNA genomes. The host cell obligingly makes these copies into double-stranded DNA which can be stitched into its own genome. If this happens in a cell destined to give rise to eggs or sperm, the viral genes are passed from parent to offspring, and on down the generations. Such integrated viral sequences, known as endogenous retroviruses (ERVs), account for 8% of the human genome.

This is another example of the way the same viral trick can be discovered a number of times. Many bacteriophages are also able to stitch copies of their genome into their host’s DNA, staying dormant, or “temperate”, for generations. If the cell is doing well and reproducing regularly, this quiescence is a good way for the viral genes to make more copies of themselves. When a virus senses that its easy ride may be coming to an end, though—for example, if the cell it is in shows signs of stress—it will abandon ship. What was latent becomes “lytic” as the viral genes produce a sufficient number of virions to tear the host apart.

Though some of their genes are associated with cancers, in humans ERVs do not burst back into action in later generations. Instead they have proved useful resources of genetic novelty. In the most celebrated example, at least ten different mammalian lineages make use of a retroviral gene for one of their most distinctively mammalian activities: building a placenta.

The placenta is a unique organ because it requires cells from the mother and the fetus to work together in order to pass oxygen and sustenance in one direction and carbon dioxide and waste in the other. One way this intimacy is achieved safely is through the creation of a tissue in which the membranes between cells are broken down to form a continuous sheet of cellular material.

The protein that allows new cells to merge themselves with this layer, syncytin-1, was originally used by retroviruses to join the external membranes of their virions to the external membranes of cells, thus gaining entry for the viral proteins and nucleic acids. Not only have different sorts of mammals co-opted this membrane-merging trick—other creatures have made use of it, too. The mabuya, a long-tailed skink which, unusually for a lizard, nurtures its young within its body, employs a retroviral syncytin protein to produce a mammalian-looking placenta. The most recent shared ancestor of mabuyas and mammals died out 80m years before the first dinosaur saw the light of day, but both have found the same way to make use of the viral gene.

You put your line-1 in, you take your line-1 out

This is not the only way that animals make use of their ERVs. Evidence has begun to accumulate that genetic sequences derived from ERVs are quite frequently used to regulate the activity of genes of more conventional origin. In particular, RNA molecules transcribed from an ERV called HERV-K play a crucial role in providing the stem cells found in embryos with their “pluripotency”—the ability to create specialised daughter cells of various different types. Unfortunately, when expressed in adults HERV-K can also be responsible for cancers of the testes.

As well as containing lots of semi-decrepit retroviruses that can be stripped for parts, the human genome also holds a great many copies of a “retrotransposon” called LINE-1. This is a piece of DNA with a surprisingly virus-like way of life; it is thought by some biologists to have, like ERVs, a viral origin. In its full form, LINE-1 is a 6,000-letter sequence of DNA which describes a “reverse transcriptase” of the sort that retroviruses use to make DNA from their RNA genomes. When LINE-1 is transcribed into an mRNA and that mRNA subsequently translated to make proteins, the reverse transcriptase thus created immediately sets to work on the mRNA used to create it, using it as the template for a new piece of DNA which is then inserted back into the genome. That new piece of DNA is in principle identical to the piece that acted as the mRNA’s original template. The LINE-1 element has made a copy of itself.

In the 100m years or so that this has been going on in humans and the species from which they are descended, the LINE-1 element has managed to pepper the genome with a staggering 500,000 copies of itself. All told, 17% of the human genome is taken up by these copies—twice as much as by the ERVs.
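The copy-and-paste arithmetic is easy to caricature in code. The sketch below is purely illustrative — the rates are invented — but it shows why truncated, inert copies come to vastly outnumber the active ones:

```python
import random

# Cartoon of LINE-1 accumulation: each active element occasionally
# pastes a new copy of itself into the genome; most new copies are
# truncated and can no longer copy themselves, so dead copies pile up
# far faster than active ones. All parameters here are invented.

def line1_census(generations: int = 2000, jump_rate: float = 0.01,
                 full_length_odds: float = 0.05, seed: int = 3):
    rng = random.Random(seed)
    active, dead = 1, 0
    for _ in range(generations):
        # Each active copy independently retrotransposes with small odds.
        new_copies = sum(rng.random() < jump_rate for _ in range(active))
        for _ in range(new_copies):
            if rng.random() < full_length_odds:
                active += 1          # rare full-length, still-mobile copy
            else:
                dead += 1            # truncated, permanently silent copy
    return active, dead

active, dead = line1_census()
print(f"active copies: {active}, dead copies: {dead}")
```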

Most of the copies are severely truncated and incapable of copying themselves further. But some still have the knack, and this capability may be being put to good use. Fred Gage and his colleagues at the Salk Institute for Biological Studies, in San Diego, argue that LINE-1 elements have an important role in the development of the brain. In 2005 Dr Gage discovered that in mouse embryos—specifically, in the brains of those embryos—about 3,000 LINE-1 elements are still able to operate as retrotransposons, putting new copies of themselves into the genome of a cell and thus of all its descendants.

Brains develop through proliferation followed by pruning. First, nerve cells multiply pell-mell; then the cell-suicide process that makes complex life possible prunes them back in a way that looks a lot like natural selection. Dr Gage suspects that the movement of LINE-1 transposons provides the variety in the cell population needed for this selection process. Choosing between cells with LINE-1 in different places, he thinks, could be a key part of the process from which the eventual neural architecture emerges. What is true in mice is, as he showed in 2009, true in humans, too. He is currently developing a technique for looking at the process in detail by comparing, post mortem, the genomes of different brain cells from single individuals to see if their LINE-1 patterns vary in the ways that his theory would predict.

V
Promised lands

HUMAN EVOLUTION may have used viral genes to make big-brained live-born life possible; but viral evolution has used them to kill off those big brains on a scale that is easily forgotten. Compare the toll to that of war. In the 20th century, the bloodiest in human history, somewhere between 100m and 200m people died as a result of warfare. The number killed by measles was somewhere in the same range; the number who died of influenza probably towards the top of it; and the number killed by smallpox—300m-500m—well beyond it. That is why the eradication of smallpox from the wild, achieved in 1979 by a globally co-ordinated set of vaccination campaigns, stands as one of the all-time-great humanitarian triumphs.

Other eradications should eventually follow. Even in their absence, vaccination has led to a steep decline in viral deaths. But viruses against which there is no vaccine, either because they are very new, like SARS-CoV-2, or peculiarly sneaky, like HIV, can still kill millions.

Reducing those tolls is a vital aim both for research and for public-health policy. Understandably, a far lower priority is put on the benefits that viruses can bring. This is mostly because they are as yet much less dramatic. They are also much less well understood.

The viruses most prevalent in the human body are not those which infect human cells. They are those which infect the bacteria that live on the body’s surfaces, internal and external. The average human “microbiome” harbours perhaps 100trn of these bacteria. And where there are bacteria, there are bacteriophages shaping their population.

The microbiome is vital for good health; when it goes wrong it can mess up a lot else. Gut bacteria seem to have a role in maintaining, and possibly also causing, obesity in the well-fed and, conversely, in tipping the poorly fed into a form of malnutrition called kwashiorkor. Ill-regulated gut bacteria have also been linked, if not always conclusively, with diabetes, heart disease, cancers, depression and autism. In light of all this, the question “who guards the bacterial guardians?” is starting to be asked.

The viruses that prey on the bacteria are an obvious answer. Because the health of their host’s host—the possessor of the gut they find themselves in—matters to these phages, they have an interest in keeping the microbiome balanced. Unbalanced microbiomes allow pathogens to get a foothold. This may explain a curious detail of a therapy now being used as a treatment of last resort against Clostridium difficile, a bacterium that causes life-threatening dysentery. The therapy in question uses a transfusion of faecal matter, with its attendant microbes, from a healthy individual to reboot the patient’s microbiome. Such transplants, it appears, are more likely to succeed if their phage population is particularly diverse.

Medicine is a very long way from being able to use phages to fine-tune the microbiome. But if a way of doing so is found, it will not in itself be a revolution. Attempts to use phages to promote human health go back to their discovery in 1917, by Félix d’Hérelle, a French microbiologist, though those early attempts at therapy were not looking to restore balance and harmony. On the basis that the enemy of my enemy is my friend, doctors simply treated bacterial infections with phages thought likely to kill the bacteria.

The arrival of antibiotics saw phage therapy abandoned in most places, though it persisted in the Soviet Union and its satellites. Various biotechnology companies think they may now be able to revive the tradition—and make it more effective. One option is to remove the bits of the viral genome that let phages settle down to a temperate life in a bacterial genome, leaving them no option but to keep on killing. Another is to write their genes in ways that avoid the defences with which bacteria slice up foreign DNA.

The hope is that phage therapy will become a backup in difficult cases, such as infection with antibiotic-resistant bugs. There have been a couple of well-publicised one-off successes outside phage therapy’s post-Soviet homelands. In 2016 Tom Patterson, a researcher at the University of California, San Diego, was successfully treated for an antibiotic-resistant bacterial infection with specially selected (but un-engineered) phages. In 2018 Graham Hatfull of the University of Pittsburgh used a mixture of phages, some engineered so as to be incapable of temperance, to treat a 16-year-old British girl who had a bad bacterial infection after a lung transplant. Clinical trials are now getting under way for phage treatments aimed at urinary-tract infections caused by Escherichia coli, Staphylococcus aureus infections that can lead to sepsis and Pseudomonas aeruginosa infections that cause complications in people who have cystic fibrosis.

Viruses which attack bacteria are not the only ones genetic engineers have their eyes on. Engineered viruses are of increasing interest to vaccine-makers, to cancer researchers and to those who want to treat diseases by either adding new genes to the genome or disabling faulty ones. If you want to get a gene into a specific type of cell, a virion that recognises something about such cells may often prove a good tool.

The vaccine used to contain the Ebola outbreak in the Democratic Republic of Congo over the past two years was made by engineering Indiana vesiculovirus, which infects humans but cannot reproduce in them, so that it expresses a protein found on the surface of the Ebola virus; thus primed, the immune system responds to Ebola much more effectively. The World Health Organisation’s current list of 29 covid-19 vaccines in clinical trials features six versions of other viruses engineered to look a bit like SARS-CoV-2. One is based on a strain of measles that has long been used as a vaccine against that disease.

Viruses engineered to engender immunity against pathogens, to kill cancer cells or to encourage the immune system to attack them, or to deliver needed genes to faulty cells all seem likely to find their way into health care. Other engineered viruses are more worrying. One way to understand how viruses spread and kill is to try and make particularly virulent ones. In 2005, for example, Terrence Tumpey of America’s Centres for Disease Control and Prevention and his colleagues tried to understand the deadliness of the influenza virus responsible for the pandemic of 1918-20 by taking a more benign strain, adding what seemed to be distinctive about the deadlier one and trying out the result on mice. It was every bit as deadly as the original, wholly natural version had been.

Because such “gain of function” research could, if ill-conceived or poorly implemented, do terrible damage, it requires careful monitoring. And although the use of engineered pathogens as weapons of war is of dubious utility—such weapons are hard to aim and hard to stand down, and it is not easy to know how much damage they have done—as well as being completely illegal and repugnant to almost all, such possibilities will and should remain a matter of global concern.

Information which, for billions of years, has only ever come into its own within infected cells can now be inspected on computer screens and rewritten at will. The power that brings is sobering. It marks a change in the history of both viruses and people—a change which is perhaps as important as any of those made by modern biology. It is constraining a small part of the viral world in a way which, so far, has been to people’s benefit. It is revealing that world’s further reaches in a way which cannot but engender awe. ■

This article appeared in the Essay section of the print edition under the headline “The outsiders inside”

Did Human Evolution Include a Semi-Aquatic Phase? (The Scientist)

A recent book outlines fossil evidence supporting the controversial hypothesis.

Peter Rhys-Evans
Apr 1, 2020

For the past 150 years, scientists and laypeople alike have accepted a “savanna” scenario of human evolution. The theory, primarily based on fossil evidence, suggests that because our ancestral ape family members were living in the trees of East African forests, and because we humans live on terra firma, our primate ancestors simply came down from the trees onto the grasslands and stood upright to see farther over the vegetation, increasing their efficiency as hunter-gatherers. In the late 19th century, anthropologists only had a few Neanderthal fossils to study, and science had very little knowledge of genetics and evolutionary changes. So this savanna theory of human evolution became ingrained in anthropological dogma and has remained the established explanation of early hominin evolution following the genetic split from our primate cousins 6 million to 7 million years ago.

But in 1960, a different twist on human evolution emerged. That year, marine biologist Sir Alister Hardy wrote an article in New Scientist suggesting a possible aquatic phase in our evolution, noting Homo sapiens’s differences from other primates and similarities to other aquatic and semi-aquatic mammals. In 1967, zoologist Desmond Morris published The Naked Ape, which explored different theories about why modern humans lost their fur. Morris mentioned Hardy’s “aquatic ape” hypothesis as an “ingenious” theory that sufficiently explained “why we are so nimble in the water today and why our closest living relatives, the chimpanzees, are so helpless and quickly drown.”

Morris concluded, however, that “despite its most appealing indirect evidence, the aquatic theory lacks solid support.” Even if eventually the aquatic ape hypothesis turns out to be true, he continued, it need not completely rewrite the story of human evolution, but rather add to our species’ evolutionary arc a “salutary christening ceremony.”

In 1992, I published a paper describing a curious ear condition colloquially known as “surfer’s ear,” which I and other ear, nose, and throat doctors frequently see in clinics. Exostoses are small bony growths in the outer ear canal that develop only in humans who swim and dive on a regular, almost daily basis. In modern humans, there is undisputed evidence of aural exostoses in people who swim and dive, with the size and extent directly dependent on the frequency and length of exposure to water, as well as its temperature.

I predicted that if these exostoses were found in early hominin skulls, it would provide vital fossil evidence for frequent swimming and diving by our ancestors. Researchers have now found these features in 1 million– to 2 million–year-old hominin skulls. In a recent study on nearly two dozen Neanderthal skulls, about 47 percent had exostoses. There are many other references to contemporary, historical, and archaeological coastal and river communities with a significantly increased incidence of aural exostoses. In my latest book, The Waterside Ape, I propose that the presence of exostoses in the skulls of ancient human ancestors is a prime support for an aquatic phase of our evolution, which may explain our unique human phenotype.

Other Homo sapiens–specific features that may be tied to a semi-aquatic stage of human evolution include erect posture, loss of body hair, deposition of subcutaneous fat, a completely different heat-regulation system from other primates, and kidneys that function much like those of aquatic mammals. This combination of characteristics, which does not exist in any other terrestrial mammal, would have gradually arisen over several million years. The discovery of the bipedal hominin known as “Lucy,” dating to about 3.2 million years ago, suggests that walking on two legs was the initial major evolutionary adaptation to a semi-aquatic habitat. By the time the Neanderthals appeared some 400,000 to 300,000 years ago, their semi-aquatic lifestyle—swimming, diving, and perhaps hunting for food on land and in the water—may have been firmly part of day-to-day life.

In my opinion, the accumulated fossil, anatomical, and physiological evidence about early hominin evolution points to our human ancestors learning to survive as semi-aquatic creatures in a changing East African environment. After transitioning to bipedalism, ancient hominins had both forelimbs free from aiding in walking, which may have allowed for increasing manual dexterity and skills. Perhaps a marine diet with lipoproteins that are essential for brain development fueled the unique intellectual advances and ecological dominance of Homo sapiens.

Peter Rhys-Evans works in private practice as an otolaryngologist in London at several hospitals including the Harley Street Clinic. He is the founder and chairman of Oracle Cancer Trust, the largest head and neck cancer charity in the UK. Read an excerpt from The Waterside Ape. Follow Rhys-Evans on Twitter @TheWatersideApe.

Human impact on nature ‘dates back millions of years’ (BBC)

Early human ancestors could have stolen food from other animals. Mauricio Antón

By Helen Briggs BBC News

20 January 2020

The impact of humans on nature has been far greater and longer-lasting than we could ever imagine, according to scientists.

Early human ancestors living millions of years ago may have triggered extinctions, even before our species evolved, a study suggests.

A decline in large mammals seen in Eastern Africa may have been due to early humans, researchers propose.

Extinction rates started to increase from around four million years ago.

This coincides with the period when ancient human populations were living in the area, as judged by fossil evidence.

“We are now negatively impacting the world and the species that live in it more than ever before. But this does not mean that we used to live in true harmony with nature in the past,” said study researcher, Dr Søren Faurby of the University of Gothenburg.

“We are extremely successful in monopolising resources today, and our results show that this may have also been the case with our ancestors.”

Getty Images. A lion feasts on the carcass of a rhinoceros in Kenya

The researchers looked at extinction rates of large and small carnivores and how this correlated with environmental changes such as rainfall and temperature.

They also looked at changes in the brain size of human ancestors such as Australopithecus and Ardipithecus.

They found that extinction rates in large carnivores correlated with increased brain size of human ancestors and with vegetation changes, but not with precipitation or temperature changes.

They found the best explanation for carnivore extinction in East Africa was that these animals were in direct competition for food with our ancestors.

They think human ancestors may have stolen freshly-killed prey from the likes of sabre-toothed cats, depriving them of food.

“Our results suggest that substantial anthropogenic influence on biodiversity started millions of years earlier than currently assumed,” the researchers reported in the journal Ecology Letters.

Co-researcher Alexandre Antonelli of the Royal Botanic Gardens, Kew, said the view that our ancestors had little impact on the animals around them is incorrect, as “the impact of our lineage on nature has been far greater and longer-lasting than we could ever imagine”.

A landmark report last year warned that as many as one million species of animals and plants are threatened with extinction in the coming decades.

A more recent study found that the growth of cities, the clearing of forests for farming and the soaring demand for fish had significantly altered nearly three-quarters of the land and more than two-thirds of the oceans.

Interdisciplinary approach yields new insights into human evolution (Vanderbilt University)

Vanderbilt biologist Nicole Creanza takes an interdisciplinary approach to human evolution as guest editor of a Royal Society journal.

The evolution of human biology should be considered part and parcel of the evolution of humanity itself, proposes Nicole Creanza, assistant professor of biological sciences. She is the guest editor of a new themed issue of the Philosophical Transactions of the Royal Society B, the oldest scientific journal in the world, that focuses on an interdisciplinary approach to human evolution.

Stanford professor Marc Feldman and Stanford postdoc Oren Kolodny collaborated with Creanza on the special issue.

“Within the blink of an eye on a geological timescale, humans advanced from using basic stone tools to examining the rocks on Mars; however, our exact evolutionary path and the relative importance of genetic and cultural evolution remain a mystery,” said Creanza, who specializes in the application of computational and theoretical approaches to human and cultural evolution, particularly language development. “Our cultural capacities (to create new ideas, to communicate and learn from one another, and to form vast social networks) together make us uniquely human, but the origins, the mechanisms, and the evolutionary impact of these capacities remain unknown.”

The special issue brings together researchers in biology, anthropology, archaeology, economics, psychology, computer science and more to explore the cultural forces affecting human evolution from a wider perspective than is usually taken.

“Researchers have begun to recognize that understanding non-genetic inheritance, including culture, ecology, the microbiome, and regulation of gene expression, is fundamental to fully comprehending evolution,” said Creanza. “It is essential to understand the dynamics of cultural inheritance at different temporal and spatial scales, to uncover the underlying mechanisms that drive these dynamics, and to shed light on their implications for our current theory of evolution as well as for our interpretation and predictions regarding human behavior.”

In addition to an essay discussing the need for an interdisciplinary approach to human evolution, Creanza included an interdisciplinary study of her own, examining the origins of English’s contribution to Sranan, a creole that emerged in Suriname following an influx of indentured servants from England in the 17th century.

Creanza, along with linguists Andre Sherriah and Hubert Devonish of the University of the West Indies and psychologist Ewart Thomas from Stanford, sought to determine the geographic origins of the English speakers whose regional dialects formed the backbone of Sranan. Their work combined linguistic, historical and genetic approaches to determine that the English speakers who influenced Sranan the most originated largely from two counties on opposite sides of southern England: Bristol, in the west, and Essex, in the east.

“Thus, analyzing the features of modern-day languages might give us new information about events in human history that left few other traces,” Creanza said.

Language is learned in brain circuits that predate humans (Georgetown University)

GEORGETOWN UNIVERSITY MEDICAL CENTER

WASHINGTON — It has often been claimed that humans learn language using brain components that are specifically dedicated to this purpose. Now, new evidence strongly suggests that language is in fact learned in brain systems that are also used for many other purposes and even pre-existed humans, say researchers in PNAS (Early Edition online Jan. 29).

The research combines results from multiple studies involving a total of 665 participants. It shows that children learn their native language and adults learn foreign languages in evolutionarily ancient brain circuits that also are used for tasks as diverse as remembering a shopping list and learning to drive.

“Our conclusion that language is learned in such ancient general-purpose systems contrasts with the long-standing theory that language depends on innately-specified language modules found only in humans,” says the study’s senior investigator, Michael T. Ullman, PhD, professor of neuroscience at Georgetown University School of Medicine.

“These brain systems are also found in animals – for example, rats use them when they learn to navigate a maze,” says co-author Phillip Hamrick, PhD, of Kent State University. “Whatever changes these systems might have undergone to support language, the fact that they play an important role in this critical human ability is quite remarkable.”

The study has important implications not only for understanding the biology and evolution of language and how it is learned, but also for how language learning can be improved, both for people learning a foreign language and for those with language disorders such as autism, dyslexia, or aphasia (language problems caused by brain damage such as stroke).

The research statistically synthesized findings from 16 studies that examined language learning in two well-studied brain systems: declarative and procedural memory.

The results showed that how good we are at remembering the words of a language correlates with how good we are at learning in declarative memory, which we use to memorize shopping lists or to remember the bus driver’s face or what we ate for dinner last night.

Grammar abilities, which allow us to combine words into sentences according to the rules of a language, showed a different pattern. The grammar abilities of children acquiring their native language correlated most strongly with learning in procedural memory, which we use to learn tasks such as driving, riding a bicycle, or playing a musical instrument. In adults learning a foreign language, however, grammar correlated with declarative memory at earlier stages of language learning, but with procedural memory at later stages.

The correlations were large, and were found consistently across languages (e.g., English, French, Finnish, and Japanese) and tasks (e.g., reading, listening, and speaking tasks), suggesting that the links between language and the brain systems are robust and reliable.
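
The article does not say exactly how the findings from the 16 studies were pooled, but a standard way to synthesize correlations across studies is to average their Fisher z-transforms, weighted by sample size. The sketch below is purely illustrative: the correlations and sample sizes are invented, not taken from the studies.

```python
import math

# Invented example data: (correlation, sample size) per study.
studies = [(0.62, 40), (0.55, 72), (0.70, 31), (0.58, 55)]

# Fisher's z-transform makes correlations approximately normal,
# so they can be averaged across studies.
def fisher_z(r):
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    return math.tanh(z)

# Weight each study by n - 3, the inverse of the variance of z.
num = sum((n - 3) * fisher_z(r) for r, n in studies)
den = sum(n - 3 for _, n in studies)
pooled_r = inv_fisher_z(num / den)
print(f"pooled correlation: {pooled_r:.2f}")   # ~0.60 for these numbers
```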

The findings have broad research, educational, and clinical implications, says co-author Jarrad Lum, PhD, of Deakin University in Australia.

“Researchers still know very little about the genetic and biological bases of language learning, and the new findings may lead to advances in these areas,” says Ullman. “We know much more about the genetics and biology of the brain systems than about these same aspects of language learning. Since our results suggest that language learning depends on the brain systems, the genetics, biology, and learning mechanisms of these systems may very well also hold for language.”

For example, though researchers know little about which genes underlie language, numerous genes playing particular roles in the two brain systems have been identified. The findings from this new study suggest that these genes may also play similar roles in language. Along the same lines, the evolution of these brain systems, and how they came to underlie language, should shed light on the evolution of language.

Additionally, the findings may lead to approaches that could improve foreign language learning and language problems in disorders, Ullman says.

For example, various pharmacological agents (e.g., the drug memantine) and behavioral strategies (e.g., spacing out the presentation of information) have been shown to enhance learning or retention of information in the brain systems, he says. These approaches may thus also be used to facilitate language learning, including in disorders such as aphasia, dyslexia, and autism.

“We hope and believe that this study will lead to exciting advances in our understanding of language, and in how both second language learning and language problems can be improved,” Ullman concludes.

Human societies evolve along similar paths (University of Exeter)

Societies ranging from ancient Rome and the Inca empire to modern Britain and China have evolved along similar paths, a huge new study shows.

Despite their many differences, societies tend to become more complex in “highly predictable” ways, researchers said.

These processes of development – often happening in societies with no knowledge of each other – include the emergence of writing systems and “specialised” government workers such as soldiers, judges and bureaucrats.

The international research team, including researchers from the University of Exeter, created a new database of historical and archaeological information using data on 414 societies spanning the last 10,000 years. The database is larger and more systematic than anything that has gone before it.

“Societies evolve along a bumpy path – sometimes breaking apart – but the trend is towards larger, more complex arrangements,” said corresponding author Dr Thomas Currie, of the Human Behaviour and Cultural Evolution Group at the University of Exeter’s Penryn Campus in Cornwall.

“Researchers have long debated whether social complexity can be meaningfully compared across different parts of the world. Our research suggests that, despite surface differences, there are fundamental similarities in the way societies evolve.

“Although societies in places as distant as Mississippi and China evolved independently and followed their own trajectories, the structure of social organisation is broadly shared across all continents and historical eras.”

The measures of complexity examined by the researchers were divided into nine categories. These included:

  • Population size and territory
  • Number of control/decision levels in administrative, religious and military hierarchies
  • Information systems such as writing and record keeping
  • Literature on specialised topics such as history, philosophy and fiction
  • Economic development

The researchers found that these different features showed strong statistical relationships, meaning that variation in societies across space and time could be captured by a single measure of social complexity.

This measure can be thought of as “a composite measure of the various roles, institutions, and technologies that enable the coordination of large numbers of people to act in a politically unified manner”.
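
In statistical terms this is the logic of principal component analysis: if the nine measures are strongly correlated, a single first component captures most of the variation. A minimal sketch of the idea, using synthetic stand-in data rather than the actual database (the sample sizes and noise level here are invented):

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in data: rows are societies, columns are
# complexity measures (population, hierarchy levels, writing, ...).
rng = np.random.default_rng(0)
latent = rng.normal(size=(414, 1))           # one hidden "complexity" factor
noise = 0.3 * rng.normal(size=(414, 9))
measures = latent @ np.ones((1, 9)) + noise  # nine correlated measures

pca = PCA().fit(measures)

# If a single dimension dominates, the first component explains
# most of the variance across all nine measures.
print(pca.explained_variance_ratio_[:3])
```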

Dr Currie said learning lessons from human history could have practical uses.

“Understanding the ways in which societies evolve over time and in particular how humans are able to create large, cohesive groups is important when we think about state building and development,” he said.

“This study shows how the sciences and humanities, which have not always seen eye-to-eye, can actually work together effectively to uncover general rules that have shaped human history.”

The new database of historical and archaeological information is known as “Seshat: Global History Databank” and its construction was led by researchers from the University of Exeter, the University of Connecticut, the University of Oxford, Trinity College Dublin and the Evolution Institute. More than 70 expert historians and archaeologists have helped in the data collection process.

The paper, published in Proceedings of the National Academy of Sciences, is entitled: “Quantitative historical analysis uncovers a single dimension of complexity that structures global variation in human social organisation.”

Scientists Seek to Update Evolution (Quanta Magazine)

Recent discoveries have led some researchers to argue that the modern evolutionary synthesis needs to be amended. 

By Carl Zimmer. November 22, 2016

Douglas Futuyma, a biologist at Stony Brook University, defends the “Modern Synthesis” of evolution at the Royal Society earlier this month.

Kevin Laland looked out across the meeting room at a couple hundred people gathered for a conference on the future of evolutionary biology. A colleague sidled up next to him and asked how he thought things were going.

“I think it’s going quite well,” Laland said. “It hasn’t gone to fisticuffs yet.”

Laland is an evolutionary biologist who works at the University of St. Andrews in Scotland. On a chilly gray November day, he came down to London to co-host a meeting at the Royal Society called “New Trends in Evolutionary Biology.” A motley crew of biologists, anthropologists, doctors, computer scientists, and self-appointed visionaries packed the room. The Royal Society is housed in a stately building overlooking St. James’s Park. Today the only thing for Laland to see out of the tall meeting-room windows was scaffolding and gauzy tarps set up for renovation work. Inside, Laland hoped, another kind of renovation would be taking place.

In the mid-1900s, biologists updated Darwin’s theory of evolution with new insights from genetics and other fields. The result is often called the Modern Synthesis, and it has guided evolutionary biology for over 50 years. But in that time, scientists have learned a tremendous amount about how life works. They can sequence entire genomes. They can watch genes turn on and off in developing embryos. They can observe how animals and plants respond to changes in the environment.

As a result, Laland and a like-minded group of biologists argue that the Modern Synthesis needs an overhaul. It has to be recast as a new vision of evolution, which they’ve dubbed the Extended Evolutionary Synthesis. Other biologists have pushed back hard, saying there is little evidence that such a paradigm shift is warranted.

This meeting at the Royal Society was the first public conference where Laland and his colleagues could present their vision. But Laland had no interest in merely preaching to the converted, and so he and his fellow organizers also invited prominent evolutionary biologists who are skeptical about the Extended Evolutionary Synthesis.

Both sides offered their arguments and critiques in a civil way, but sometimes you could sense the tension in the room — the punctuations of tsk-tsks, eye-rolling, and partisan bursts of applause.

But no fisticuffs. At least not yet.

Making Evolution as We Know It

Every science passes through times of revolution and of business as usual. After Galileo and Newton dragged physics out of its ancient errors in the 1600s, it rolled forward from one modest advance to the next until the early 1900s. Then Einstein and other scientists established quantum physics, relativity and other new ways of understanding the universe. None of them claimed that Newton was wrong. But it turns out there’s much more to the universe than matter in motion.

Evolutionary biology has had revolutions of its own. The first, of course, was launched by Charles Darwin in 1859 with his book On the Origin of Species. Darwin wove together evidence from paleontology, embryology and other sciences to show that living things were related to one another by common descent. He also introduced a mechanism to drive that long-term change: natural selection. Each generation of a species was full of variations. Some variations helped organisms survive and reproduce, and those were passed down, thanks to heredity, to the next generation.

Darwin inspired biologists all over the world to study animals and plants in a new way, interpreting their biology as adaptations produced over many generations. But he succeeded in this despite having no idea what a gene was. It wasn’t until the 1930s that geneticists and evolutionary biologists came together and recast evolutionary theory. Heredity became the transmission of genes from generation to generation. Variations were due to mutations, which could be shuffled into new combinations. New species arose when populations built up mutations that made interbreeding impossible.

In 1942, the British biologist Julian Huxley described this emerging framework in a book called Evolution: The Modern Synthesis. Today, scientists still call it by that name. (Sometimes they refer to it instead as neo-Darwinism, although that’s actually a confusing misnomer. The term “neo-Darwinism” was coined in the late 1800s, to refer to biologists who were advancing Darwin’s ideas in Darwin’s own lifetime.)

The Modern Synthesis proved to be a powerful tool for asking questions about nature. Scientists used it to make a vast range of discoveries about the history of life, such as why some people are prone to genetic disorders like sickle-cell anemia and why pesticides sooner or later fail to keep farm pests in check. But starting not long after the formation of the Modern Synthesis, various biologists would complain from time to time that it was too rigid. It wasn’t until the past few years, however, that Laland and other researchers got organized and made a concerted effort to formulate an extended synthesis that might take its place.

The researchers don’t argue that the Modern Synthesis is wrong — just that it doesn’t capture the full richness of evolution. Organisms inherit more than just genes, for example: They can inherit other cellular molecules, as well as behaviors they learn and the environments altered by their ancestors. Laland and his colleagues also challenge the pre-eminent place that natural selection gets in explanations for how life got to be the way it is. Other processes can influence the course of evolution, too, from the rules of development to the environments in which organisms have to live.

“It’s not simply bolting more mechanisms on what we already have,” said Laland. “It requires you to think of causation in a different way.”

Adding to Darwin

Eva Jablonka, a biologist at Tel Aviv University, used her talk to explore the evidence for a form of heredity beyond genes.

Our cells use a number of special molecules to control which of their genes make proteins. In a process called methylation, for example, cells put caps on their DNA to keep certain genes shut down. When cells divide, they can reproduce the same caps and other controls on the new DNA. Certain signals from the environment can cause cells to change these so-called “epigenetic” controls, allowing organisms to adjust their behavior to new challenges.

Some studies indicate that — under certain circumstances — an epigenetic change in a parent may get passed down to its offspring. And those children may pass down this altered epigenetic profile to their children. This would be a kind of heredity that’s beyond genes.

The evidence for this effect is strongest in plants. In one study, researchers were able to trace altered methylation patterns across 31 generations in a plant called Arabidopsis. And this sort of inheritance can make a meaningful difference in how an organism works. In another study, researchers found that inherited methylation patterns could change the flowering time of Arabidopsis, as well as the size of its roots. The variation that these patterns created was even bigger than what ordinary mutations caused.

After presenting evidence like this, Jablonka argued that epigenetic differences could determine which organisms survived long enough to reproduce. “Natural selection could work on this system,” she said.

While natural selection is an important force in evolution, the speakers at the meeting presented evidence for how it could be constrained, or biased in a particular direction. Gerd Müller, a University of Vienna biologist, offered an example from his own research on lizards. A number of species of lizards have evolved feet that have lost some toes. Some have only four toes, while others have just one, and some have lost their feet altogether.

The Modern Synthesis, Müller argued, leads scientists to look at these arrangements as simply the product of natural selection, which favors one variant over others because it has a survival advantage. But that approach doesn’t work if you ask what the advantage was for a particular species to lose the first toe and last toe in its foot, instead of some other pair of toes.

“The answer is, there is no real selective advantage,” said Müller.

The key to understanding why lizards lose particular toes is found in the way that lizard embryos develop toes in the first place. A bud sprouts off the side of the body, and then five digits emerge. But the toes always appear in the same sequence. And when lizards lose their toes through evolution, they lose them in the reverse order. Müller suspects this constraint is because mutations can’t create every possible variation. Some combinations of toes are thus off-limits, and natural selection can never select them in the first place.

Development may constrain evolution. On the other hand, it also provides animals and plants with remarkable flexibility. Sonia Sultan, an evolutionary ecologist from Wesleyan University, offered a spectacular case in point during her talk, describing a plant she studies in the genus Polygonum that takes the common name “smartweed.”

The Modern Synthesis, Sultan said, would lead you to look at the adaptations in a smartweed plant as the fine-tuned product of natural selection. If plants grow in low sunlight, then natural selection will favor plants with genetic variants that let them thrive in that environment — for example, by growing broader leaves to catch more photons. Plants that grow in bright sunlight, on the other hand, will evolve adaptations that let them thrive in those different conditions.

“It’s a commitment to that view that we’re here to confront,” Sultan said.

If you raise genetically identical smartweed plants under different conditions, Sultan showed, you’ll end up with plants that may look like they belong to different species.

For one thing, smartweed plants adjust the size of their leaves to the amount of sunlight they get. In bright light, the plants grow narrow, thick leaves, but in low light, the leaves become broad and thin. In dry soil, the plants send roots down deep in search of water, while in flooded soil, they grow shallow hairlike roots that stay near the surface.

Scientists at the meeting argued that this flexibility — known as plasticity — can itself help drive evolution. It allows plants to spread into a range of habitats, for example, where natural selection can then adapt their genes. And in another talk, Susan Antón, a paleoanthropologist at New York University, said that plasticity may play a significant role in human evolution that’s gone underappreciated till now. That’s because the Modern Synthesis has strongly influenced the study of human evolution for the past half century.

Paleoanthropologists tended to treat differences in fossils as the result of genetic differences. That allowed them to draw an evolutionary tree of humans and their extinct relatives. This approach has a lot to show for it, Antón acknowledged. By the 1980s, scientists had figured out that our early relatives were short and small-brained until about two million years ago. Then one lineage got tall and evolved big brains. That transition marked the origin of our genus, Homo.

But sometimes paleoanthropologists would find variations that were harder to make sense of. Two fossils might look in some ways like they should be in the same species but look too different in other respects. Scientists would usually dismiss those variations as being caused by the environment. “We wanted to get rid of all that stuff and get down to their essence,” Antón said.

But that stuff is now too abundant to ignore. Scientists have found a dizzying variety of humanlike fossils dating to between 1.5 million and 2.5 million years ago. Some are tall, and some are short. Some have big brains and some have small ones. They all have some features of Homo in their skeleton, but each has a confusing mix-and-match assortment.

Antón thinks that the Extended Evolutionary Synthesis can help scientists make sense of this profound mystery. In particular, she thinks that her colleagues should take plasticity seriously as an explanation for the weird diversity of early Homo fossils.

To support this idea, Antón pointed out that living humans have their own kinds of plasticity. The quality of food a woman gets while she’s pregnant can influence the size and health of her baby, and those influences can last until adulthood. What’s more, the size of a woman — influenced in part by her own mother’s diet — can influence her own children. Biologists have found that women with longer legs tend to have larger children, for example.

Antón proposed that the weird variations in the fossil record might be even more dramatic examples of plasticity. All these fossils date to when Africa’s climate fell into a period of wild climate swings. Droughts and abundant rains would have changed the food supply in different parts of the world, perhaps causing early Homo to develop differently.

The Extended Evolutionary Synthesis may also help make sense of another chapter in our history: the dawn of agriculture. In Asia, Africa and the Americas, people domesticated crops and livestock. Melinda Zeder, an archaeologist at the Smithsonian Institution, gave a talk at the meeting about the long struggle to understand how this transformation unfolded.

Before people farmed, they foraged for food and hunted wild game. Zeder explained how many scientists treat the behavior of the foragers in a very Modern Synthesis way: as finely tuned by natural selection to deliver the biggest payoff for their effort to find food.

The trouble is that it’s hard to see how such a forager would ever switch to farming. “You don’t get the immediate gratification of grabbing some food and putting it in your mouth,” Zeder told me.

Some researchers suggested that the switch to agriculture might have occurred during a climate shift, when it got harder to find wild plants. But Zeder and other researchers have actually found no evidence of such a crisis when agriculture arose.

Zeder argues that there’s a better way of thinking about this transition. Humans are not passive zombies trying to survive in a fixed environment. They are creative thinkers who can change the environment itself. And in the process, they can steer evolution in a new direction.

Scientists call this process niche construction, and many species do it. The classic case is a beaver. It cuts down trees and makes a dam, creating a pond. In this new environment, some species of plants and animals will do better than others. And they will adapt to their environment in new ways. That’s true not just for the plants and animals that live around a beaver pond, but for the beaver itself.

When Zeder first learned about niche construction, she says, it was a revelation. “Little explosions were going off in my head,” she told me. The archaeological evidence she and others had gathered made sense as a record of how humans changed their own environment.

Early foragers show signs of having moved wild plants away from their native habitats to have them close at hand, for example. As they watered the plants and protected them from herbivores, the plants adapted to their new environment. Weedy species also moved in and became crops of their own. Certain animals adapted to the environment as well, becoming dogs, cats and other domesticated species.

Gradually, the environment changed from sparse patches of wild plants to dense farm fields. That environment didn’t just drive the evolution of the plants. It also began to drive the cultural evolution of the farmers, too. Instead of wandering as nomads, they settled down in villages so that they could work the land around them. Society became more stable because children received an ecological inheritance from their parents. And so civilization began.

Niche construction is just one of many concepts from the Extended Evolutionary Synthesis that can help make sense of domestication, Zeder said. During her talk, she presented slide after slide of predictions it provides, about everything from the movements of early foragers to the pace of plant evolution.

“It felt like an infomercial for the Extended Evolutionary Synthesis,” Zeder told me later with a laugh. “But wait! You can get steak knives!”

The Return of Natural Selection

Among the members of the audience was a biologist named David Shuker. After listening quietly for a day and a half, the University of St Andrews researcher had had enough. At the end of a talk, he shot up his hand.

The talk had been given by Denis Noble, a physiologist with a mop of white hair and a blue blazer. Noble, who has spent most of his career at Oxford, said he started out as a traditional biologist, seeing genes as the ultimate cause of everything in the body. But in recent years he had switched his thinking. He spoke of the genome not as a blueprint for life but as a sensitive organ, detecting stress and rearranging itself to cope with challenges. “I’ve been on a long journey to this view,” Noble said.

To illustrate this new view, Noble discussed an assortment of recent experiments. One of them was published last year by a team at the University of Reading. They did an experiment on bacteria that swim by spinning their long tails.

First, the scientists cut a gene out of the bacteria’s DNA that’s essential for building tails. The researchers then dropped these tailless bacteria into a petri dish with a meager supply of food. Before long, the bacteria ate all the food in their immediate surroundings. If they couldn’t move, they died. In less than four days in these dire conditions, the bacteria were swimming again. On close inspection, the team found they were growing new tails.

“This strategy is to produce rapid evolutionary genome change in response to the unfavorable environment,” Noble declared to the audience. “It’s a self-maintaining system that enables a particular characteristic to occur independent of the DNA.”

That didn’t sound right to Shuker, and he was determined to challenge Noble after the applause died down.

“Could you comment at all on the mechanism underlying that discovery?” Shuker asked.

Noble stammered in reply. “The mechanism in general terms, I can, yes…” he said, and then started talking about networks and regulation and a desperate search for a solution to a crisis. “You’d have to go back to the original paper,” he then said.

While Noble was struggling to respond, Shuker went back to the paper on an iPad. And now he read the abstract in a booming voice.

“‘Our results demonstrate that natural selection can rapidly rewire regulatory networks,’” Shuker said. He put down the iPad. “So it’s a perfect, beautiful example of rapid neo-Darwinian evolution,” he declared.

Shuker distilled the feelings of a lot of skeptics I talked to at the conference. The high-flying rhetoric about a paradigm shift was, for the most part, unwarranted, they said. Nor were these skeptics limited to the peanut gallery. Several of them gave talks of their own.

“I think I’m expected to represent the Jurassic view of evolution,” said Douglas Futuyma when he got up to the podium. Futuyma is a soft-spoken biologist at Stony Brook University in New York and the author of a leading textbook on evolution. In other words, he was the target of many complaints during the meeting that textbooks paid little heed to things like epigenetics and plasticity. In effect, Futuyma had been invited to tell his colleagues why those concepts were ignored.

“We must recognize that the core principles of the Modern Synthesis are strong and well-supported,” Futuyma declared. Not only that, he added, but the kinds of biology being discussed at the Royal Society weren’t actually all that new. The architects of the Modern Synthesis were already talking about them over 50 years ago. And there’s been a lot of research guided by the Modern Synthesis to make sense of them.

Take plasticity. The genetic variations in an animal or a plant govern the range of forms into which an organism can develop. Mutations can alter that range. And mathematical models of natural selection show how it can favor some kinds of plasticity over others.

If the Extended Evolutionary Synthesis was so superfluous, then why was it gaining enough attention to warrant a meeting at the Royal Society? Futuyma suggested that its appeal was emotional rather than scientific. It made life an active force rather than the passive vehicle of mutations.

“I think what we find emotionally or aesthetically more appealing is not the basis for science,” Futuyma said.

Still, he went out of his way to say that the kind of research described at the meeting could lead to some interesting insights about evolution. But those insights would only arise with some hard work that leads to hard data. “There have been enough essays and position papers,” he said.

Some members in the audience harangued Futuyma a bit. Other skeptical speakers sometimes got exasperated by arguments they felt didn’t make sense. But the meeting managed to reach its end on the third afternoon without fisticuffs.

“This is likely the first of many, many meetings,” Laland told me. In September, a consortium of scientists in Europe and the United States received $11 million in funding (including $8 million from the John Templeton Foundation) to run 22 studies on the Extended Evolutionary Synthesis.

Many of these studies will test predictions that have emerged from the synthesis in recent years. They will see, for example, if species that build their own environments — spider webs, wasp nests and so on — evolve into more species than ones that don’t. They will look at whether more plasticity allows species to adapt faster to new environments.

“It’s doing the research, which is what our critics are telling us to do,” said Laland. “Go find the evidence.”

Correction: An earlier version of this article misidentified the photograph of Andy Whiten as Gerd Müller.

This article was reprinted on TheAtlantic.com.

Large human brain evolved as a result of ‘sizing each other up’ (Science Daily)

Date: August 12, 2016
Source: Cardiff University

The brains of humans enlarged over time thanks to our sizing up the competition, say scientists. Credit: © danheighton / Fotolia

Humans have evolved a disproportionately large brain as a result of sizing each other up in large cooperative social groups, researchers have proposed.

A team led by computer scientists at Cardiff University suggest that the challenge of judging a person’s relative standing and deciding whether or not to cooperate with them has promoted the rapid expansion of human brain size over the last 2 million years.

In a study published in Scientific Reports, the team, which also includes leading evolutionary psychologist Professor Robin Dunbar from the University of Oxford, specifically found that evolution favors those who prefer to help out others who are at least as successful as themselves.

Lead author of the study Professor Roger Whitaker, from Cardiff University’s School of Computer Science and Informatics, said: “Our results suggest that the evolution of cooperation, which is key to a prosperous society, is intrinsically linked to the idea of social comparison — constantly sizing each other up and making decisions as to whether we want to help them or not.

“We’ve shown that over time, evolution favors strategies to help those who are at least as successful as themselves.”

In their study, the team used computer modelling to run hundreds of thousands of simulations, or ‘donation games’, to unravel the complexities of decision-making strategies for simplified humans and to establish why certain types of behaviour among individuals begins to strengthen over time.

In each round of the donation game, two simulated players were randomly selected from the population. The first player then made a decision on whether or not they wanted to donate to the other player, based on how they judged their reputation. If the player chose to donate, they incurred a cost and the receiver was given a benefit. Each player’s reputation was then updated in light of their action, and another game was initiated.
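
As a rough illustration of how such a donation game can be simulated, here is a minimal Python sketch. It assumes invented values for the benefit, cost and number of rounds, a one-point reputation update, and a simple social-comparison rule (donate only to players whose reputation is at least your own); none of these names or parameters come from the published model.

```python
import random

# Illustrative parameters; not taken from the study.
POP_SIZE = 100      # simulated players
BENEFIT = 2.0       # payoff gained by the receiver of a donation
COST = 1.0          # payoff paid by the donor
ROUNDS = 100_000    # donation games played

class Player:
    def __init__(self):
        self.reputation = 0   # public standing, updated after each game
        self.payoff = 0.0

    def will_donate(self, other):
        # Social-comparison heuristic: help those who are at least
        # as reputable as yourself.
        return other.reputation >= self.reputation

population = [Player() for _ in range(POP_SIZE)]

for _ in range(ROUNDS):
    donor, receiver = random.sample(population, 2)
    if donor.will_donate(receiver):
        donor.payoff -= COST
        receiver.payoff += BENEFIT
        donor.reputation += 1   # donating raises reputation
    else:
        donor.reputation -= 1   # refusing lowers it

best = max(population, key=lambda p: p.payoff)
print(f"best payoff: {best.payoff:.1f}, reputation: {best.reputation}")
```

In the study itself, many such generations follow one another, with successful strategies reproducing in proportion to payoff; the sketch only shows how reputations and the comparison rule interact within a single run.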

Compared to other species, including our closest relatives, chimpanzees, the human brain accounts for a far larger share of total body weight. Humans also have the largest cerebral cortex of all mammals, relative to the size of their brains. This area houses the cerebral hemispheres, which are responsible for higher functions like memory, communication and thinking.

The research team propose that making relative judgements through helping others has been influential for human survival, and that the complexity of constantly assessing individuals has been a sufficiently difficult task to promote the expansion of the brain over many generations of human reproduction.

Professor Robin Dunbar, who previously proposed the social brain hypothesis, said: “According to the social brain hypothesis, the disproportionately large brain size in humans exists as a consequence of humans evolving in large and complex social groups.

“Our new research reinforces this hypothesis and offers an insight into the way cooperation and reward may have been instrumental in driving brain evolution, suggesting that the challenge of assessing others could have contributed to the large brain size in humans.”

According to the team, the research could also have future implications in engineering, specifically where intelligent and autonomous machines need to decide how generous they should be towards each other during one-off interactions.

“The models we use can be executed as short algorithms called heuristics, allowing devices to make quick decisions about their cooperative behaviour,” Professor Whitaker said.

“New autonomous technologies, such as distributed wireless networks or driverless cars, will need to self-manage their behaviour but at the same time cooperate with others in their environment.”


Journal Reference:

  1. Roger M. Whitaker, Gualtiero B. Colombo, Stuart M. Allen, Robin I. M. Dunbar. A Dominant Social Comparison Heuristic Unites Alternative Mechanisms for the Evolution of Indirect Reciprocity. Scientific Reports, 2016; 6: 31459. DOI: 10.1038/srep31459

The little creature that defies God (El País)

A marine organism shows why human beings are not at the pinnacle of evolution

MANUEL ANSEDE

Barcelona 13 JUN 2016 – 21:07 CEST

The biologists Ricard Albalat and Cristian Cañestro with specimens of ‘Oikopleura’. JUAN BARBOSA

“Only chance can be interpreted as a message. That which happens out of necessity, that which is expected and repeats itself every day, is mute. Only chance has a voice,” wrote Milan Kundera in The Unbearable Lightness of Being. And something speaks, or rather shouts, on a beach in Badalona, near Barcelona: the one dominated by the Pont del Petroli (Oil Bridge). Petroleum products were unloaded along this 250-metre pier, which juts out into the Mediterranean, until the end of the 20th century. And at its foot has stood, since 1870, the factory of Anís del Mono, the liqueur whose label shows an ape with the face of Charles Darwin, a reference to the theory of evolution, which was controversial at the time.

Today, the Pont del Petroli is a lovely viewpoint with a bronze statue dedicated to the monkey with the Darwinian face. And, by one of those chances that speak, among its regulars is a team of evolutionary biologists from the Department of Genetics at the University of Barcelona. The scientists walk out along the gangway over the sea and lower a bucket to catch a marine animal, Oikopleura dioica, just three millimetres long yet equipped with a mouth, an anus, a brain and a heart. It looks insignificant but, like Darwin, it shakes the discourse of religions. It puts the human being in the place that belongs to it: with the rest of the animals.

“We have been badly influenced by religion into thinking that we were at the top of evolution. In reality, we are on the same level as the other animals,” says the biologist Cristian Cañestro. He and his colleague Ricard Albalat run one of only three scientific centres in the world dedicated to studying Oikopleura dioica. The other two are in Norway and Japan. The Spanish centre is a small cold room, with hundreds of practically invisible specimens kept in containers of water, in a corner of the Faculty of Biology at the University of Barcelona.

The marine organism ‘Oikopleura dioica’ suggests that the loss of ancestral genes, shared with humans, may be a motor of evolution

“The view until now was that, as we evolved, we gained in complexity by acquiring genes. That was the thinking when the first genomes were sequenced: the fly, the worm and the human. But we have seen that it is not so. Most of our genes are also present in jellyfish. Our common ancestor had them. It is not that we gained genes; it is that they lost them. Genetic complexity is ancestral,” says Cañestro.

In 2006, the biologist was investigating the role of a vitamin A derivative, retinoic acid, in embryonic development. This substance tells the cells of an embryo what they must do to become an adult body. Retinoic acid activates the genes needed, for example, to form the limbs, heart, eyes and ears of animals. Cañestro was studying this process in Oikopleura. And he was left open-mouthed.

A female ‘Oikopleura dioica’ full of eggs. CAÑESTRO & ALBALAT LAB

“Animals use a large number of genes to synthesise retinoic acid. I noticed that Oikopleura dioica was missing one of those genes. Then I saw that others were missing too. We found none at all,” he recalls. This three-millimetre animal builds its heart, inexplicably, without retinoic acid. “If you see a car moving without wheels, from that day on your perception of wheels changes,” says Cañestro.

The last common ancestor of humans and this tiny ocean dweller lived about 500 million years ago. Since then, Oikopleura has lost 30% of the genes that united us. And it has done so successfully. Step into any beach in the world and it will be there, surrounding your body. In the battle of natural selection, the Oikopleura have won. Their density reaches 20,000 individuals per cubic metre of water in some marine ecosystems. They are losers, but only of genes.

Our last common ancestor lived 500 million years ago. Since then, ‘Oikopleura’ has lost 30% of the genes that united us

Albalat and Cañestro have just published, in the specialist journal Nature Reviews Genetics, an article analysing gene loss as a motor of evolution. Their text has attracted worldwide interest and was recommended by F1000Prime, an international service that highlights the best papers in biology and medicine. The article opens with a phrase from the Roman emperor Marcus Aurelius, the Stoic philosopher: “Loss is nothing but change, and change is Nature’s delight.”

The two biologists argue that gene loss may even have been essential to the origin of the human species. “Chimpanzees and humans share more than 98% of their genome. Perhaps we should look for the differences in the genes that were lost in different ways during the evolution of humans and of the other primates. Some studies suggest that the loss of one gene made the musculature of our jaw smaller, which allowed the volume of our skull to increase,” says Albalat. Perhaps losing genes made us more intelligent than the rest of mortals.

Researchers at the laboratory of Cristian Cañestro and Ricard Albalat. UB

In 2012, a study led by the American geneticist Daniel MacArthur showed that, on average, any healthy person carries 20 inactivated genes, apparently without consequence. Albalat and Cañestro, of the Biodiversity Research Institute (IRBio) at the University of Barcelona, cite two well-studied examples. In some people, the genes encoding the proteins CCR5 and DUFFY have been knocked out by mutations. These are the proteins used, respectively, by HIV and by the parasite that causes malaria to enter cells. Losing these genes makes humans resistant to those diseases.

In Cañestro and Albalat’s laboratory hangs a poster that mimics the one for Quentin Tarantino’s film Reservoir Dogs: the two scientists and other members of their team appear dressed in white shirts and black ties. The montage is titled Reservoir Oiks, in allusion to Oikopleura. The two biologists believe this marine organism will make it possible to pose, and answer, new questions about our shared instruction manual: the genome.

‘Oikopleura’ makes it possible to study which genes are essential: why some mutations are irrelevant while others have devastating effects on our health

The Oikopleura brain has about 100 neurons; the human brain, 86 billion. Yet we are far more alike than we appear at first glance. Between 60% and 80% of human gene families have a clear representative in the Oikopleura genome. “This animal lets us study which human genes are essential,” says Albalat. In other words: why some mutations are irrelevant while others wreak havoc on our health.

Living beings possess a cellular system that repairs mutations arising in their DNA. Oikopleura dioica has lost 16 of the 83 ancestral genes that regulate this process. That inability to self-repair could explain its extreme gene loss, according to the Nature Reviews Genetics article.

Cañestro’s eyes light up when he talks about these absences. Genes usually act in groups to carry out a function. If seven genes of a known group of eight are missing in Oikopleura because the function itself was lost, the persistence of the eighth gene may reveal a second, essential function that had gone unnoticed. That gene is like a crossroads: one highway has been dismantled, yet it survives because it is vital to another. “That second function was already present in the common ancestor and may be important in humans,” says Cañestro.
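To make that inference concrete, here is a minimal sketch in Python. The gene and pathway names are invented for illustration, not the lab’s actual data or pipeline; the script simply flags genes that Oikopleura retained even though most of their pathway partners were lost — the surviving “crossroads” candidates that may hide a second function.

```python
# Toy illustration of the "surviving crossroads" logic described above.
# Pathway memberships and loss calls are hypothetical examples.

# Ancestral pathways and their member genes (invented names).
pathways = {
    "retinoic_acid_synthesis": ["g1", "g2", "g3", "g4", "g5", "g6", "g7", "g8"],
    "dna_repair": ["r1", "r2", "r3", "r4"],
}

# Genes still present in the Oikopleura genome (hypothetical calls).
retained = {"g8", "r1", "r2", "r3", "r4"}

# A retained gene is a "crossroads" candidate when most of its pathway
# is gone: the pathway's function was lost, so something else must be
# keeping that gene under selection.
LOSS_THRESHOLD = 0.75  # fraction of the pathway that must be missing

for name, genes in pathways.items():
    lost = [g for g in genes if g not in retained]
    kept = [g for g in genes if g in retained]
    if len(lost) / len(genes) >= LOSS_THRESHOLD and kept:
        print(f"{name}: {len(lost)}/{len(genes)} genes lost; "
              f"candidate second-function genes: {kept}")
```

Run on these toy inputs, the script reports the retinoic acid pathway (7 of 8 genes lost) and singles out the eighth gene as the candidate worth a second look.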

“There are no higher or lower animals. Our Lego pieces are basically the same, even though different things can be built with them,” he says. Think about your place in the world the next time you dive into the sea. That white snow floating in the water, visible against the light, is the excrement of Oikopleura.

Ancestors of Modern Humans Interbred With Extinct Hominins, Study Finds (N.Y.Times)

Carl Zimmer

Skulls of Neanderthal man. Credit: European Pressphoto Agency

The ancestors of modern humans interbred with Neanderthals and another extinct line of humans known as the Denisovans at least four times in the course of prehistory, according to an analysis of global genomes published Thursday in the journal Science. 

The interbreeding may have given modern humans genes that bolstered immunity to pathogens, the authors concluded.

“This is yet another genetic nail in the coffin of our oversimplistic models of human evolution,” said Carles Lalueza-Fox, a research scientist at the Institute of Evolutionary Biology in Barcelona, Spain, who was not involved in the study.

The new study expands on a series of findings in recent years showing that the ancestors of modern humans once shared the planet with a surprising number of near relatives — lineages like the Neanderthals and Denisovans that became extinct tens of thousands of years ago.

Before disappearing, however, they interbred with our forebears on at least several occasions. Today, we carry DNA from these encounters.

The first clues to ancient interbreeding surfaced in 2010, when scientists discovered that some modern humans — mostly Europeans — carried DNA that matched material recovered from Neanderthal fossils.

Later studies showed that the forebears of modern humans first encountered Neanderthals after expanding out of Africa more than 50,000 years ago.

But the Neanderthals were not the only extinct humans that our own ancestors found. A finger bone discovered in a Siberian cave, called Denisova, yielded DNA from yet another group of humans.

Research later indicated that all three groups — modern humans, Neanderthals and Denisovans — shared a common ancestor who lived roughly 600,000 years ago. And, perhaps no surprise, some ancestors of modern humans also interbred with Denisovans.

Some of their DNA has survived in people in Melanesia, a region of the Pacific that includes New Guinea and the islands around it.

Those initial discoveries left major questions unanswered, such as how often our ancestors interbred with Neanderthals and Denisovans. Scientists have developed new ways to study the DNA of living people to tackle these mysteries.

Joshua M. Akey, a geneticist at the University of Washington, and his colleagues analyzed a database of 1,488 genomes from people around the world. The scientists added 35 genomes from people in New Britain and other Melanesian islands in an effort to learn more about Denisovans in particular.

The researchers found that all of the non-Africans in their study had Neanderthal DNA, while the Africans had very little or none. That finding supported previous studies.

But when Dr. Akey and his colleagues compared DNA from modern Europeans, East Asians and Melanesians, they found that each population carried its own distinctive mix of Neanderthal genes.

The best explanation for these patterns, the scientists concluded, was that the ancestors of modern humans acquired Neanderthal DNA on three occasions.

The first encounter happened when the common ancestor of all non-Africans interbred with Neanderthals.

The second occurred among the ancestors of East Asians and Europeans, after the ancestors of Melanesians split off. Later, the ancestors of East Asians — but not Europeans — interbred a third time with Neanderthals.
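A toy example may help show why distinct per-population mixes imply separate pulses. The sketch below uses invented archaic-segment identifiers, not the study’s data or its actual statistical method; it simply sorts segments into the three sharing patterns that map onto the three proposed interbreeding events.

```python
# Toy illustration: distinct mixes of archaic segments across populations
# suggest separate admixture pulses. Segment IDs are invented.

populations = {
    "European":   {"n1", "n2", "n3", "n5"},
    "East Asian": {"n1", "n2", "n4", "n6", "n7"},
    "Melanesian": {"n1", "n8", "d1", "d2"},  # d* = Denisovan-derived
}

# Segments shared by all three populations fit a single early pulse
# into the common ancestor of all non-Africans.
shared_by_all = set.intersection(*populations.values())
print("shared by all non-Africans:", sorted(shared_by_all))

# Segments shared by Europeans and East Asians but absent in Melanesians
# fit a second pulse after the Melanesian lineage split off.
eur_eas_only = (populations["European"] & populations["East Asian"]) - populations["Melanesian"]
print("European + East Asian only:", sorted(eur_eas_only))

# Segments private to East Asians fit a third, later pulse.
eas_private = populations["East Asian"] - populations["European"] - populations["Melanesian"]
print("East Asian private:", sorted(eas_private))
```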

Earlier studies had hinted at the possibility that the forebears of modern humans had multiple encounters with Neanderthals, but hard data had been lacking.

“A lot of people have been arguing for that, but now they’re really providing the evidence for it,” said Rasmus Nielsen, a geneticist at the University of California, Berkeley, who was not involved in the new study.

The Melanesians took a different course. After a single interbreeding with Neanderthals, Dr. Akey found, their ancestors went on to interbreed just once with Denisovans as well.

Where that encounter could have taken place remains an enigma. The only place Denisovan remains have been found is Siberia, a long way from New Guinea.

It is possible that Denisovans ranged down to Southeast Asia, Dr. Akey said, crossing paths with modern humans who later settled in Melanesia.

Dr. Akey and his colleagues also identified some regions of Neanderthal and Denisovan DNA that became more common in modern humans as generations passed, suggesting that they provided some kind of a survival advantage.
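One way to see how a modestly beneficial archaic variant could become more common as generations passed is a standard one-locus selection recursion. The snippet below is a toy model, not the study’s method, and the selection coefficient and starting frequency are invented numbers chosen only for illustration.

```python
# Toy model: an introgressed variant with a small selective advantage
# rises in frequency over generations (illustrative values only).

s = 0.01   # hypothetical selective advantage of the archaic allele
p = 0.02   # hypothetical starting frequency right after admixture

for generation in range(2001):
    if generation % 400 == 0:
        print(f"generation {generation:4d}: frequency {p:.3f}")
    # one-locus haploid selection: p' = p(1+s) / (1 + p*s)
    p = p * (1 + s) / (1 + p * s)
```

Under these toy numbers, even a 1 percent advantage carries the allele from rare to nearly universal within a couple of thousand generations, which is the kind of frequency rise that marks a survival advantage.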

Many of the regions contain immune system genes, Dr. Akey noted.

“As modern humans are spreading out across the world, they’re encountering pathogens they haven’t experienced before,” he said. Neanderthals and Denisovans may have had genes that were adapted to fight those enemies.

“Maybe they really helped us survive and thrive in these new environments,” he said.

Dr. Akey and his colleagues found that Neanderthal and Denisovan DNA was glaringly absent from four regions of the modern human genome.

That absence may signal that these stretches of the genome are instrumental in making modern humans unique. Intriguingly, one of those regions includes a gene called FOXP2, which is involved in speech.

Scientists suspect that Neanderthals and Denisovans were not the only extinct human lineages our ancestors interbred with.

PingHsun Hsieh, a biologist at the University of Arizona, and his colleagues reported last month that the genomes of African pygmies contained pieces of DNA that came from an unknown source within the last 30,000 years.

Dr. Akey and his colleagues are now following up with an analysis of African populations. “This potentially allows us to find new twigs on the human family tree,” he said.