Monthly archive: October 2013

This Is How Cats See the World (Wired)



The blurriness at the edge of the photos represents the area of peripheral vision in humans (20 degrees, top) and cats (30 degrees, bottom). 

No one ever talks about what the world looks like if you’re a cat. Instead, we speak of the bird’s-eye view and use fish-eye lenses to make things look weird.

But we rarely consider how the internet’s favorite subject sees the world. Luckily, artist Nickolay Lamm has volunteered to act as cat-vision conduit. Here, Lamm presents his idea of what different scenes might look like if you were a cat, taking into consideration the way feline eyes work, and using input from veterinarians and ophthalmologists.

For starters, cats’ visual fields are broader than ours, spanning roughly 200 degrees instead of 180 degrees, and their visual acuity isn’t as good. So, the things humans can sharply resolve at distances of 100-200 feet look blurry to cats, which can see these objects at distances of up to 20 feet. That might not sound so great, but there’s a trade-off: Because of the various photoreceptors parked in cats’ retinas, they kick our asses at seeing in dim light. Instead of the color-resolving, detail-loving cone cells that populate the center of human retinas, cats (and dogs) have many more rod cells, which excel in dim light and are responsible for night-vision capability. The rod cells also refresh more quickly, which lets cats pick up very rapid movements — like, for example, the quickly shifting path a marauding laser dot might trace.

Lastly, cats see colors differently than we do, which is why the cat-versions of these images look less vibrant than the people-versions. Scientists used to think cats were dichromats — able to see only two colors — but they’re not, exactly. While feline photoreceptors are most sensitive to wavelengths in the blue-violet and greenish-yellow ranges, it appears they might be able to see a little bit of green as well. In other words, cats are mostly red-green color blind, as are many of us, with a little bit of green creeping in.

All Photos: Nickolay Lamm, in consultation with Kerry L. Ketring, DVM, DACVO of All Animal Eye Clinic, Dr. DJ Haeussler of The Animal Eye Institute, and the Ophthalmology group at Penn Vet.


Some Monkeys Have Conversations That Resemble Ours (Wired)



A pair of common marmosets. Image: Bart van Dorp/Flickr

The sounds of marmoset monkeys chattering may hint at the mysterious origins of human language.

A new study shows that marmosets exchange calls in a precisely timed, back-and-forth fashion typical of human conversation, but not found in other primates. The monkeys don’t appear to have a language, but the timing suggests the foundations of our own.

“That could be the foundation of more sophisticated things, like syntax,” said psychologist Asif Ghazanfar of Princeton University, co-author of the study, which was published today in Current Biology. “You can’t have any of those other really cool aspects of language without first having this.”

How language, so complex and information-rich, evolved in Homo sapiens and, as far as we know, no other species, is one of anthropology’s outstanding questions. The traditional, seemingly intuitive answer is that it arose from the vocalizations of ancestors who were capable of a few rudimentary noises and wanted to say more.

Confounding that narrative, though, is the comparatively less-vocal nature of many other primates, including our closest living relatives, chimpanzees and bonobos. They do vocalize, of course, and even say some interesting things, but not with the same flow expected of some proto-human linguistic capability.

That conundrum has led researchers to propose another possible origin of language, one rooted not in our voices but rather our bodies, and in particular our hands. According to this narrative, gesture would have been as important to our ancestors as sound. Indeed, neurological processes underlying speech and language are also intimately linked with motor skills, raising the possibility that language formed on the cognitive scaffold of gesture — and chimpanzees do have a large repertoire of hand movements.

But many scientists, including Ghazanfar and the study’s lead author, fellow Princeton psychologist Daniel Takahashi, aren’t convinced. If human language did follow on gesture, they wonder, why don’t chimps talk more? There’s also no evidence in chimpanzees for vocal turn-taking, or waiting for another person to finish speaking before replying, which is universal in human languages. “If we don’t take turns, if we’re overlapping, it’s very difficult to understand each other,” said Ghazanfar. “Turn-taking is foundational.”

Yet even if chimps don’t take turns, Ghazanfar and Takahashi found that marmosets do. In the new study, they placed pairs of marmosets in the opposite corners of a room, separated by a curtain that allowed them to hear but not see each other, and recorded the ensuing chatter.

The calls proved to follow turn-taking patterns, with a pause of several seconds between the end of one monkey’s whistles and the start of the other’s. And unlike the duets of birds, which are often highly synchronized, the exchanges had nothing to do with mating or territoriality. The monkeys were conversing.

Monkey Conversation:
Whistles encoding information about the caller’s identity are exchanged back and forth according to rules of timing also found in human conversation.

Audio: Takahashi et al./Ethology

As for what they said, marmoset whistles are thought to encode information about a caller’s identity, age, gender and location. Ghazanfar thinks the conversations are a sort of “vocal grooming,” a way of easing stress or conveying affection, but delivered at a distance. It only works when monkeys know they’re being addressed individually, which is conveyed by the turn-taking form.

“It could be a pre-adaptation for language,” said evolutionary biologist Thore Bergman of the University of Michigan, who was not involved in the study. Bergman’s own research involves human-sounding lip smacks made by monkeys called geladas.

As for why marmosets and humans take turns, but not chimpanzees, Ghazanfar suspects it’s a function of our social systems. Marmosets are cooperative breeders: Group members take care of offspring unrelated to them, creating community-oriented dynamics of behavior and communication. Ancestral humans may have lived the same way.

Without a time machine, of course, questions about the origin of human language won’t ever be settled. As Bergman noted, the findings don’t exclude the possible importance of gesture. It’s possible that human language arose from the fortuitous interactions of gesture, vocalization and social structure with evolutionary pressure.

Indeterminacy aside, though, it’s fun to speculate, and also to wonder whether the seeds of complex language now exist in animals other than ourselves. Many whales and dolphins, along with syntax-using monkeys and even prairie dogs, communicate in very sophisticated ways.

“If you went back 10 million years, you’d be hard-pressed to predict an ape would end up with the planet’s most complex vocal communication system,” said Bergman. “Why that happened is a really big puzzle.”

Citation: “Coupled Oscillator Dynamics of Vocal Turn-Taking in Monkeys.” By Daniel Takahashi, Darshana Narayanan and Asif Ghazanfar. Current Biology, 17 October 2013.
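The paper’s title frames turn-taking as coupled oscillator dynamics. A minimal toy model of that idea (a sketch for illustration, not the authors’ fitted model): two phase oscillators with repulsive coupling settle into antiphase, so the two “callers” end up strictly alternating no matter how their calls start out.

```python
import math

def phase_difference(steps=20000, dt=0.001, k=2.0, w=2 * math.pi):
    """Two phase oscillators with identical natural frequency w and
    repulsive coupling strength k, integrated with Euler steps.
    Repulsion drives them toward antiphase: when one is mid-call,
    the other is silent."""
    th1, th2 = 0.0, 0.5                    # arbitrary starting phases (radians)
    for _ in range(steps):
        d1 = w - k * math.sin(th2 - th1)   # minus sign makes coupling repulsive
        d2 = w - k * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return (th1 - th2) % (2 * math.pi)

# The phase gap converges to pi (half a cycle), i.e. strict alternation.
gap = phase_difference()
```

With repulsive coupling the in-phase state is unstable and the antiphase state is stable, which is the alternation the marmoset recordings show.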

Skull Fossil Suggests Simpler Human Lineage (New York Times)

The 1.8-million-year-old skull was found during a dig in the Republic of Georgia. Georgian National Museum


Published: October 17, 2013

After eight years spent studying a 1.8-million-year-old skull uncovered in the Republic of Georgia, scientists have made a discovery that may rewrite the evolutionary history of our human genus Homo.

A Simpler Family Tree?

A skull discovered in the Republic of Georgia may indicate that early species in the genus Homo were in fact closely related members of a single evolutionary lineage.


An artist’s rendition of what the original owner of Skull 5 may have looked like.

Skull 5, which was discovered alongside the remains of four other hominids in Dmanisi, Georgia. Courtesy of Guram Bumbiashvili, Georgian National Museum

An aerial view of the Dmanisi excavation site (foreground) and a medieval town. Fernando Javier Urquijo

It would be a simpler story with fewer ancestral species. Early, diverse fossils — those currently recognized as coming from distinct species like Homo habilis, Homo erectus and others — may actually represent variation among members of a single, evolving lineage.

In other words, just as people look different from one another today, so did early hominids look different from one another, and the dissimilarity of the bones they left behind may have fooled scientists into thinking they came from different species.

This was the conclusion reached by an international team of scientists led by David Lordkipanidze, a paleoanthropologist at the Georgian National Museum in Tbilisi, as reported Thursday in the journal Science.

The key to this revelation was a cranium excavated in 2005 and known simply as Skull 5, which scientists described as “the world’s first completely preserved adult hominid skull” of such antiquity. Unlike other Homo fossils, it had a number of primitive features: a long, apelike face, large teeth and a tiny braincase, about one-third the size of that of a modern human being. This confirmed that, contrary to some conjecture, early hominids did not need big brains to make their way out of Africa.

The discovery of Skull 5 alongside the remains of four other hominids at Dmanisi, a site in Georgia rich in material from the earliest hominid travels into Eurasia, gave the scientists an opportunity to compare and contrast the physical traits of ancestors that apparently lived at the same location and around the same time.

Dr. Lordkipanidze and his colleagues said the differences between these fossils were no more pronounced than those between any given five modern humans or five chimpanzees. The hominids who left the fossils, they noted, were quite different from one another but still members of one species.

“Had the braincase and the face of Skull 5 been found as separate fossils at different sites in Africa, they might have been attributed to different species,” a co-author of the journal report, Christoph Zollikofer of the University of Zurich, said in a statement. Such was often the practice of researchers, using variations in traits to define new species.

Although the Dmanisi finds look quite different from one another, Dr. Zollikofer said, the hominids who left them were living at the same time and place, and “so could, in principle, represent a single population of a single species.” He and his Zurich colleague, Marcia Ponce de León, conducted the comparative analysis of the Dmanisi specimens.

“Since we see a similar pattern and range of variation in the African fossil record,” Dr. Zollikofer continued, “it is sensible to assume that there was a single Homo species at that time in Africa.” Moreover, he added, “since the Dmanisi hominids are so similar to the African ones, we further assume that they both represent the same species.”

But what species? Some team members simply call their finds “early Homo.” Others emphasized the strong similarities to Homo erectus, which lived from about two million until less than one million years ago. Tim D. White, a paleoanthropologist at the University of California, Berkeley, called it “the most primitive H. erectus yet known,” noting that “it is more similar than any other yet found to early Homo from eastern Africa,” a group of hominids estimated to have lived 2.3 million years ago.

All five of the skulls and skeletal bones were found in underground dens, suggesting grisly scenes from the perilous lives these early Homos led. They resided among carnivores, including saber-toothed cats and an extinct giant cheetah. All five of the individuals had probably been attacked and killed by the carnivores, their carcasses dragged into the dens for the after-hunt feast, with nothing left but dinner scraps for curious fossil hunters.

Dr. White and other scientists not involved in the research hailed the importance of the skull discovery and its implications for understanding early Homo evolution. In an article analyzing the report, Science quoted Ian Tattersall of the American Museum of Natural History in New York as saying that the skull “is undoubtedly one of the most important ever discovered.”

A few scientists quibbled that the skull looks more like Homo habilis or questioned the idea that fossils in Africa all belong to Homo erectus, but there was broad recognition that the new findings were a watershed in the study of evolution. “As the most complete early Homo skull ever found,” Dr. White wrote in an e-mail, “it will become iconic for Dmanisi, for earliest Homo erectus and more broadly for how we became human.”

Dr. White, who has excavated hominid fossils in Ethiopia for years, said he was impressed with “the total evidentiary package from the site that is the really good news story here.” Further, he said, he hoped the discovery would “now focus the debate on evolutionary biology beyond the boring ‘lumpers vs. splitters’ ” — a reference to the tendencies of fossil hunters to either lump new finds into existing species or split them off into new species.

In their report, the Dmanisi researchers said the Skull 5 individual “provides the first evidence that early Homo comprised adult individuals with small brains but body mass, stature and limb proportions reaching the lower range limit of modern variation.”

Skeletal bones associated with the five Dmanisi skulls show that these hominids were short in stature, but that their limbs enabled them to walk long distances as fully upright bipeds. The shape of the small braincase distinguished them from the more primitive Australopithecus genus, which preceded Homo and coexisted with it in Africa for many millennia.

Could Ants Teach the Biofuel Industry a Thing or Two? (Quest)

Posted on Sep 26, 2013

Leafcutter ants, native to Central and South America, can’t digest the leaves they rely on for food, so they cultivate these gardens of fungi and bacteria to break down plant matter for them. Photo courtesy of Alex Wild; used with permission.

In the lobby of the Microbial Sciences building at the University of Wisconsin, leafcutter ants in a display colony hike back and forth. Improbably large leaf fragments wobble on their backs as the ants ferry them between a dwindling pile of oak leaves and a garden of fungus studded with leaves in assorted states of decay.

Made up of a single species of fungus and a handful of bacterial strains, the fungus garden breaks down the ants’ leafy harvest through an efficient natural process. It’s a process that researchers believe could be a model for producing biofuel in a more sustainable way.

As we transition away from petroleum dependence, ethanol-based biofuel has risen to the forefront as one of the most accessible sources of renewable energy. It’s produced by fermenting plant sugars, which are strung together into long chains called polysaccharides. Before the fermentation process can begin, these chains have to be snipped apart, a process that varies in difficulty depending on the type of plant being used.

Polysaccharide chains found in corn kernels — the primary biofuel crop in the U.S. — are relatively simple to break up. But corn depletes the soil and guzzles water and fertilizer, and using it for fuel siphons calories from the food supply to gas tanks.

On the other hand, perennial grasses and agricultural “waste” like cornstalks offer a biofuel source that has a lighter impact on the environment. But these woodier fibers — referred to as “cellulosic” biomass — are a tangle of robust polysaccharides that are trickier to deconstruct. Further complicating this problem, the molecular structure of plant biomass isn’t uniform. What breaks down the polysaccharides near the surface of a cornstalk or blade of grass might not work at all on those buried more deeply.


University of Wisconsin researcher Frank Aylward peers into one of the lab’s many leafcutter ant colonies.

But finding efficient ways to extract energy from plants and other forms of biomass is not a new problem. In fact, it’s a problem that Earth’s plant eaters solved millions of years ago. And according to University of Wisconsin researcher Frank Aylward, if you’re looking for a model system, you can’t do better than leafcutter ants.

They may not have the imposing mien of herbivores like giraffes or elephants, but in Central and South America, leafcutter ants dominate, munching through more of the region’s foliage than any other organism.

But the ants can’t digest leaves by themselves — they have to rely on the garden’s microbes. “We sort of think of the fungus gardens as being an external gut,” Aylward explains. The garden digests biomass and reconstitutes its molecules in little nutrient packets holding a cocktail of carbohydrates, lipids, and proteins.

“The ants are essentially doing what we want to do with biofuel,” says Aylward. “They’re taking all of this recalcitrant plant biomass that’s full of all of these really complicated polymers and they’re degrading it and converting it into energy.” The transformation from leafy greens to energy source is mediated by hundreds of enzymes produced by the fungus garden’s microbes. If these enzymes chow down so efficiently on the leaves of Central America, Aylward and his coworkers wondered, could they be just as effective at breaking apart the sugars of cellulosic biomass in an industrial setting?

One model for a commercial biofuel process patterned after the fungus garden could entail splicing the genetic codes for the garden’s most effective enzymes into other microbes, prompting them to churn out biomass-digesting proteins.

But first, scientists needed to identify which enzymes the garden uses to digest leaves for the ants and which microbial residents produce them. By sequencing the genomes of the fungus and bacteria and comparing that data to the garden’s enzyme soup, Aylward and his coworkers were able to identify a fungus called Leucoagaricus gongylophorus as the garden’s biomass-degrading workhorse.

Aylward extracts a fragment of the fungus garden. This segment was near the surface, and still shows visible leaf matter; the biomass in the garden sinks as it’s broken down.

They also found that the fungus calibrates its enzyme cocktail for different stages of leaf decay. The biomass profile changes at each level in the garden — the freshest leaves sit near the top and the mostly decomposed waste material at the bottom. And Aylward found that the garden’s enzymes changed, too. That insight could provide the biofuel industry with some clues about which enzymes might excel early in the polysaccharide-decomposition process and which ones to apply later on.

Incidentally, this division of labor also reveals which enzymes the garden deploys together at each level. This is a huge boon to anyone designing industrial applications, since enzymes tend to work much better in specific combinations — and the garden has had 50 million years of symbiosis with the ants to find the most efficient combinations.

Aylward has already been approached by companies interested in synthesizing some of the garden’s enzymes and using them in biofuel production.

“It’s difficult to think that we can actually find a process that improves on nature,” Aylward points out, “so it probably makes sense to learn from it.”

Ice Cap Shows Ancient Mines Polluted the Globe (New York Times)


Published: December 09, 1997

Samples extracted from Greenland’s two-mile-deep ice cap have yielded evidence that ancient Carthaginian and Roman silver miners working in southern Spain fouled the global atmosphere with lead for some 900 years.

The Greenland ice cap accumulates snow year after year, and substances from the atmosphere are entrapped in the permanent ice. From 1990 to 1992, a drill operated by the European Greenland Ice-Core Project recovered a cylindrical ice sample 9,938 feet long, pieces of which were distributed to participating laboratories. The ages of successive layers of the ice cap have been accurately determined, so the chemical makeup of the atmosphere at any given time in the past 9,000 years can be estimated by analyzing the corresponding part of the core sample.

Using exquisitely sensitive techniques to measure four different isotopes of lead in the Greenland ice, scientists in Australia and France determined that most of the man-made lead pollution of the atmosphere in ancient times had come from the Spanish provinces of Huelva, Seville, Almeria and Murcia. Isotopic analysis clearly pointed to the rich silver-mining and smelting district of Rio Tinto near the modern city of Nerva as the main polluter.

The results of this study were reported in the current issue of Environmental Science & Technology by Dr. Kevin J. R. Rosman of Curtin University in Perth, Australia, and his colleagues there and at the Laboratory of Glaciology and Geophysics of the Environment in Grenoble, France.

One of the problems in their analyses, the authors wrote, was the very low concentrations of lead remaining in ice dating from ancient times — only about one-hundredth the lead level found in Greenland ice deposited in the last 30 years. But the investigators used mass-spectrometric techniques that permitted them to sort out isotopic lead composition at lead levels of only about one part per trillion.

Dr. Rosman focused on the ratio of two stable isotopes, or forms, of lead: lead-206 and lead-207. His group found that the ratio of lead-206 to lead-207 in 8,000-year-old ice was 1.201. That was taken as the natural ratio that existed before people began smelting ores. But between 600 B.C. and A.D. 300, the scientists found, the ratio of lead-206 to lead-207 fell to 1.183. They called that “unequivocal evidence of early large-scale atmospheric pollution by this toxic metal.”

All ore bodies containing lead have their own isotopic signatures, and the Rio Tinto lead ratio is 1.164. Calculations by the Australian-French collaboration based on their ice-core analysis showed that during the period from 366 B.C. to at least A.D. 36, spanning the Roman Republic and the early Empire, 70 percent of the global atmospheric lead pollution came from the Roman-operated Rio Tinto mines in what is now southwestern Spain.
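The quoted ratios support a back-of-the-envelope check (a simplified two-endmember mix, not the team’s actual four-isotope analysis): treat the measured ratio in ancient ice as a linear blend of natural background lead and Rio Tinto-signature lead, and solve for the source’s share.

```python
def source_fraction(r_obs, r_natural, r_source):
    """Fraction of lead attributable to the candidate source, assuming
    the observed lead-206/lead-207 ratio is a linear mix of exactly two
    endmembers. A simplification: the study combined four isotopes."""
    return (r_natural - r_obs) / (r_natural - r_source)

# Ratios quoted in the article: 1.201 (pre-smelting ice), 1.183
# (600 B.C.-A.D. 300 ice), 1.164 (Rio Tinto ore signature).
f = source_fraction(r_obs=1.183, r_natural=1.201, r_source=1.164)
```

This naive mix attributes roughly half of the lead in the ancient ice to a Rio Tinto-like source; the published 70 percent figure refers to the man-made share of the pollution and comes from the full multi-isotope calculation.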

The Rio Tinto mining region is known to archeologists as one of the richest sources of silver in the ancient world. Some 6.6 million tons of slag were left by Roman smelting operations there.

The global demand for silver increased dramatically after coinage was introduced in Greece around 650 B.C. But silver was only one of the treasures extracted from its ore. The sulfide ore smelted by the Romans also yielded an enormous harvest of lead.

Because it is easily shaped, melted and molded, lead was widely used by the Romans for plumbing, stapling masonry together, casting statues and manufacturing many kinds of utensils. All these uses presumably contributed to the chronic poisoning of Rome’s peoples.

Adding to the toxic hazard, Romans used lead vessels to boil and concentrate fruit juices and preserves. Fruits contain acetic acid, which reacts with metallic lead to form lead acetate, a compound once known as “sugar of lead.” Lead acetate adds a pleasant sweet taste to food but causes lead poisoning — an ailment that is often fatal and, even in mild cases, causes debilitation and loss of cognitive ability.

Judging from the Greenland ice core, the smelting of lead-bearing ore declined sharply after the fall of the Roman Empire but gradually increased during the Renaissance. By 1523, the last year for which Dr. Rosman’s group conducted its Greenland ice analysis, atmospheric lead pollution had reached nearly the same level recorded for the year 79 B.C., at the peak of Roman mining pollution.

How Scott Collis Is Harnessing New Data To Improve Climate Models (Popular Science)

The former ski bum built open-access tools that convert raw data from radar databases into formats that climate modelers can use to better predict climate change.

By Veronique Greenwood and Valerie Ross

Posted 10.16.2013 at 3:00 pm

Scott Collis (by Joel Kimmel)

Each year, Popular Science seeks out the brightest young scientists and engineers and names them the Brilliant Ten. Like the 110 honorees before them, the members of this year’s class are dramatically reshaping their fields — and the future. Some are tackling pragmatic questions, like how to secure the Internet, while others are attacking more abstract ones, like determining the weather on distant exoplanets. The common thread between them is brilliance, of course, but also impact. If the Brilliant Ten are the faces of things to come, the world will be a safer, smarter, and brighter place. — The Editors

Scott Collis

Argonne National Laboratory


Harnessing new data to improve climate models

Clouds are one of the great challenges for climate scientists. They play a complex role in the atmosphere and in any potential climate-change scenario. But rudimentary data has simplified their role in simulations, leading to variability among climate models. Scott Collis discovered a way to add accuracy to forecasts of future climate—by tapping new sources of cloud data.

Collis has extensive experience watching clouds, first as a ski bum during grad school in Australia and then as a professional meteorologist. But when he took a job at the Centre for Australian Weather and Climate Research, he realized there was an immense source of cloud data that climate modelers weren’t using: the information collected for weather forecasts. So Collis took on the gargantuan task of building open-access tools that convert the raw data from radar databases into formats that climate modelers can use. In one stroke, Collis unlocked years of weather data. “We were able to build such robust algorithms that they could work over thousands of radar volumes without human intervention,” says Collis.
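As one concrete example of the kind of conversion such tools perform (an illustration, not necessarily a step in Collis’s own pipeline): weather radars report reflectivity in dBZ, which can be turned into a physical rain rate by inverting a Z-R power law.

```python
def rain_rate(dbz, a=200.0, b=1.6):
    """Convert radar reflectivity (dBZ) to rain rate (mm/hr) by
    inverting Z = a * R**b. a=200, b=1.6 are the classic
    Marshall-Palmer coefficients; operational networks tune them
    per site and season."""
    z = 10.0 ** (dbz / 10.0)       # dBZ -> linear reflectivity (mm^6/m^3)
    return (z / a) ** (1.0 / b)

moderate = rain_rate(40.0)          # about 11.5 mm/hr, a heavy shower
```

Running this sort of conversion consistently over thousands of archived radar volumes, without hand-tuning each one, is exactly the robustness problem Collis describes.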

When the U.S. Department of Energy caught wind of his project, it recruited him to work with a new radar network designed to collect high-quality cloud data from all over the globe. The network, the largest of its kind, isn’t complete yet, but already the data that Collis and his collaborators have collected is improving next-generation climate models.

This article originally appeared in the October 2013 issue of Popular Science.

Tool Accurately Predicts Whether A Kickstarter Project Will Bomb (Popular Science)

At about 76 percent accuracy, a new prediction model is the best yet. “Your chances of success are at 8 percent. Commence panic.”

By Colin Lecher

Posted 10.16.2013 at 2:00 pm


Ouya, A Popular Kickstarter Project 

Well, here’s something either very discouraging or very exciting for crowdfunding hopefuls: a Swiss team can predict, with about 76 percent accuracy and within only four hours of launch, whether a Kickstarter project will succeed.

The team, from École Polytechnique Fédérale de Lausanne, laid out a system in a paper presented at the Conference on Online Social Networks. By mining data on more than 16,000 Kickstarter campaigns and more than 1.3 million users, they created a prediction model based on a project’s popularity on Twitter, the rate at which it’s pulling in money, how many first-time backers it has, and the previous projects its supporters have backed.

A previous, similar model built by Americans could predict a Kickstarter project’s success with 68 percent accuracy — impressive, but the Swiss project has another advantage: it’s dynamic. While the American model could only make a prediction before the project launched, the Swiss system monitors projects in real time. They’ve even built a tool, called Sidekick, that monitors projects and displays their chances of success.
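The article doesn’t publish the EPFL team’s model, but what it describes is a standard supervised classifier over those four feature families. A hedged sketch of what such a model could look like — the feature names follow the article, while the weights and bias below are invented purely for illustration:

```python
import math

# Illustrative weights only; a real system would fit these to the
# 16,000-campaign dataset (e.g. by logistic regression).
WEIGHTS = {
    "tweets_per_hour": 0.8,
    "pledges_usd_per_hour": 0.05,
    "first_time_backer_share": -1.5,      # many first-timers: weaker signal
    "avg_backer_past_successes": 0.6,     # seasoned backers: stronger signal
}
BIAS = -3.0

def success_probability(features):
    """Logistic model: squash a weighted feature sum into (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

p = success_probability({
    "tweets_per_hour": 4.0,
    "pledges_usd_per_hour": 30.0,
    "first_time_backer_share": 0.2,
    "avg_backer_past_successes": 1.0,
})
```

Because the features update continuously after launch, recomputing the probability as new pledges and tweets arrive is what makes the Swiss predictor dynamic, unlike a model that scores a project only once, before launch.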

Other sites, like Kicktraq, offer similar services, but the predictions aren’t as accurate as the Swiss team claims theirs are. If you peruse Sidekick, you can see how confident the algorithm is in its pass/fail predictions: almost all of the projects are either above 90 percent or below 10 percent. Sort of scary, probably, if you’re launching a project. Although there’s always a chance you could pull yourself out of the hole, it’s like a genie asking if you want to know how you die: Do you really want that information?


Carbon Cycle Models Underestimate Indirect Role of Animals (Science Daily)

Oct. 16, 2013 — Animal populations can have a far more significant impact on carbon storage and exchange in regional ecosystems than is typically recognized by global carbon models, according to a new paper authored by researchers at the Yale School of Forestry & Environmental Studies (F&ES). 

Wildebeest herd, Serengeti. Scientists found that a decline in wildebeest populations in the Serengeti-Mara grassland-savanna system decades ago allowed organic matter to accumulate, which eventually caused about 80 percent of the ecosystem to burn annually, releasing carbon from the plants and the soil, before populations recovered in recent years. (Credit: © photocreo / Fotolia)

In fact, in some regions the magnitude of carbon uptake or release due to the effects of specific animal species or groups of animals — such as the pine beetles devouring forests in western North America — can rival the impact of fossil fuel emissions for the same region, according to the paper published in the journal Ecosystems.

While models typically take into account how plants and microbes affect the carbon cycle, they often underestimate how much animals can indirectly alter the absorption, release, or transport of carbon within an ecosystem, says Oswald Schmitz, the Oastler Professor of Population and Community Ecology at F&ES and lead author of the paper. Historically, the role of animals has been largely underplayed because animal species are not distributed globally, and because their total biomass is vastly lower than that of the plants they rely upon, so they contribute little carbon through respiration.

“What these sorts of analyses have not paid attention to is what we call the indirect multiplier effects,” Schmitz says. “And these indirect effects can be quite huge — and disproportionate to the biomass of the species that are instigating the change.”

In the paper, “Animating the Carbon Cycle,” a team of 15 authors from 12 universities, research organizations and government agencies cites numerous cases where animals have triggered profound impacts on the carbon cycle at local and regional levels.

In one case, an unprecedented loss of trees triggered by the pine beetle outbreak in western North America has decreased the net carbon balance on a scale comparable to British Columbia’s current fossil fuel emissions.

And in East Africa, scientists found that a decline in wildebeest populations in the Serengeti-Mara grassland-savanna system decades ago allowed organic matter to accumulate, which eventually caused about 80 percent of the ecosystem to burn annually, releasing carbon from the plants and the soil, before populations recovered in recent years.

“These are examples where the animals’ largest effects are not direct ones,” Schmitz says. “But because of their presence they mitigate or mediate ecosystem processes that then can have these ramifying effects.”

“We hope this article will inspire scientists and managers to include animals when thinking of local and regional carbon budgets,” said Peter Raymond, a professor of ecosystem ecology at the Yale School of Forestry & Environmental Studies.

According to the authors, a more proper assessment of such phenomena could provide insights into management schemes that could help mitigate the threat of climate change.

For example, in the Arctic, where about 500 gigatons of carbon is stored in permafrost, large grazing mammals like caribou and muskoxen can help maintain the grasslands that have a high albedo and thus reflect more solar energy. In addition, by trampling the ground these herds can actually help reduce the rate of permafrost thaw, researchers say.

“It’s almost an argument for rewilding places to make sure that the natural balance of predators and prey are there,” Schmitz says. “We’re not saying that managing animals will offset these carbon emissions. What we’re trying to say is the numbers are of a scale where it is worthwhile to start thinking about how animals could be managed to accomplish that.”

Journal Reference:

  1. Oswald J. Schmitz, Peter A. Raymond, James A. Estes, Werner A. Kurz, Gordon W. Holtgrieve, Mark E. Ritchie, Daniel E. Schindler, Amanda C. Spivak, Rod W. Wilson, Mark A. Bradford, Villy Christensen, Linda Deegan, Victor Smetacek, Michael J. Vanni, Christopher C. Wilmers. Animating the Carbon Cycle. Ecosystems, 2013; DOI: 10.1007/s10021-013-9715-7

Software Uses Cyborg Swarm to Map Unknown Environs (Science Daily)

Oct. 16, 2013 — Researchers from North Carolina State University have developed software that allows them to map unknown environments — such as collapsed buildings — based on the movement of a swarm of insect cyborgs, or “biobots.”

(Credit: Image by Edgar Lobaton.)

“We focused on how to map areas where you have little or no precise information on where each biobot is, such as a collapsed building where you can’t use GPS technology,” says Dr. Edgar Lobaton, an assistant professor of electrical and computer engineering at NC State and senior author of a paper on the research.

“One characteristic of biobots is that their movement can be somewhat random,” Lobaton says. “We’re exploiting that random movement to work in our favor.”

Here’s how the process would work in the field. A swarm of biobots, such as remotely controlled cockroaches, would be equipped with electronic sensors and released into a collapsed building or other hard-to-reach area. The biobots would initially be allowed to move about randomly. Because the biobots couldn’t be tracked by GPS, their precise locations would be unknown. However, the sensors would signal researchers via radio waves whenever biobots got close to each other.

Once the swarm has had a chance to spread out, the researchers would send a signal commanding the biobots to keep moving until they find a wall or other unbroken surface — and then continue moving along the wall. This is called “wall following.”

The researchers repeat this cycle of random movement and “wall following” several times, continually collecting data from the sensors whenever the biobots are near each other. The new software then uses an algorithm to translate the biobot sensor data into a rough map of the unknown environment.
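The random-movement and proximity-sensing cycle described above can be sketched in a toy simulation. Everything below (arena size, swarm size, sensor range, step length) is an invented stand-in, and clamping at the arena boundary loosely stands in for "wall following"; the actual topological-mapping algorithm from the paper is not reproduced here:

```python
import random
import math

random.seed(42)

WIDTH, HEIGHT = 10.0, 10.0   # hypothetical arena dimensions
N_BOTS = 20                  # assumed swarm size
STEP = 0.5                   # step length per tick
RADIUS = 0.6                 # assumed proximity-sensor range
TICKS = 200

# Release the biobots at unknown (random) positions.
pos = [(random.uniform(0, WIDTH), random.uniform(0, HEIGHT)) for _ in range(N_BOTS)]

encounters = []  # (tick, bot_i, bot_j): the only data the real system records

for t in range(TICKS):
    # Random-walk phase: each bot steps in a random direction,
    # clamped to the arena boundary (a crude stand-in for wall following).
    new_pos = []
    for (x, y) in pos:
        a = random.uniform(0, 2 * math.pi)
        nx = min(max(x + STEP * math.cos(a), 0.0), WIDTH)
        ny = min(max(y + STEP * math.sin(a), 0.0), HEIGHT)
        new_pos.append((nx, ny))
    pos = new_pos

    # Proximity sensing: log every pair of bots within sensor range.
    for i in range(N_BOTS):
        for j in range(i + 1, N_BOTS):
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            if dx * dx + dy * dy <= RADIUS * RADIUS:
                encounters.append((t, i, j))

print(f"{len(encounters)} pairwise encounters logged over {TICKS} ticks")
```

The key point the sketch illustrates is that no bot position ever leaves the simulation; only the stream of encounter events would be available to the mapping algorithm.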

“This would give first responders a good idea of the layout in a previously unmapped area,” Lobaton says.

The software would also allow public safety officials to determine the location of radioactive or chemical threats, if the biobots have been equipped with the relevant sensors.

The researchers have tested the software using computer simulations and are currently testing the program with robots. They plan to work with fellow NC State researcher Dr. Alper Bozkurt to test the program with biobots.

The paper, “Topological Mapping of Unknown Environments using an Unlocalized Robotic Swarm,” will be presented at the International Conference on Intelligent Robots and Systems being held Nov. 3-8 in Tokyo, Japan. Lead author of the paper is Alireza Dirafzoon, a Ph.D. student at NC State. The work was supported by National Science Foundation grant CNS-1239243.

Economic Dangers of ‘Peak Oil’ Addressed (Science Daily)

Oct. 16, 2013 — Researchers from the University of Maryland and a leading university in Spain demonstrate in a new study which sectors could put the entire U.S. economy at risk when global oil production peaks (“Peak Oil”). This multi-disciplinary team recommends immediate action by government, private and commercial sectors to reduce the vulnerability of these sectors.

The figure above shows sectors’ importance and vulnerability to Peak Oil. The bubbles represent sectors. The size of each bubble visualizes the vulnerability of a particular sector to Peak Oil according to the expected price changes; the larger the bubble, the more vulnerable the sector is considered to be. The X axis shows a sector’s importance according to its contribution to GDP, and the Y axis according to its structural role. Hence, the larger bubbles in the top right corner represent highly vulnerable and highly important sectors. In the case of Peak Oil induced supply disruptions, these sectors could cause severe imbalances for the entire U.S. economy. (Credit: Image courtesy of University of Maryland)

While critics of Peak Oil studies declare that the world has more than enough oil to maintain current national and global standards, these UMD-led researchers say Peak Oil is imminent, if not already here — and is a real threat to national and global economies. Their study is among the first to outline a way of assessing the vulnerabilities of specific economic sectors to this threat, and to identify focal points for action that could strengthen the U.S. economy and make it less vulnerable to disasters.

Their work, “Economic Vulnerability to Peak Oil,” appears in Global Environmental Change. The paper is co-authored by Christina Prell, UMD’s Department of Sociology; Kuishuang Feng and Klaus Hubacek, UMD’s Department of Geographical Sciences; and Christian Kerschner, Institut de Ciència i Tecnologia Ambientals, Universitat Autònoma de Barcelona.

Peak Oil is gaining increasing attention in both scientific and policy discourses, especially due to its apparent imminence and potential dangers. However, until now, little has been known about how this phenomenon will impact economies. In their paper, the research team constructs a vulnerability map of the U.S. economy, combining two approaches for analyzing economic systems. Their approach reveals the relative importance of individual economic sectors, and how vulnerable these are to oil price shocks. This dual analysis helps identify which sectors could put the entire U.S. economy at risk from Peak Oil. For the United States, such sectors would include iron mills, chemical and plastic products manufacturing, fertilizer production and air transport.
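The dual-axis idea, importance on one axis and vulnerability to oil price shocks on the other, can be illustrated with a toy scoring sketch. The sector scores and the scoring formula below are invented for illustration; they are not the paper's data or method:

```python
# Toy illustration of a dual-axis vulnerability map: each sector gets an
# importance score (GDP contribution blended with structural role) and a
# vulnerability score (exposure to oil price changes). All numbers invented.
sectors = {
    # name: (gdp_share, structural_role, oil_price_exposure), each on 0..1
    "iron mills":             (0.3, 0.8, 0.9),
    "plastics manufacturing": (0.4, 0.7, 0.8),
    "fertilizer production":  (0.2, 0.6, 0.9),
    "air transport":          (0.3, 0.5, 0.9),
    "retail":                 (0.6, 0.3, 0.3),
}

def risk(gdp, structure, exposure):
    # Importance: even blend of GDP share and structural role.
    # Overall risk: importance scaled by price-shock exposure,
    # so "large bubbles in the top right corner" rank highest.
    importance = 0.5 * gdp + 0.5 * structure
    return importance * exposure

ranked = sorted(sectors.items(), key=lambda kv: risk(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name:24s} risk={risk(*scores):.2f}")
```

With these invented numbers, the oil-intensive manufacturing sectors land at the top of the ranking and a low-exposure sector like retail lands at the bottom, mirroring the shape of the paper's bubble chart.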

“Our findings provide early warnings to these and related industries about potential trouble in their supply chain,” UMD Professor Hubacek said. “Our aim is to inform and engage government, public and private industry leaders, and to provide a tool for effective Peak Oil policy action planning.”

Although the team’s analysis is embedded in a Peak Oil narrative, it can be used more broadly to develop a climate roadmap for a low carbon economy.

“In this paper, we analyze the vulnerability of the U.S. economy, which is the biggest consumer of oil and oil-based products in the world, and thus provides a good example of an economic system with high resource dependence. However, the notable advantage of our approach is that it does not depend on the Peak-Oil-vulnerability narrative but is equally useful in a climate change context, for designing policies to reduce carbon dioxide emissions. In that case, one could easily include other fossil fuels such as coal in the model and results could help policy makers to identify which sectors can be controlled and/or managed for a maximum, low-carbon effect, without destabilizing the economy,” Professor Hubacek said.

One of the main ways a Peak Oil vulnerable industry can become less so, the authors say, is for that sector to reduce the structural and financial importance of oil. For example, Hubacek and colleagues note that one approach to reducing the importance of oil to agriculture could be to curb the strong dependence on artificial fertilizers by promoting organic farming techniques and/or to reduce the overall distance travelled by people and goods by fostering local, decentralized food economies.

Peak Oil Background and Impact

The Peak Oil dialogue shifts attention away from discourses on “oil depletion” and “stocks” to focus on declining production rates (flows) of oil, and increasing costs of production. The maximum possible daily flow rate (with a given technology) is what eventually determines the peak; thus, the concept can also be useful in the context of other non-renewable resources.

Improvements in extraction and refining technologies can influence flows, but this tends to lead to steeper decline curves after the peak is eventually reached. Such steep decline curves have also been observed for shale gas wells.
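Peak-flow dynamics of this kind are often described with a Hubbert (logistic-derivative) curve, in which annual production rises to a maximum and then declines. A minimal sketch, with purely illustrative parameters rather than any real-world estimates:

```python
import math

def hubbert_rate(t, urr=2000.0, k=0.05, t_peak=2005.0):
    # Logistic-derivative ("Hubbert") production-rate curve.
    # urr: ultimately recoverable resource (illustrative units);
    # k: steepness; t_peak: year of maximum flow. The curve is
    # symmetric about t_peak; faster extraction (larger k) means
    # a higher peak but a steeper decline afterwards.
    e = math.exp(-k * (t - t_peak))
    return urr * k * e / (1.0 + e) ** 2

years = range(1950, 2101)
rates = {y: hubbert_rate(y) for y in years}
peak_year = max(rates, key=rates.get)
print(peak_year, round(rates[peak_year], 2))
```

Raising `k` in this sketch reproduces the article's point about improved extraction technology: the peak flow increases, but the post-peak decline curve becomes correspondingly steeper.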

“Shale developments are, so we believe, largely overrated, because of the huge amounts of financial resources that went into them (danger of bubble) and because of their apparent steep decline rates (shale wells tend to peak fast),” according to Dr. Kerschner.

“One important implication of this dialogue shift is that extraction peaks occur much earlier in time than the actual depletion of resources,” Professor Hubacek said. “In other words, Peak Oil is currently predicted by many within the next decade, whereas complete oil depletion will in fact never occur, given increasing prices. This means that eventually petroleum products may be sold in liter bottles in pharmacies like in the old days.”

Journal Reference:

  1. Christian Kerschner, Christina Prell, Kuishuang Feng, Klaus Hubacek. Economic vulnerability to Peak Oil. Global Environmental Change, 2013; DOI: 10.1016/j.gloenvcha.2013.08.015

Cockroach farms multiplying in China (L.A.Times)

Dried cockroaches are ready to be sold to pharmaceutical companies from a farm in Jinan, China. One farmer says the insects are easy to raise and profitable.

Farmers are pinning their future on the often-dreaded insect, which when dried goes for as much as $20 a pound — for use in Asian medicine and in cosmetics.




Oct. 15, 2013

This squat concrete building was once a chicken coop, but now it’s part of a farm with an entirely different kind of livestock — millions of cockroaches.

Inside, squirming masses of the reddish-brown insects dart between sheets of corrugated metal and egg cartons that have been tied together to provide the kind of dark hiding places they favor.

Wang Fuming kneels down and pulls out one of the nests. Unaccustomed to the light, the roaches scurry about, a few heading straight up his arm toward his short-sleeve shirt.

“Nothing to be afraid of,” Wang counsels visitors who are shrinking back into the hallway, where stray cockroaches cling to a ceiling that’s perilously close overhead.

Although cockroaches evoke a visceral dread for most people, Wang looks at them fondly as his fortune — and his future.

“People laughed at me when I started, but I always thought that cockroaches would bring me wealth.”

— Zou Hui, cockroach farmer

The 43-year-old businessman is the largest cockroach producer in China (and thus probably in the world), with six farms populated by an estimated 10 million cockroaches. He sells them to producers of Asian medicine and to cosmetic companies that value the insects as a cheap source of protein as well as for the cellulose-like substance on their wings.

The favored breed for this purpose is the Periplaneta americana, or American cockroach, a reddish-brown insect that grows to about 1.6 inches long and, when mature, can fly, as opposed to the smaller, darker, wingless German cockroach.

Since Wang got into the business in 2010, the price of dried cockroaches has increased tenfold, from about $2 a pound to as much as $20, as manufacturers of traditional medicine stockpile pulverized cockroach powder.

“I thought about raising pigs, but with traditional farming, the profit margins are very low,” Wang said. “With cockroaches, you can invest 20 yuan and get back 150 yuan,” or $3.25 for a return of $24.

China has about 100 cockroach farms, and new ones are opening almost as fast as the prolific critters breed. But even among Chinese, the industry was little known until August, when a million cockroaches got out of a farm in neighboring Jiangsu province. The Great Escape made headlines around China and beyond, evoking biblical images of swarming locusts.

Big moneymaker

Business is booming at the Shandong Xin Da Ground Beetle Farm.

Only the prospect of all those lost earnings would faze Wang, a compact man with a wisp of a mustache and wire-rim glasses who looks like a scientist, but has no more than a high school education. After graduating, he went to work in a tire factory.

“I felt I would never get anywhere in life at the factory and I wanted to start a business,” he said.

As a boy he had liked collecting insects, so he started with scorpions and beetles, both used in traditional medicine and served as a delicacy. One batch of his beetle eggs turned out to be contaminated with cockroach eggs.

“I was accidentally raising cockroaches and then I realized they were the easiest and most profitable,” he said.

The start-up costs are minimal — Wang bought only eggs, a run-down abandoned chicken coop and the roofing tile. Notoriously hardy, roaches aren’t susceptible to the same diseases as farm animals. As for feeding them, cockroaches are omnivores, though they favor rotten vegetables. Wang feeds his brood with potato and pumpkin peelings discarded from nearby restaurants.

“Cockroaches are survivors. We want to know what makes them so strong.”

— Li Shunan, professor of traditional medicine

Killing them is easy too: Just scoop or vacuum them out of their nests and dunk them in a big vat of boiling water. Then they’re dried in the sun like chile peppers.

Perhaps understandably, the cockroach business (“special farming,” as it is euphemistically called) is a fairly secretive industry. Wang’s farm, for instance, operates in an agribusiness industrial park under an elevated highway. The sign at the front gate simply reads Jinan Hualu Feed Co.

Some companies that use cockroaches don’t like to advertise their “secret ingredient.” And the farmers themselves are wary of neighbors who might not like a cockroach farm in their backyard.

“We try to keep a low profile,” said Liu Yusheng, head of the Shandong Insect Industry Assn., the closest thing there is to a trade organization. “The government is tacitly allowing us to do what we do, but if there is too much attention, or if cockroach farms are going into residential areas, there could be trouble.”

Liu worries about the rapid growth of an industry with too many inexperienced players and too little oversight. In 2007, a million Chinese lost $1.2 billion when a firm promoting ant farming turned out to be a Ponzi scheme and went bankrupt.

“This is not like raising regular farm animals or vegetables where the Agricultural Ministry knows who is supposed to regulate it. Nobody knows who is in charge here,” he said.

The low start-up costs make raising cockroaches an appealing business for wannabe entrepreneurs, who can buy cockroach eggs and complete how-to kits from promoters.

“People laughed at me when I started, but I always thought that cockroaches would bring me wealth,” said Zou Hui, 40, who quit her job at a knitting factory in 2008 after seeing a television program about raising cockroaches.

Wang Fuming, at his farm in Jinan, is the largest cockroach producer in China (and thus probably in the world), with six farms populated by an estimated 10 million cockroaches.

It’s not exactly a fortune, but the $10,000 she brings in annually selling cockroaches is decent money for her hometown in rural Sichuan province, and won her an award last year from local government as an “Expert in Getting Wealthy.”

“Now I’m teaching four other families,” Zou said. “They want to get rich like me.”

But inexperienced farmers can get into trouble, as Wang Pengsheng (no relation to fellow roach farmer Wang) found out after his cockroaches staged the Great Escape.

He had opened his farm just six months earlier in a newly constructed building that municipal code officials complained was too close to protected watershed land. At noon on Aug. 20, while workers were out for lunch, a demolition crew knocked down the building. The roaches made a run for it.

“They didn’t know I had cockroaches in there. They wouldn’t have demolished the building like that if there were cockroaches that would get out,” Wang Pengsheng said in a telephone interview.

After discovering the flattened building and homeless roaches scurrying among the rubble, he tried to corral the escapees but was unsuccessful. He called in local health officials, who helped him exterminate the roaches. Wang said he has received about $8,000 in compensation from local government and hopes to use the money to rebuild his farm elsewhere.

At least five pharmaceutical companies are using cockroaches for traditional Chinese medicine. Research is underway in China (and South Korea) on the use of pulverized cockroaches for treating baldness, AIDS and cancer and as a vitamin supplement. South Korea’s Jeonnam Province Agricultural Research Institute and China’s Dali University College of Pharmacy have published papers on the anti-carcinogenic properties of the cockroach.

Li Shunan, a 78-year-old professor of traditional medicine from the southwestern province of Yunnan who is considered the godfather of cockroach research, said he discovered in the 1960s that ethnic minorities near the Vietnamese border were using a cockroach paste to treat bone tuberculosis.

“Cockroaches are survivors,” Li said. “We want to know what makes them so strong — why they can even resist nuclear effects.”

Liu Yusheng, head of the Shandong Insect Industry Assn. eats fried cockroaches. Liu worries about the rapid growth of an industry with too many inexperienced players and too little oversight.

Li reels off an impressive, if implausible, list of health claims: “I lost my hair years ago. I made a spray of cockroaches, applied it on my scalp and it grew back. I’ve used it as a facial mask and people say I haven’t changed at all over the years.

“Cockroaches are very tasty too.”

Many farmers are hoping to boost demand by promoting cockroaches in fish and animal feed and as a delicacy for humans.

Chinese aren’t quite as squeamish as most Westerners about insects — after all, people here still keep crickets as pets.

In Jinan, Wang Fuming and his wife, who run the farm together, seem genuinely fond of their cockroaches and a little hurt that others don’t feel affection.

“What is disgusting about them?” Li Wanrong, Wang’s wife, asked as a roach scurried around her black leather pumps. “Look how beautiful they are. So shiny!”

Over lunch at a restaurant down the block from his farm, Wang placed a plate of fried cockroaches seasoned with salt on the table along with more conventional cuisine, and proceeded to nibble a few with his chopsticks. He expressed disapproval that visiting journalists refused to sample the roaches.

On saying goodbye at the end of the day, he added a final rejoinder.

“You will regret your whole life not trying them.”

Nicole Liu in The Times’ Beijing bureau contributed to this report.

FOR THE RECORD:Wednesday’s Column One story about cockroach farming in China misstated the value of 150 Chinese yuan as $11. It is equal to $24.

Rescue of Beagles Used as Laboratory Animals at the Instituto Royal

JC e-mail 4839, October 22, 2013

Fiocruz Expert Considers the Raid on the Instituto Royal a Mistake (Jornal da Ciência)

For Marco Aurélio Martins, the activists’ attack on scientific experiments is an attempt to “irresponsibly” misinform the public

The “misguided” raid by animal-rights groups on the Instituto Royal, which removed 178 beagle dogs along with other laboratory animals, is worrying. So says Marco Aurélio Martins, head researcher at the Inflammation Laboratory of the Oswaldo Cruz Foundation (Fiocruz). “It is worrying because of the misguided discourse about the importance that research has,” he says in an interview with Jornal da Ciência. The raid took place in the early hours of last Friday (the 18th) at the institution’s facility in São Roque, 51 km from São Paulo.

For him, the activists’ attack on scientific experiments is an attempt to “irresponsibly” misinform the general public, which is unfamiliar with scientific matters. “Telling the public that animal experimentation is simply cruel, that it assaults animals, that it only harms them without any benefit either to humans or to the animals themselves, is misinformation,” he declares.

Martins stresses that the use of animals in scientific experiments is still necessary for studying many areas of public health, from tropical diseases such as malaria to more serious conditions such as cancer, asthma and hypertension. “How can we give up studying such complex problems if we do not have experimental tools?” he asks. “Every drug available on pharmacy shelves and in the veterinary market depended on animal experimentation at some point.”

The researcher insists that all scientific tests on animals comply with national regulations, set out in the Arouca Law (No. 11,794), in force for three years. According to him, the use of animals in scientific experiments is not exclusive to Brazil. As the biologist understands it, all countries advanced in science and technology continue to use animals in experimentation. “It is not true to say that animals are no longer used in Europe and the United States,” he says. “The restrictions are greater (only) for primates, such as monkeys and chimpanzees.”

JC – Are you familiar with the policies the Instituto Royal applies in its animal experiments?
Martins – I am affiliated with a national institute of science and technology for drugs, INCT-Inofar, of which the Royal is one of the collaborators. I know the Institute’s reputation and seriousness. But I have never visited it and have never used the center as a service provider.

What is your assessment of the activists’ raid on the Instituto Royal?
I view it with great concern. It is a radicalization. We have had similar incidents in Brazil in the past, but nothing so forceful. At Fiocruz itself, around 2000, there was a break-in, and researchers were prosecuted because opossums were found outside their enclosure. But I have never seen anything as radical as this, with people going in and releasing the animals. This moment worries me greatly, with Brazil experiencing social tension and demonstrations, such as the Black Blocs. We have seen this film in other countries, where this kind of activism led to enormous problems of aggression.

Does this scenario worry the scientific community?
It worries us because of the irresponsible misinformation. Telling the general, lay public that animal experimentation is simply cruel, that it assaults animals, that it only harms them without any benefit to humans or to the animals themselves: that is misinformation. It is not hard to sway people who do not know how the research is conducted, or to inform them, mistakenly, that Brazil is the only country that uses animals in scientific experiments. The misguided discourse about the importance of research is worrying. Brazil’s scientific professionals now face a very great responsibility. We have to be very skillful and count on the press’s cooperation so that our words are not distorted. We must take care to reassure the general public that the research centers established in Brazil are centers of excellence, not centers of terror.

What benefits does animal experimentation bring to the public and to the animals themselves?
Every drug available on pharmacy shelves and in the veterinary market depended on animal experimentation at some point. The risk of not doing this, of not running the experiments, is enormous for the public when it comes to making potential drugs available.

Scientific experiments on animals must comply with domestic legislation…
Of course the scientific community knows it must follow the rules. We are required to obtain licenses, and there are laws that regulate animal experimentation, both in Brazil and around the world. In Brazil, the legislation is the Arouca Law, in force for three years. If there were an allegation of mistreatment at Fiocruz or even at the Instituto Royal, Concea [the National Council for the Control of Animal Experimentation] has the role of receiving the complaint, evaluating it and investigating it in order to take action. Mistreatment of laboratory animals is subject to criminal prosecution. If an irregularity is occurring, it must be punished exemplarily. What cannot be allowed is for people to go around raiding facilities and releasing laboratory animals. That harms not only the progress of scientific research but also the credibility of new drug development in the country, the public, and the animals themselves. If animals are indeed being mistreated, let it be taken to the competent authorities and let whoever is acting wrongly be punished.

Is that the case at the Instituto Royal?
I do not believe so. From what I know of the reputation of the people in charge, I have no reason to believe that any kind of internal irregularity was occurring. If it were, in a terrible hypothesis, our society today already has a channel for that, which is Concea.

Is research on animals still necessary?
Of course it is, because we need mechanisms to advance the forms of treatment and therapy we have today. We still have enormous problems in many areas of public health, from tropical diseases such as malaria to more serious conditions such as cancer, asthma and hypertension. How can we give up studying such complex problems if we do not have experimental tools? How can we prevent scientists and specialists, under conditions of good practice and good ethical conduct, from understanding diseases and seeking ways to control them? That would mean halting scientific investigation. We cannot give public opinion the idea that animals can no longer be used in scientific experiments.

Do other countries still use animals in scientific experiments?
Of course they do. All countries considered advanced in science and technology continue to use animals in experimentation. It is not true to say that animals are no longer used in Europe or the United States. The restrictions are greater (only) for primates, such as monkeys and chimpanzees.

Do the protocols prohibit cruelty to animals?
There can be no cruelty. That is a crime. When designing an experimental protocol, the researcher must ensure the animal’s welfare, not least so that the results obtained from the experiment can be trusted.

(Viviane Monteiro – Jornal da Ciência)

Other articles on the subject:

Revista Galileu

‘One day we will reduce them. But ending animal testing now is impossible’

Folha de S.Paulo

Removal of dogs from institute affects cancer research, scientist says

Animal experimentation

Congressman takes custody of beagles and names them after his daughters

O Globo

São Paulo Public Prosecutor’s Office awaits police investigation before deciding on beagles

O Estado de S.Paulo

Thieves of lab animals

Institute will donate recovered beagles

Zero Hora

Sentimentality and animal rights

Agência Câmara Notícias

Committee will investigate allegations of animal mistreatment at the Instituto Royal

*   *   *

Oct. 22, 2013 – 3:00 a.m.

Removal of Dogs from Institute Affects Anticancer Test, Scientist Says (Folha de S.Paulo)



The removal of 178 beagle dogs from a laboratory in São Roque (66 km from São Paulo) has compromised advanced trials of a drug for treating cancer, as well as of herbal medicines for various uses.

The information comes from physician Marcelo Marcos Morales, one of the secretaries of the Brazilian Society for the Advancement of Science (SBPC) and coordinator of Concea (the National Council for the Control of Animal Experimentation), under the Ministry of Science and Technology.

“Work that took years to produce, with promising results for the country’s development, has been thrown in the trash,” he said, referring to last week’s raid on the Instituto Royal by activists.

“The loss is incalculable for science and for the benefit of the people,” he said.

The scientist did not reveal the name of the drug under development, which is protected by contract, nor which type of cancer it would target. But he said it was a type of drug produced abroad whose patent had been broken.

A ransacked room is found at the Instituto Royal, in São Roque (São Paulo state)

The Royal likewise declines to detail the experiments, citing contractual restrictions.

The herbal medicines were based on plants of Brazil’s native flora and could be used, for example, to fight pain and inflammation.

Activists say the dogs were mistreated. The institute denies it. Yesterday it said that, once recovered, the dogs will receive treatment and may “be put up for adoption.”

Morales, who holds a doctorate in biophysics, says scientists “do not want to work with animals either,” but that the method is still the most effective for testing medical treatments and vaccines.

“Could we stop eating meat? It is the same relationship with research. Do we stop using animals and test vaccines on our children?”

For Morales, people are “confusing” domestic animals with dogs born inside animal facilities, under strict, controlled conditions for scientific use.

“The appeal of the dog is very strong, so much so that they took all the beagles but left all the rats.”

The Brazilian authority responsible for approving research on humans, Conep (the National Commission for Research Ethics), does not endorse drug projects that have not passed safety tests in animals.

Dogs account for a small share of scientific experiments; mice make up 74% of the animals used. Most dogs are used to assess the toxicity of drugs.

Editoria de arte/Folhapress


Citizen Scientists Gather Data on Urban Bees (Quest)

Post by  , Guest Contributor for  on Sep 13, 2013


Image courtesy of Benson Kua.

Around the world, bees are dying in unprecedented numbers. While scientists hypothesize that pesticides and habitat loss are to blame, the exact causes are still unclear. Gardeners and farmers are concerned about the fate of their bee-pollinated food and are looking to the scientific community for information about how and why bee populations are declining.

Unfortunately, money is tight as scientists struggle to gain the funding and resources for extensive bee studies.

Marie Clifford and Susan Waters, graduate researchers at the University of Washington in Seattle, have found a way to get around scarce research funding: citizen scientists. The Urban Pollination Project (UPP), which they co-founded in 2011, trains Seattle community gardeners to collect data on local bees. By tapping into citizen-scientist efforts, Clifford and Waters can gather data from 35 Seattle community gardens, a scale of research otherwise beyond their resources and funding.

“Citizen science,” Clifford says, “allows scientists to address much broader scale questions than they might be able to address themselves.”

The citizen scientist gardeners at the Urban Pollination Project measure, count, and weigh tomatoes to understand how varying degrees of pollination affect tomato growth. They also pollinate the tomato flowers using a tuning fork, and are trained in bee identification. Their observations provide insight into what species of bees visit various Seattle community gardens.

Observations like these led to a sighting of the Western Bumblebee, a native bumblebee thought to be extinct, by bee enthusiast Will Peterman. With citizen scientists performing observations around the city, Clifford and Waters hope to better understand which bees are pollinating our cities.

In about five years, Clifford and Waters hope to have enough data to draw conclusions about what bumblebees need to survive in urban environments, such as how much and what kind of habitat is required. As the project continues, they want to get more gardeners involved.


Both bumblebees and a 128 Hertz tuning fork vibrate at the perfect frequency to pollinate tomato plants. The vibration can literally “shake” the pollen out of tomato plant flowers. Photo credit: Sarah Vaira.

While UPP works with Seattle gardeners to track where bumblebees nest and forage, other citizen-science projects, such as iNaturalist and eBird, allow anyone with a smartphone or digital camera to help identify plants and animals. These kinds of identification projects can help scientists predict animal and plant behavior.

“[With citizen science] you can achieve things that you would never be able to achieve with a more standard set of funds and time and energy,” says Waters, “[This is] a kind of knowledge that is ultimately really useful … and it connects people to their local environment.”

Unregulated, Agricultural Ammonia Threatens U.S. National Parks’ Ecology (Science Daily)

Oct. 10, 2013 — Thirty-eight U.S. national parks are experiencing “accidental fertilization” at or above a critical threshold for ecological damage, according to a study published in the journal Atmospheric Chemistry and Physics and led by Harvard University researchers. Unless significant controls on ammonia emissions are introduced at a national level, they say, little improvement is likely between now and 2050.

Foggy Tremont River, Great Smoky Mountains National Park. In Great Smoky Mountains National Park, the deposition of nitrogen compounds from pollution far exceeds a critical threshold for ecological damage. (Credit: © Dave Allen / Fotolia)

The environmental scientists, experts in air quality, atmospheric chemistry, and ecology, have been studying the fate of nitrogen-based compounds that are blown into natural areas from power plants, automobile exhaust, and — increasingly — industrial agriculture. Nitrogen that finds its way into natural ecosystems can disrupt the cycling of nutrients in soil, promote algal overgrowth and lower the pH of water in aquatic environments, and ultimately decrease the number of species that can survive.

“The vast majority, 85 percent, of nitrogen deposition originates with human activities,” explains principal investigator Daniel J. Jacob, Vasco McCoy Family Professor of Atmospheric Chemistry and Environmental Engineering at the Harvard School of Engineering and Applied Sciences (SEAS). “It is fully within our power as a nation to reduce our impact.”

Existing air quality regulations and trends in clean energy technology are expected to reduce the amount of harmful nitrogen oxides (NOx) emitted by coal plants and cars over time. However, no government regulations currently limit the amount of ammonia (NH3) that enters the atmosphere through agricultural fertilization or manure from animal husbandry, which are now responsible for one-third of the anthropogenic nitrogen carried on air currents and deposited on land.

“Ammonia’s pretty volatile,” says Jacob. “When we apply fertilizer in the United States, only about 10 percent of the nitrogen makes it into the food. All the rest escapes, and most of it escapes through the atmosphere.”

The team of scientists — comprising researchers from Harvard SEAS, the National Park Service, the USDA Forest Service, the U.S. Environmental Protection Agency, and the University of California, Irvine — presents evidence that unchecked increases in nitrogen deposition are already threatening the ecology of federally protected natural areas.

In many previous studies, environmental scientists have identified the nitrogen levels that would be ecologically harmful in various settings. The new Harvard-led study uses a high-resolution atmospheric model called GEOS-Chem to calculate nitrogen deposition rates across the contiguous United States, and compares those rates to the critical loads.

The findings suggest that many parks may already be suffering.

In Eastern temperate forests, like those in Great Smoky Mountains National Park, the most sensitive elements of the ecosystem are the hardwood trees, which start to suffer when nitrogen deposition reaches approximately 3 to 8 kilograms per hectare, per year. According to the new study, the actual rate of deposition — 13.6 kg/ha/yr — far exceeds that threshold. In the forests of Mount Rainier National Park, it’s the lichens that suffer first; their critical load is between 2.5 and 7.1 kg/ha/yr, and the deposition rate there is at a troubling 6.7 kg/ha/yr.
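The critical-load comparison the study makes can be sketched in a few lines. The numbers below are only the ones quoted in the paragraph above (units: kg N per hectare per year); the classification labels are mine, not the study's:

```python
# Compare modeled nitrogen deposition with ecological critical-load ranges,
# using only the figures cited in the text (kg N per hectare per year).

def exceedance_status(load_low, load_high, deposition):
    """Classify a deposition rate against a critical-load range."""
    if deposition > load_high:
        return "exceeds even the upper critical load"
    if deposition > load_low:
        return "within the critical-load range (sensitive species at risk)"
    return "below the critical load"

parks = {
    # name: (critical_load_low, critical_load_high, modeled_deposition)
    "Great Smoky Mountains (hardwood trees)": (3.0, 8.0, 13.6),
    "Mount Rainier (lichens)": (2.5, 7.1, 6.7),
}

for name, (lo, hi, dep) in parks.items():
    print(f"{name}: {dep} kg/ha/yr -> {exceedance_status(lo, hi, dep)}")
```

Run on the two parks cited, the check reproduces the article's reading: Great Smoky Mountains is above even the upper bound of its range, while Mount Rainier sits inside its lichens' sensitivity range.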

“The lichens might not be noticed or particularly valued by someone walking around a national park, but they’re integral for everything else that’s dependent on them,” explains lead author Raluca A. Ellis, who conducted the research as a postdoctoral fellow at Harvard SEAS. She now directs the Climate and Urban Systems Partnership at the Franklin Institute.

Jacob, Ellis, and their collaborators predict that NOx emissions from the United States will decrease significantly by 2050 (globally, those decreases may be offset to some extent by increases in industrialization overseas). But for ammonia, the story is different. The team predicts significant increases in the amount and density of agricultural land in the Midwest and the West — to feed a growing population and to meet an anticipated demand for biofuels — requiring more and more fertilizer.

“Even if anthropogenic NOx emissions were globally zero, avoiding [critical load] exceedance at all national parks would require a 55% reduction of anthropogenic NH3 emissions,” their report states.

How such a reduction would be achieved is a matter for further study.

“Air quality regulations in the United States have always focused on public health, because air pollution leads to premature deaths, and that’s something you can quantify very well. When you try to write regulations to protect ecosystems, however, the damage is much harder to quantify,” says Jacob. “At least in the national parks you can say, ‘There’s a legal obligation here.'”

The project was funded by the NASA Applied Sciences Program through the Air Quality Applied Sciences Team, which is led by Jacob at Harvard and includes 23 researchers from numerous institutions. The National Park Service has been studying nitrogen deposition for some time now, typically in focused studies such as those at Rocky Mountain National Park and Grand Teton National Park. The new collaboration has enabled many different research teams to unify their efforts and benefit from shared resources like the GEOS-Chem model, which was first developed at Harvard and has become an international standard for modeling atmospheric chemistry over time.

Actual levels of future nitrogen deposition will depend on a complex interplay of economic, legal, and environmental factors.

“The point is, in the decades ahead, the problem in our national parks is not going to be solved by the reduction of NOx emissions alone,” explains Ellis. “It will require a targeted effort to control ammonia.”

“It’s a national issue, and I think that’s why having the national perspective was so important,” Jacob adds. “We’ve shown that most of the nitrogen deposition to parks in the United States is coming from domestic sources. It’s not coming from China; it’s not coming from Canada — it’s something we can deal with, but we need to deal with it at the national level.”

Journal Reference:

  1. R. A. Ellis, D. J. Jacob, M. P. Sulprizio, L. Zhang, C. D. Holmes, B. A. Schichtel, T. Blett, E. Porter, L. H. Pardo, J. A. Lynch. Present and future nitrogen deposition to national parks in the United States: critical load exceedances. Atmospheric Chemistry and Physics, 2013; 13 (17): 9083. DOI: 10.5194/acp-13-9083-2013

The Reasons Behind Crime (Science Daily)

Oct. 10, 2013 — More punishment does not necessarily lead to less crime, say researchers at ETH Zurich who have been studying the origins of crime with a computer model. In order to fight crime, more attention should be paid to the social and economic backgrounds that encourage crime.

Whether a person turns criminal and commits a robbery depends greatly on the socio-economic circumstances in which he lives. (Credit: © koszivu / Fotolia)

People have been stealing, betraying others and committing murder for ages. In fact, humans have never succeeded in eradicating crime, although — according to the rational choice theory in economics — this should be possible in principle. The theory states that humans turn criminal if it is worthwhile. Stealing or evading taxes, for instance, pays off if the prospects of unlawful gains outweigh the expected punishment. Therefore, if a state sets the penalties high enough and ensures that lawbreakers are brought to justice, it should be possible to eliminate crime completely.

This theory is largely oversimplified, says Dirk Helbing, a professor of sociology. The USA, for example, often has far more drastic penalties than European countries. But despite the death penalty in some American states, the homicide rate in the USA is five times higher than in Western Europe. Furthermore, ten times more people sit in American prisons than in many European countries. More repression, however, can sometimes even lead to more crime, says Helbing. Ever since the USA declared the “war on terror” around the globe, the number of terrorist attacks worldwide has increased, not fallen. “The classic approach, where criminals merely need to be pursued and punished more strictly to curb crime, often does not work.” Nonetheless, this approach dominates the public discussion.

More realistic model

In order to better understand the origins of crime, Helbing and his colleagues have developed a new agent-based model that takes the network of social interactions into account and is more realistic than previous models. It includes not only criminals and law enforcers, like many earlier models, but also honest citizens as a third group. Parameters such as the size of penalties and the costs of prosecution can be varied in the model. It also considers spatial dependencies: representatives of the three groups do not interact with one another at random, but only if they encounter each other in space and time. In particular, individual agents imitate the behaviour of agents from other groups if that behaviour looks promising.

Cycles of crime

Using the model, the scientists were able to demonstrate that tougher punishments do not necessarily lead to less crime and, where they do, the reduction is not proportional to the extra punishment effort. The researchers were also able to simulate how crime can suddenly break out and calm down again. Like the pork cycle known from economics or predator-prey cycles from ecology, crime is cyclical as well. This matches observations made, for instance, in the USA: according to the FBI’s Uniform Crime Reporting Program, cyclical changes in the frequency of criminal offences can be found in several American states. “If a state increases the investments in its punitive system to an extent that is no longer cost-effective, politicians will cut the law enforcement budget,” says Helbing. “As a result, there is more room for crime to spread again.”
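The cyclical dynamics described here can be illustrated with a toy model. This is not the ETH group's agent-based model (which also includes space and neighbour-to-neighbour imitation); it is a minimal replicator-style sketch with invented payoff values, showing how the three-way interaction between criminals, enforcers, and honest citizens can by itself produce recurring waves of crime:

```python
import math

def step(c, e, h, g=2.0, p=3.0, r=2.5, k=0.5, l=1.0, dt=0.1):
    """One discrete replicator step for the population shares of
    criminals (c), enforcers (e), and honest citizens (h)."""
    f_c = g * h - p * e   # criminals profit from honest victims, suffer from enforcers
    f_e = r * c - k       # enforcers are rewarded per criminal caught, minus upkeep cost
    f_h = -l * c          # honest citizens lose to crime
    # Shares grow or shrink exponentially with their payoff, then renormalize.
    c, e, h = c * math.exp(dt * f_c), e * math.exp(dt * f_e), h * math.exp(dt * f_h)
    total = c + e + h
    return c / total, e / total, h / total

c, e, h = 0.1, 0.1, 0.8
crime_series = []
for _ in range(2000):
    c, e, h = step(c, e, h)
    crime_series.append(c)
```

With these payoffs, enforcement lags crime: crime grows while enforcers are scarce, enforcers then expand and suppress it, and once crime is rare the fixed cost of enforcement makes the enforcer share shrink again, reopening room for crime — the same budget-cut cycle Helbing describes in the quote.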

“Many crimes have a socio-economic background”

But is there another way of combatting crime, if not with repression? The focus should be on the socio-economic context, says Helbing. As we know from milieu theory in sociology, the environment plays a pivotal role in the behaviour of individuals. The majority of criminal acts have a social background, Helbing argues. For example, individuals who feel that all their friends and neighbours are cheating the state will inevitably wonder whether they should be the last honest people to fill in their tax declarations correctly.

“If we want to reduce the crime rate, we have to keep an eye on the socio-economic circumstances under which people live,” says Helbing. This must not be confused with soft justice. A state’s response to crime has to be differentiated: besides the police and the courts, economic and social institutions are relevant as well, and, in fact, so is every individual when it comes to the integration of others. “Improving social conditions and integrating people socially can probably combat crime much more effectively than building new prisons.”

Journal Reference:

  1. Matjaž Perc, Karsten Donnay, Dirk Helbing. Understanding Recurrent Crime as System-Immanent Collective Behavior. PLoS ONE, 2013; 8 (10): e76063. DOI: 10.1371/journal.pone.0076063

Transgenic mosquitoes in the skies of the sertão (Agência Pública)


10/10/2013 – 10:36 a.m.

By the Agência Pública newsroom

The traps are simple devices installed in the homes of some residents of the experiment area. The so-called ovitraps serve as breeding sites for the females. Photo: Coletivo Nigéria

Promising to reduce dengue, a biofactory of transgenic insects has already released 18 million Aedes aegypti mosquitoes in the interior of Bahia. Read the story and watch the video.

Early on a Thursday evening in September, the bus station of Juazeiro, Bahia, was a picture of desolation. In the dimly lit hall there operated a stall specializing in beef broth, a snack bar with a long counter lined with savory pastries, biscuits, and potato chips, and a single ticket window, with unsettling clouds of mosquitoes over the heads of those waiting to buy tickets to small towns or northeastern capitals.

Set on the banks of the São Francisco River, on the border between Pernambuco and Bahia, Juazeiro was once a city crossed by streams, tributaries of one of the country's largest rivers. Today it has more than 200,000 inhabitants, forms the largest urban agglomeration of the northeastern semi-arid region together with Petrolina (the two cities total half a million people), and is infested with muriçocas (common house mosquitoes, or pernilongos, if you prefer). The watercourses that once drained small springs have become open-air sewers, vast breeding grounds for the insect, traditionally fought with insecticide and electric swatters, or, for the better-off, with closed windows and air conditioning.

But the residents of Juazeiro are not swatting only muriçocas this early spring. The city is the testing center for a new scientific technique that uses transgenic Aedes aegypti to fight dengue, a disease transmitted by the species. Developed by the British biotechnology company Oxitec, the method consists essentially of inserting a lethal gene into male mosquitoes which, released into the environment in large numbers, mate with wild females and produce offspring programmed to die. If the experiment works, the premature death of the larvae progressively reduces the population of this mosquito species.

The technique is the newest weapon against a disease that not only withstands but advances despite the control methods employed so far. The World Health Organization estimates there may be 50 to 100 million cases of dengue per year worldwide. In Brazil the disease is endemic, with annual epidemics in several cities, especially the large capitals. In 2012, between January 1 and February 16 alone, more than 70,000 cases were recorded in the country. In 2013 the number for the same period nearly tripled, to 204,000 cases. So far this year, 400 people have died of dengue in Brazil.

In Juazeiro, the British-patented method is being tested by the social organization Moscamed, which has been breeding the transgenic mosquitoes and releasing them into the open air since 2011. At the biofactory set up in the municipality, capable of producing up to 4 million mosquitoes per week, the insect's entire production chain is carried out, except for the genetic modification itself, performed at Oxitec's laboratories in Oxford. Transgenic larvae were imported by Moscamed and are now bred in the institution's laboratories.

From the start the tests have been funded by the Bahia State Health Department, with institutional support from Juazeiro's municipal health department, and last July they were extended to the municipality of Jacobina, at the northern tip of the Chapada Diamantina. In that mountain town of roughly 80,000 inhabitants, Moscamed is testing the technique's ability to “suppress” (the word scientists use for exterminating an entire mosquito population) Aedes aegypti across a whole city, since in Juazeiro the strategy proved effective but has so far been limited to two neighborhoods.

“The results from 2011 and 2012 showed that [the technique] really worked well. At the invitation of, and with funding from, the Bahia state government, we decided to move forward and go to Jacobina. No longer as a pilot, but as a test to really eliminate the [mosquito] population,” says Aldo Malavasi, a retired professor of the Genetics Department of the Institute of Biosciences of the University of São Paulo (USP) and current president of Moscamed. USP is also part of the project.

Malavasi has been working in the region since 2006, when Moscamed was created to fight an agricultural pest, the fruit fly, with a similar approach: the Sterile Insect Technique (SIT). The logic is the same: produce sterile insects to mate with the wild females and thus gradually shrink the population. The difference lies in how the insects are sterilized: radiation rather than genetic modification. SIT has been widely used since the 1970s, mainly against species considered threats to agriculture. The problem is that until now the technology was not suited to mosquitoes such as Aedes aegypti, which did not withstand the radiation well.

The communication plan

The first field releases of the transgenic Aedes were carried out in the Cayman Islands between late 2009 and 2010. The British territory in the Caribbean, made up of three islands south of Cuba, proved to be not only a tax haven (more companies are registered on the islands than there are inhabitants, about 50,000) but also a convenient place to release transgenic mosquitoes, owing to the absence of biosafety laws. The Cayman Islands are not signatories of the Cartagena Protocol, the main international document on the subject, nor are they covered by the Aarhus Convention, approved by the European Union and binding on the United Kingdom, which deals with access to information, participation, and justice in environmental decision-making.

Instead of the prior publication of, and public consultation on, the risks involved in the experiment, as the international agreements above would require, the roughly 3 million mosquitoes released into the tropical climate of the Cayman Islands went out into the world without any public debate or consultation. Authorization was granted exclusively by the islands' Department of Agriculture. Oxitec's local partner in the tests, the Mosquito Research & Control Unit, posted a promotional video on the subject only in October 2010, and even then without mentioning the transgenic nature of the mosquitoes. The video appeared exactly one month before Oxitec itself presented the results of the experiments at the annual meeting of the American Society of Tropical Medicine and Hygiene in the United States.

The scientific community was taken aback by the news that the world's first releases of genetically modified insects had already taken place without even the specialists in the field knowing about it. The surprise extended to the result: according to Oxitec's data, the experiments had achieved an 80% reduction in the Aedes aegypti population in the Cayman Islands. For the company, the figure confirmed that the laboratory-developed technique could indeed work. Since then, new field tests have been arranged in other countries, notably underdeveloped or developing ones with tropical climates and a long history of dengue.

After postponing similar tests in 2006 following protests, Malaysia became the second country to release the transgenic mosquitoes, between December 2010 and January 2011. Six thousand mosquitoes were released in an uninhabited area of the country. That number, already much smaller than in the Cayman Islands, is almost insignificant next to the quantity of mosquitoes released in Juazeiro, Bahia, from February 2011 onward. The city, joined more recently by Jacobina, has since become the largest testing ground of its kind in the world, with more than 18 million mosquitoes released so far, according to Moscamed's figures.

“Oxitec made serious mistakes, both in Malaysia and in the Cayman Islands. Unlike what they did, we carried out extensive work in what we call public communication, with total transparency, discussion with the community, and visits to every home. Extraordinary work was done here,” Aldo Malavasi says.

In a telephone interview, he was keen to stress Moscamed's independence from Oxitec and to underline the different nature of the two institutions. Created in 2006, Moscamed is a social organization, and therefore non-profit, which joined the tests of the transgenic Aedes aegypti to verify whether or not the technique works against dengue. According to Malavasi, they accepted no funding from Oxitec precisely to guarantee impartiality in evaluating the technique. “We don't want their money, because our goal is to help the Brazilian government,” he sums up.

In the interest of transparency, the program was named the “Projeto Aedes Transgênico” (PAT, Transgenic Aedes Project), putting the thorny word right in its title. Another semantic decision was to avoid the term “sterile,” common in the British company's discourse but technically incorrect, since the mosquitoes do produce offspring; the offspring are simply programmed to die at the larval stage. A jingle rendered the complex system in popular language, to the rhythm of forró pé-de-serra. And the carnival block “Papa Mosquito” took to the streets of Juazeiro during the 2011 Carnival.

At the institutional level, besides the funding from the state health department, the program also won the support of Juazeiro's municipal health department. “At first there was resistance, because people didn't want traps in their homes either, but with time they came to understand the project and we had good popular acceptance,” says public health nurse Mário Machado, the department's director of Health Promotion and Surveillance.

The traps Machado mentions are simple devices installed in the homes of some residents of the experiment area. The so-called ovitraps serve as breeding sites for the females. They make it possible to collect the eggs and check whether they were fertilized by transgenic or by wild males. That check is possible because the genetically modified mosquitoes carry, in addition to the lethal gene, a fragment of jellyfish DNA that gives them a fluorescent marker visible under the microscope.

In this way it was possible to verify that the reduction of the wild Aedes aegypti population reached, according to Moscamed, 96% in Mandacaru, an agricultural settlement a few kilometers from Juazeiro's commercial center which, thanks to its geographic isolation and popular acceptance, became the ideal site for the releases. Despite that figure, Moscamed keeps releasing mosquitoes in the neighborhood. Because of the mosquito's short life span (the female lives about 35 days), the releases must continue in order to keep the wild population low. Currently, once a week a car leaves the organization's headquarters carrying 50,000 mosquitoes, distributed by the thousands in plastic pots that are opened in the streets of Mandacaru.

“Today the greatest acceptance is in Mandacaru. The receptiveness was such that Moscamed doesn't want to leave anymore,” Mário Machado emphasizes.

The same did not happen in Itaberaba, the first neighborhood to receive the mosquitoes in early 2011. Not even its historically high rate of Aedes aegypti infestation persuaded this peripheral Juazeiro neighborhood, next door to Moscamed's headquarters, to welcome the experiment. Mário Machado estimates at “around 20%” the share of the population that opposed the tests and put an end to the releases.

“However much we try to inform people, going house to house, bar to bar, some just don't believe it: ‘No, you're lying to us, this mosquito is biting us,’” he says with resignation.

After a year without releases, the mosquito seems to have left few memories there. Walking through the neighborhood, we could hardly find anyone who knew what we were talking about. Nevertheless, the name Itaberaba traveled the world when Oxitec announced that its first field experiment in Brazil had achieved an 80% reduction in the wild mosquito population.

Moscamed field supervisor Luiza Garziera, a biologist, was one of those who went house to house explaining the process, at times sidestepping scientific language to make herself understood. “I would say that we would be releasing these mosquitoes, that we released only the males, which don't bite. Only the female bites. And that when these males ‘date’ the female (because sometimes we can't say ‘copulate,’ people won't understand), their little offspring end up dying.”

This is one of the most important details of the novel technique. By releasing only males, at a rate of 10 transgenic mosquitoes for every wild one, Moscamed plunges people into a cloud of mosquitoes but guarantees that these will not bite them. That is because only the female feeds on human blood, the liquid that supplies the proteins she needs to produce eggs.
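A back-of-the-envelope sketch shows why flooding an area at 10 transgenic males per wild male suppresses the population. The per-generation growth factor and starting population below are invented for illustration; real mosquito dynamics (overlapping generations, migration, density dependence) are far more complicated:

```python
def next_generation(wild, ratio=10.0, growth=3.0):
    """Idealized discrete generation: a wild female mates a transgenic male
    with probability ratio/(ratio+1); those matings leave no surviving
    offspring, so only the 1/(ratio+1) wild-male matings reproduce."""
    return growth * wild / (ratio + 1.0)

# With growth=3 and a 10:1 release ratio, each generation shrinks by 3/11.
pop = 10_000.0
history = [pop]
while pop > 1.0 and len(history) < 50:
    pop = next_generation(pop)
    history.append(pop)

print(f"generations to fall below one female: {len(history) - 1}")
```

Holding the ratio fixed is actually conservative: a constant weekly release means the effective ratio rises as the wild population falls, which accelerates the suppression.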

The technology fits together convincingly, even didactically, except perhaps for the “genetic modification,” which demands higher flights of imagination. Even so, ignorance of the subject still prevails among a considerable share of the residents interviewed for this report. At best, people know it has to do with exterminating the dengue mosquito, which is naturally a good thing. Beyond that, they have merely heard of it, or venture a guess involving the muriçoca, which is indeed widely hated.

Assessing the risks

Despite Moscamed's communication campaign, the British NGO GeneWatch points to a series of problems in the Brazilian process. Chief among them: the risk assessment report on the experiment was not made available to the public before the releases began. On the contrary, at the request of those in charge of the Transgenic Aedes Project, the file submitted to the CTNBio (Comissão Técnica Nacional de Biossegurança, the body charged with authorizing such experiments) was declared confidential.

“We think Oxitec should have the fully informed consent of the local population, which means people need to agree to the experiment. But for that they also need to be informed about the risks, just as you would be if you were being used to test a new cancer drug or any other kind of treatment,” said Helen Wallace, the NGO's executive director, in an interview over Skype.

A specialist in the risks and the ethics involved in this kind of experiment, Wallace published this year the report Genetically Modified Mosquitoes: Ongoing Concerns, which lists in 13 chapters what she considers potential risks that were not weighed before the release of the transgenic mosquitoes was authorized. The document also points to flaws in Oxitec's conduct of the experiments.

For example, two years after the Cayman Islands releases, only the results of one small test had appeared in a scientific publication. In early 2011 the company submitted the results of the islands' largest experiment to the journal Science, but the article was not published. Only in September of last year did the text appear in another journal, Nature Biotechnology, published as “correspondence,” meaning it underwent no review by other scientists, only checking by the journal's own editor.

For Wallace, the absence of critical peer review puts the Oxitec experiment under suspicion. Even so, analysis of the article, according to the report, suggests the company had to raise the release ratio of transgenic mosquitoes and concentrate them in a small area to achieve the expected results. The same is said to have happened in Brazil, in Itaberaba. Moscamed has likewise not yet published the results of the Brazilian test. Project manager Danilo Carvalho said one article has already been submitted to a journal and another is in the final stages of writing.

Another risk singled out by the document lies in the widespread use of the antibiotic tetracycline. The drug suppresses the lethal gene and thereby ensures, in the laboratory, the survival of the genetically modified mosquito, which otherwise would not reach adulthood. This is the vital difference between the fate of the mosquitoes bred in the laboratory and that of their offspring, produced in the wild from wild females: without the antibiotic, the offspring are doomed to premature death.

Tetracycline is commonly used in the livestock and aquaculture industries, which discharge large amounts of the substance into the environment through their effluents. The antibiotic is also widely used in human and veterinary medicine. In other words, genetically modified eggs and larvae could come into contact with the antibiotic even in uncontrolled environments and thus survive. Over time, the transgenic mosquitoes' resistance to the lethal gene could neutralize its effect, and in the end we would have a new genetically modified species adapted to the environment.

Oxitec treats this hypothesis with skepticism, playing down the possibility of it happening in the real world. A confidential document since made public, however, shows that the hypothesis proved real, by accident, in the tests of a research partner of the company. Puzzled by a 15% survival rate among larvae raised without tetracycline, far above the usual 3% found in the company's own experiments, Oxitec's scientists discovered that the cat food their partners were feeding the mosquitoes contained traces of the antibiotic, which is routinely used to treat chickens destined for animal feed.

The GeneWatch report draws attention to the common presence of the antibiotic in human and animal waste, as well as in domestic sewage systems such as septic tanks. This would pose a potential risk, since several studies have found that Aedes aegypti can breed in contaminated water, although that is not yet its most common behavior, nor does it yet occur in Juazeiro, according to the municipal health department.

There are also concerns about the rate at which transgenic females are released. The pupae (the last stage before adulthood) are separated by hand, with the help of a device that sorts the sexes by size (the female is slightly larger). Up to 3% of the females can slip through this process, gaining their freedom and increasing the risks involved. Finally, the experiments have not yet verified whether the reduction in the mosquito population translates directly into reduced dengue transmission.

Oxitec and Moscamed rebut all of these criticisms, saying they maintain rigorous quality control, including constant monitoring of the female release rate and of larval survival without tetracycline. Any sign of mutation in the mosquito would therefore be detected in time to suspend the program, and within roughly a month all the released insects would be dead. According to the institutions in charge, the mosquitoes also do not pass on the modified genes even if a stray female bites a human.

Transgenic mosquitoes for sale

Last July, after the success of the field trials in Juazeiro, Oxitec filed an application for a commercial licence with the National Technical Biosafety Commission (CTNBio). Since late 2012, the British company has been registered as a legal entity in Brazil (holding a CNPJ) and keeps an employee in São Paulo. More recently, with the promising results of the Juazeiro experiments, it rented a warehouse in Campinas and is building what will become its Brazilian headquarters. Brazil is now its most likely and imminent market, which keeps the company's global director of business development, Glen Slade, shuttling between Oxford and São Paulo.

“Oxitec has been working since 2009 in partnership with USP and Moscamed, which are good partners and gave us the opportunity to start projects in Brazil. But we have now just sent our commercial dossier to CTNBio and hope to obtain a registration in the future, so we need to grow our team in the country. We are clearly investing in Brazil. It is a very important country,” Slade said in a Skype interview from Oxitec's headquarters in Oxford, England.

The biotechnology firm is a spin-out of the British university, meaning that Oxitec emerged from the laboratories of one of the most prestigious universities in the world. Founded in 2002, it has since raised funding from private investors and non-profit foundations, such as the Bill & Melinda Gates Foundation, to finance its ongoing research. According to Slade, more than R$50 million has been spent over the past decade refining and testing the technology.

The executive expects the bureaucratic process for granting the commercial licence to conclude as early as next year, by which time Oxitec's Brazilian headquarters, including a new biofactory, should be ready. Already in contact with several Brazilian municipalities, he prefers not to name names. Nor will he reveal the price of the service, which will probably be offered in annual mosquito-control packages, with budgets depending on the number of inhabitants of each city.

“At this point it is hard to give a price. As with all new products, the production cost is higher at the start than we would like. I think the price will be very reasonable in relation to the benefits and to other approaches to controlling the mosquito, but it is very hard to say today. Besides, the price will change with the scale of the project. Small projects are not very efficient, but if we had the opportunity to control mosquitoes across the whole of Rio de Janeiro, we could work at large scale and the price would come down,” he suggests.

The company also intends to build new biofactories in cities that take on large projects, which would reduce costs in the long run, since releases must be maintained indefinitely to prevent the return of wild mosquitoes. The reproductive speed of Aedes aegypti is a concern: if the project is halted, the species can rebuild its population within a few weeks.

“The company's plan is to secure repeat payments for releasing these mosquitoes every year. If their technology works and really does reduce the incidence of dengue, you will not be able to suspend these releases and you will be locked into the system. One of the biggest long-term worries is that if things start to go wrong, or even just become less effective, you could genuinely end up with a worse situation over many years,” Helen Wallace warns.

The risks would range from reduced human immunity to the disease to the dismantling of other public dengue-control policies, such as teams of health agents. Although both Moscamed and Juazeiro's own Health Department stress the complementary nature of the technique, which would not do away with other control methods, conflicts over the allocation of resources are plausible. Today, according to Mário Machado of the Health Department, Juazeiro spends on average R$300,000 a month on the control of endemic diseases, of which dengue is the main one.

The department is negotiating with Moscamed to extend the experiment to the whole municipality, or even to the entire metropolitan region formed by Juazeiro and Petrolina – a trial that would cover half a million people – in order to assess its effectiveness across large populations. In any case, and despite the progress of the experiments, neither the Brazilian social organisation nor the British company has presented price estimates for a possible commercial release.

“Yesterday we were doing the first studies, to analyse what their price is and what ours is. Because they know how much their programme costs, and it is not cheap, but they do not disclose it,” said Mário Machado.

In a report in the British newspaper The Observer last July, Oxitec estimated the cost of the technique at “less than” £6 per person per year. By a simple calculation – just multiplying that figure by the current exchange rate of the British pound against the real, and ignoring the many other variables involved – the project in a city of 150,000 inhabitants would cost roughly R$3.2 million a year.
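The back-of-envelope figure above is easy to reproduce. A minimal sketch follows; note that the exchange rate is an assumption on our part (roughly the late-2013 pound-to-real quote the calculation appears to use), not a number given in the article:

```python
# Reproducing the article's rough annual-cost estimate.
# Assumption (not from the article): GBP/BRL exchange rate of ~3.55.
cost_per_person_gbp = 6.0      # Oxitec's "less than £6" per person per year
population = 150_000           # a city of 150,000 inhabitants
gbp_to_brl = 3.55              # assumed exchange rate

annual_cost_brl = cost_per_person_gbp * population * gbp_to_brl
print(f"≈ R$ {annual_cost_brl / 1e6:.1f} million per year")  # ≈ R$ 3.2 million
```

Any change in the exchange rate, or in Oxitec's per-person price, scales the result linearly.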

If we consider the number of small and medium-sized Brazilian municipalities where dengue is endemic, the size of the market opening up becomes clear – even leaving aside, for now, the country's large urban centres, which would exceed the technique's current capacity. Yet this is only one slice of the business. Oxitec also has a series of other transgenic insects, these aimed at controlling agricultural pests, which should find fertile ground in Brazil, one of the giants of world agribusiness.

Awaiting authorisation from CTNBio, Moscamed is already preparing to test a transgenic fruit fly, which follows the same logic as the Aedes aegypti. Beyond this, Oxitec has four other genetically modified species that may one day be tested in Brazil, starting with Juazeiro and the São Francisco Valley. The region is one of the country's largest producers of fresh fruit for export: 90% of all the grapes and mangoes Brazil exports leave from here, a production that demands relentless pest control. Along the main avenues of Juazeiro and Petrolina, shops selling agricultural supplies and pesticides line up one after another, their signs displaying the logos of the sector's multinationals.

“We have no concrete plans [beyond the fruit fly], but of course we would very much like the opportunity to run trials with those products as well. Brazil has a very large agricultural industry. At the moment, though, our number one priority is the dengue mosquito. Once that project is sufficiently funded, we will try to add agricultural projects,” Slade commented.

He and several of his colleagues in the company's upper ranks previously worked at one of the giants of agribusiness, Syngenta. For Helen Wallace, this is one of the facts that reveal the transgenic Aedes aegypti to be the pioneer of a whole new market in genetically modified insects: “We think Syngenta is mainly interested in agricultural pests. One of the plans we know of is a proposal to use genetically modified agricultural pests together with transgenic seeds in order to increase those crops' resistance to pests.”

“There is no relationship between Oxitec and Syngenta of that kind. Perhaps in the future we may have the chance to work together. I personally am interested in pursuing projects we could do with Syngenta, BASF or other large agricultural companies,” Glen Slade clarifies.

In 2011, the pesticide industry earned R$14.1 billion in Brazil. The largest market of its kind in the world, the country may in the coming years usher in a new technological era in pest control – just as in public health, with the transgenic Aedes aegypti, which appears to have a promising commercial future. It remains to be seen, however, how the technique will coexist with the dengue vaccines now in final-stage trials – one developed by a French laboratory, the other by the Butantan Institute in São Paulo. The vaccines are expected to reach the public in 2015; the transgenic mosquito perhaps as early as next year.

Among the lineages of transgenic mosquitoes, a home-grown version may also emerge. As confirmed by Professor Margareth de Lara Capurro-Guimarães, of USP's Department of Parasitology and coordinator of the Transgenic Aedes Program, the university is already studying a transgenic muriçoca (a common nuisance mosquito). Yet another possible technological fix for a public health problem in Juazeiro, Bahia – a city where, according to a 2011 survey by the National Sanitation Information System (SNIS), the sewage network serves only 67% of the urban population.

* Originally published on the Agência Pública website.

(Agência Pública)

Brazil Going Against the Flow (IPS)

Inter Press Service – Reports

11/10/2013 – 09:20

by Fabíola Ortiz, IPS


Traffic on 23 de Maio Avenue, São Paulo. Photo: Photostock/IPS

Rio de Janeiro, Brazil, 11/10/2013 – Over the past five years, in the midst of the international economic crisis, Brazil has joined the group of major global polluters whose main source of greenhouse gases is the burning of fossil fuels. The country is taking on a climate-pollution profile typical of the first world, according to scientist José Marengo, one of the authors of the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC), whose first, unedited volume was released on 30 September.

This is due, in part, to a simple industrial and consumer phenomenon. Tax exemptions to stimulate car and motorcycle sales had a positive effect on economic growth. At the same time, however, they produced a dizzying expansion of the vehicle fleet. The number of cars doubled in a decade, rising from 24.5 million in 2001 to 50.2 million in 2012, according to the report Evolução da Frota de Automóveis e Motos no Brasil – Relatório 2013 (Evolution of the Car and Motorcycle Fleet in Brazil – 2013 Report), released yesterday.

Motorcycles grew even more spectacularly over the same period, from 4.5 million to 19.9 million. Brazil “ended 2012 with a total fleet of 76,137,125 motor vehicles. In 2001, there were approximately 31.8 million units. The increase was therefore 138.6%,” states the document, published by the Observatório das Metrópoles. “It is worth recalling that the country's population growth between the last two censuses (2000 and 2010) was 11.8%,” it adds.
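The quoted growth figures are straightforward to check. A quick sketch (note the 2001 baseline is the report's rounded "approximately 31.8 million", so the result only approximates the 138.6% the report states from exact counts):

```python
# Sanity check on the fleet-growth percentages quoted from the
# Observatório das Metrópoles report.
fleet_2012 = 76_137_125        # exact figure quoted in the report
fleet_2001 = 31_800_000        # the report's rounded baseline

growth = (fleet_2012 - fleet_2001) / fleet_2001 * 100
print(f"Total fleet growth 2001-2012: {growth:.1f}%")  # ≈ 139%, matching 138.6% within rounding

cars_growth = (50.2e6 - 24.5e6) / 24.5e6 * 100
print(f"Car growth: {cars_growth:.0f}%")  # roughly a doubling, as the article says
```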

“It is worrying, because we have always criticised the developed countries for this,” observed Marengo, who heads the Earth System Science Centre at the National Institute for Space Research. This trend contrasts with the reduction in the country's intense deforestation, widely publicised by the Brazilian authorities.

On 27 September, when the IPCC released its Summary for Policymakers, the Secretary of Research and Development at the Ministry of Science and Technology, Carlos Nobre, told IPS that the country had managed to cut its greenhouse-gas emissions by 38.4% between 2005 and 2010, thanks to reduced deforestation in the Amazon.

In 2009, Brazil committed to cutting its greenhouse-gas emissions by between 36.1% and 38.9%, under two scenarios of GDP growth. The government claims it has already covered 62% of the way to that target, thanks to the sharp drop in deforestation. Until 2009, deforestation accounted for 60% of Brazil's climate pollution, with fossil-fuel use in second place. Now new problems are emerging.

“If we had a reliable and comfortable mass transit system, people would leave their cars at home. But travelling at certain times of day on the metro in São Paulo or Rio de Janeiro (two of the country's largest cities) is a humiliation,” Marengo told IPS. “That has to change, and the only way is to foster decent public transport.”

For the public policy director of Greenpeace Brasil, Sergio Leitão, this change of profile also coincides with the priority given to new ventures, such as the prospecting and exploitation of pre-salt oil deposits lying more than seven thousand metres deep on the continental shelf. “We are starting to exploit the pre-salt and our big cities are crammed with cars,” Leitão noted. While the world moves towards new energy models, Brazil is going the other way, the activist says, making it impossible for the country to be “a friend of the planet.”

The IPCC report says the changes observed since 1950 are unprecedented and demonstrate that human action is an unequivocal cause of the global warming recorded since the mid-20th century. It stresses that humanity must make every effort to keep the planet's climate within the bounds of the most optimistic scenario, with global warming not exceeding two degrees this century.

To achieve that, according to Leitão, the “fundamental, urgent and unavoidable measures” are to change the production model and drastically cut the consumption of oil, gas and coal. “We are concerned that in Brazil the pre-salt is seen as the great economic opportunity of the future,” he said. In the energy sector, the bulk of investment is going to making pre-salt oil exploitation viable, with up to US$340 million earmarked through 2020, he noted.

Leitão added that “a different course would be needed, one of research into renewable and clean energy. Brazil stands out for its abundance of sun and wind. We need to energise those fronts and create technological substitutes for fossil fuels.”

Marengo pointed out that even if the whole world stopped emitting greenhouse gases today, it would take 20 years to halt the climate transformations already set in motion. “The IPCC speaks of roughly two decades, because carbon dioxide (CO2) has been accumulating for hundreds of years. Photosynthesis in forests can help absorb CO2, but that is not immediate and involves decades of inertia,” he said.

Mitigation measures – those that reduce the quantity of gases released into the atmosphere – are expensive and their effects are long-term, but they are the only ones that can minimise future impacts, added Marengo, who expects the most severe impacts to begin to be felt after 2040.

Adapting to these changes is possible, but the message the IPCC intends to deliver to the next world climate summit, meeting in Warsaw in November, is that measures must be taken to avoid the most pessimistic scenarios, with average temperature rises above two degrees.

Marengo lamented that the environmental agenda has taken a back seat since the global economic and financial crisis began in 2008. “It is impossible for a country in a bad economic situation to sign up to an environmental treaty, because it will carry a high social cost,” he stressed.


Building Cyberinfrastructure Capacity for the Social Sciences (American Anthropological Association)

Posted on October 9, 2013 by Joslyn O.

Today’s guest blog post is by Dr. Emilio Moran. Dr. Moran is Distinguished Professor Emeritus, Indiana University and Visiting Hannah Distinguished Professor, Michigan State University.

The United States and the world are changing rapidly. These new conditions challenge the ability of the social, behavioral and economic sciences to understand what is happening at a national scale and in people's daily local lives. Globalization, the shifting composition of the economy, and the revolution in information brought about by the internet and social media are just a few of the forces changing Americans' lives. Not only has the world changed since the data collection methods currently in use were developed, but the ways now available to link information and new data sources have also changed radically. Expert panels have called for increasing the cyberinfrastructure capability of the social, behavioral, and economic (SBE) sciences so that our tools and research infrastructure keep pace with these changing social and informational landscapes. A series of workshops over the past three years has addressed these challenges, and the organizers now invite you to provide feedback on the proposal below and to attend a Special Event at this year's AAA meeting in Chicago, Saturday, November 23, 2013, from 12:15 to 1:30 pm in the Chicago Hilton Boulevard C room.

Needed is a new national framework, or platform, for social, behavioral and economic research that is both scalable and flexible; that permits new questions to be addressed; that allows for rapid response and adaptation to local shocks (such as extreme weather events or natural resource windfalls); and that facilitates understanding local manifestations of national phenomena such as economic downturns.  To advance a national data collection and analysis infrastructure, the approach we propose —  building a network of social observatories — is a way to have a sensitive instrument to measure how local communities respond to a range of natural and social conditions over time.  This new scientific infrastructure will enable the SBE sciences to contribute to societal needs at multiple levels and will facilitate collaboration with other sciences in addressing questions of critical importance.

Our vision is that of a network of observatories designed from the ground up, each observatory representing an area of the United States.  From a small number of pilot projects the network would develop (through a national sampling frame and protocol) into a representative sample of the places where people live and the people who live there. Each observatory would be an entity, whether physical or virtual, that is charged with collecting, curating, and disseminating data from people, places, and institutions in the United States.  These observatories must provide a basis for inference from what happens in local places to a national context and ensure a robust theoretical foundation for social analysis.  This is the rationale for recommending that this network of observatories be built on a population-based sample capable of addressing the needs of the nation’s diverse people but located in the specific places and communities where they live and work.  Unlike most other existing research platforms, this population and place-based capability will ensure that we understand not only the high-density urban and suburban places where the majority of the population lives, but also the medium- and low-density exurban and rural places that represent a vast majority of the land area in the nation.

To accomplish these objectives, we propose to embed in these regionally-based observatories a nationally representative population-based sample that would enable the observatory data to be aggregated in such a way as to produce a national picture of the United States on an ongoing basis.  The tentative plan would be to select approximately 400 census tracts to represent the U.S. population while also fully capturing the diversity that characterizes local places. The individuals, institutions and communities in which these census tracts are embedded will be systematically studied over time and space by observatories spread across the country. During the formative stages the number of census tracts and the number of observatories that might be needed, given the scope of the charge that is currently envisioned, will be determined.

These observatories will study the social, behavioral and economic experiences of the population in their physical and environmental context at fine detail. The observatories are intended to stimulate the development of new directions and modes of inquiry. They will do so through the use of diverse complementary methods and data sources including ethnography, experiments, administrative data, social media, biomarkers, and financial and public health records. These observatories will work closely with local and state governments to gain access to administrative records that provide extensive data on the population in those tracts (i.e. 2 million people), thereby providing a depth of understanding and integration of knowledge that is less invasive and less subject to declining response rates than survey-derived data.

To attain the vision proposed here we need the commitment and enthusiasm of the community to meet these challenges and the resolve to make this proposed network of observatories useful to the social sciences and society. For more details on our objectives and reports from previous meetings, please visit the project site and contribute your ideas there so that the proposal can benefit from your input, and come to Chicago for the Special Event on Saturday, November 23, 2013. We are particularly interested in hearing how this platform could help you in your future research. This is an opportunity for anthropological strengths in ethnography and local research to contribute insights in a way that will make a difference for local people and for the nation.

Emilio F. Moran, co-Chair of the SOCN
Distinguished Professor Emeritus, Indiana University and
Visiting Hannah Distinguished Professor, Michigan State University

Chimpanzees of a Feather Sit Together: Friendships Are Based On Similar Personalities (Science Daily)

Oct. 9, 2013 — Like humans, many animals have close and stable friendships. However, until now, it has been unclear what makes particular individuals bond. Cognitive Biologists of the University of Vienna, Austria, and the University of Zurich, Switzerland, explored the question and found that chimpanzees choose their friendships based on similarity of personality.

The results of this study appear in the scientific journal Evolution and Human Behaviour.

Chimpanzee Tushi. (Credit: Jorg Massen)

Jorg Massen (University of Vienna) and Sonja Koski (University of Zurich) together measured chimpanzee personality in two zoos with behavioural experiments and years of observations of chimpanzee behaviour. They also carefully logged which chimpanzee sat in body contact with whom most. “This is a clear sign of friendship among chimpanzees,” explains Jorg Massen. Subsequently, the researchers tested whether chimpanzees who frequently sit together have similar or different personality types.

“We found that, especially among unrelated friends, the most sociable and bold individuals preferred the company of other highly sociable and bold individuals, whereas shy and less sociable ones spent time with other similarly aloof and shy chimpanzees,” says the researcher. The researchers argue that such a strong preference for self-like individuals is probably adaptive, because frequent cooperation becomes more reliable when both partners have similar behavioural tendencies and emotional states.

This finding strongly resembles the known “similarity effect” in humans: we tend to make friends with people who are as extraverted, friendly and bold as ourselves. “It appears that what draws and keeps both chimpanzee and human friends together is similarity in gregariousness and boldness, suggesting that preference for self-like friends dates back to our last common ancestor,” concludes Jorg Massen.

Journal Reference:

  1. Jorg J.M. Massen, Sonja E. Koski. Chimps of a feather sit together: chimpanzee friendships are based on homophily in personality. Evolution and Human Behavior, 2013; DOI: 10.1016/j.evolhumbehav.2013.08.008

Terrestrial Ecosystems at Risk of Major Shifts as Temperatures Increase (Science Daily)

Oct. 8, 2013 — Over 80% of the world’s ice-free land is at risk of profound ecosystem transformation by 2100, a new study reveals. “Essentially, we would be leaving the world as we know it,” says Sebastian Ostberg of the Potsdam Institute for Climate Impact Research, Germany. Ostberg and collaborators studied the critical impacts of climate change on landscapes and have now published their results in Earth System Dynamics, an open access journal of the European Geosciences Union (EGU).

This image shows simulated ecosystem change by 2100, depending on the degree of global temperature increase: 2 degrees Celsius (upper image) or 5 degrees Celsius (lower image) above preindustrial levels. The parameter Γ (Gamma) measures how far apart a future ecosystem under climate change would be from the present state. Blue colours (lower Γ) depict areas of moderate change; yellow to red areas (higher Γ) show major change. The maps show the median value of Γ across all climate models, meaning at least half of the models agree on major change in the yellow to red areas, and at least half of the models are below the threshold for major change in the blue areas. (Credit: Ostberg et al., 2013)

The researchers state in the article that “nearly no area of the world is free” from the risk of climate change transforming landscapes substantially, unless mitigation limits warming to around 2 degrees Celsius above preindustrial levels.

Ecosystem changes could include boreal forests being transformed into temperate savannas, trees growing in the freezing Arctic tundra or even a dieback of some of the world’s rainforests. Such profound transformations of land ecosystems have the potential to affect food and water security, and hence impact human well-being just like sea level rise and direct damage from extreme weather events.

The new Earth System Dynamics study indicates that up to 86% of the remaining natural land ecosystems worldwide could be at risk of major change in a business-as-usual scenario (see note). This assumes that the global mean temperature will be 4 to 5 degrees warmer at the end of this century than in pre-industrial times — given many countries’ reluctance to commit to binding emissions cuts, such warming is not out of the question by 2100.

“The research shows there is a large difference in the risk of major ecosystem change depending on whether humankind continues with business as usual or if we opt for effective climate change mitigation,” Ostberg points out.

But even if the warming is limited to 2 degrees, some 20% of land ecosystems — particularly those at high altitudes and high latitudes — are at risk of moderate or major transformation, the team reveals.

The researchers studied over 150 climate scenarios, looking at ecosystem changes in nearly 20 different climate models for various degrees of global warming. “Our study is the most comprehensive and internally consistent analysis of the risk of major ecosystem change from climate change at the global scale,” says Wolfgang Lucht, also an author of the study and co-chair of the research domain Earth System Analysis at the Potsdam Institute for Climate Impact Research.

Few previous studies have looked into the global impact of rising temperatures on ecosystems, because of how complex and interlinked these systems are: “Comprehensive theories and computer models of such complex systems and their dynamics up to the global scale do not exist.”

To get around this problem, the team measured simultaneous changes in the biogeochemistry of terrestrial vegetation and the relative abundance of different vegetation species. “Any significant change in the underlying biogeochemistry presents an ecological adaptation challenge, fundamentally destabilising our natural systems,” explains Ostberg.

The researchers defined a parameter to measure how far apart a future ecosystem under climate change would be from the present state. The parameter encompasses changes in variables such as the vegetation structure (from trees to grass, for example), the carbon stored in the soils and vegetation, and freshwater availability. “Our indicator of ecosystem change is able to measure the combined effect of changes in many ecosystem processes, instead of looking only at a single process,” says Ostberg.

He hopes the new results can help inform the ongoing negotiations on climate mitigation targets, “as well as planning adaptation to unavoidable change.”


Even though 86% of land ecosystems are at risk if global temperature increases by 5 degrees Celsius by 2100, it is unlikely that all of these areas will be affected, since that would require the worst-case scenario from every climate model to come true.

Journal Reference:

  1. S. Ostberg, W. Lucht, S. Schaphoff, D. Gerten. Critical impacts of global warming on land ecosystems. Earth System Dynamics, 2013; 4 (2): 347 DOI: 10.5194/esd-4-347-2013

Explosive Dynamic Behavior On Twitter and in the Financial Market (Science Daily)

Oct. 7, 2013 — Over the past 10 years, social media has changed the way that people influence each other. By analysing data from the social networking service Twitter and stock trading in the financial market, researchers from the Niels Bohr Institute have shown that events in society give rise to common behaviour among large groups of people who do not otherwise know each other. The analysis shows that there are common features in user activity on Twitter and in stock market transactions in the financial market.

The results are published in the scientific journal, PNAS, Proceedings of the National Academy of Sciences.

The figure shows how often the international brands IBM, Pepsi and Toyota were mentioned on Twitter during a five-week period. Activity is relatively steady over long periods and is then interrupted by sudden spikes. The research from NBI shows that there are common features in user activity on Twitter and in stock market transactions in the financial market. (Credit: Niels Bohr Institute)

“The whole idea of the study is to understand how social networks function. The strength of using the popular social medium Twitter is that it has more than 200 million users worldwide, who write short messages about immediate experiences and impressions. This means that you can directly study human behaviour in crowds on the web. Twitter can be seen as a global social sensor,” explains Joachim Mathiesen, Associate Professor of physics at the Niels Bohr Institute at the University of Copenhagen.

Dynamic Twitter behaviour

Joachim Mathiesen developed a programme that could follow the use of Twitter constantly. He could see that there were periods with relatively steady activity and then there would be a very abrupt and intense upswing in activity. Suddenly there was an event that everyone had to respond to and there was an explosion in the amount of online activity.

“There arises a collective behaviour between people who otherwise do not know each other, but who are coupled together via events in society,” explains Joachim Mathiesen.

The analysis also took into account how frequently approximately 100 international brands, like Pepsi, IBM, Apple, Nokia, Toyota, etc. occurred in messages on Twitter. Here too, the level is characterised by days of steady activity, which is interrupted by sudden, but brief explosions of activity.

“Something happens that encourages people to write on Twitter, and suddenly the activity explodes. This is a kind of horde behaviour that is driven by an external event and gets crowds to react,” says Joachim Mathiesen.

But why is a physicist concerning himself with social behaviour?

“As physicists, we are good at understanding large amounts of complex data, and we can find order in this sea of apparent coincidences. Complex systems appear in many contexts, and here we are simply learning about human behaviour in large social groups,” he explains.

The model calculations shed light on the statistical properties of large-scale user activity on Twitter and the underlying contexts. Similarly, he analysed the fluctuations in the activity of trading shares on the financial market.

“We saw prolonged intervals with steady activity, after which there was sudden and almost earthquake-like activity. An event starts an avalanche in the trading. Statistically, we see the same characteristic horde behaviour in the financial markets as we do on Twitter, so the two social systems are not that different,” concludes Joachim Mathiesen.
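The pattern the study describes — long stretches of steady activity punctuated by abrupt bursts — can be illustrated with a simple spike detector. This is a hypothetical sketch, not the paper's actual method: it flags points in a mention-count time series that jump far above a rolling baseline, with the window size and threshold chosen purely for illustration.

```python
# Hypothetical sketch: flag sudden activity spikes in a time series of
# mention counts, like the bursts observed on Twitter and in trading.
# The window and threshold values are illustrative assumptions.

def find_spikes(counts, window=24, threshold=4.0):
    """Return indices where activity jumps far above the recent baseline."""
    spikes = []
    for i in range(window, len(counts)):
        baseline = counts[i - window:i]
        mean = sum(baseline) / window
        var = sum((x - mean) ** 2 for x in baseline) / window
        std = var ** 0.5 or 1.0  # guard against a perfectly flat baseline
        if (counts[i] - mean) / std > threshold:
            spikes.append(i)
    return spikes

# A long stretch of steady activity followed by one abrupt burst:
series = [10, 11, 9, 10, 12, 10, 11, 9, 10, 11] * 3 + [80]
print(find_spikes(series, window=10))  # → [30]
```

The steady stretch never deviates enough from its own baseline to trigger the detector, while the final burst stands many standard deviations above it — mirroring the "earthquake-like" avalanches the researchers report.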

Journal Reference:

  1. J. Mathiesen, L. Angheluta, P. T. H. Ahlgren, M. H. Jensen. Excitable human dynamics driven by extrinsic events in massive communities. Proceedings of the National Academy of Sciences, 2013; DOI: 10.1073/pnas.1304179110

Politics and Perceptions: Social Media, Politics Collide in New Study (Science Daily)

Oct. 8, 2013 — It bothered Lindsay Hoffman and colleagues to see other researchers making broad yet vague claims about the role social media plays in political participation.

So they decided to study it.

In a paper to be published in November in the journal Computers in Human Behavior, the University of Delaware associate professor in the departments of Communication and of Political Science and International Relations and her co-authors explored how people perceive their own political behaviors online. It is part of a larger goal to better understand why people engage in politics both on- and offline.

The study is titled “Does My Comment Count? Perceptions of Political Participation in an Online Environment.” Dannagal Young, associate professor of communication, and Philip Jones, associate professor of political science and international relations, teamed up with Hoffman for the study.

It was built around the question of whether, when people engage in political behavior online — “liking” a candidate’s Facebook page, tweeting their thoughts about a political platform, signing a virtual petition — they see their activities as having influence on the functions of government (participation) or as communication with others.

“A lot of people in the 2008 elections were participating on Facebook and on blogs,” Hoffman said (Twitter didn’t play as strong a role then). “We were interested in which is participatory and which is seen as communication.”

Hoffman said many claims had been made about the substantial role social media has played in mobilizing people to become more politically active. Some also believe online political engagement is replacing traditional, offline forms of political behavior, prompting people to play a less active role when it comes to activities like voting.

But without a way to define how people perceived what they were doing when they engaged in politics online, Hoffman and her co-authors were skeptical.

The UD researchers relied on a survey of roughly 1,000 randomly selected American adults to assess what people were doing politically on- and offline, what they had done in the past, to what extent they thought their activities were a good way to influence the government and to what extent they thought their actions were a good way to communicate with others.

The survey, which was completed in the summer of 2010, focused on 11 political behaviors, including voting in an election, communicating online about politics, signing up for online political information, friending or “liking” a candidate or politician and putting up a yard sign or wearing a political shirt.

The work led the researchers to conclude that people have a realistic notion of what they are doing when they engage in politics online.

“People are more savvy than we think they are,” Hoffman said. “They viewed every type of behavior mentioned except voting as communication.”

People in the study perceived their on- and offline behaviors as playing different political roles. They seemed not to be replacing traditional, offline political engagement with online behaviors, Hoffman and her co-authors found.

“They are not duped into thinking they can influence government or take a hands-off approach” just by being involved online, she said.

Those in the study who reported being more confident in government and their ability to have an impact were even more motivated to engage in online political activities when they perceived it as communication, the study also found.

“If people see it as communication, they are more likely to participate,” Hoffman said. “Communication is a key cornerstone in political involvement.”

This study was one of the first funded through a $50,000 Innovations through Collaborations Grant awarded by the College of Arts and Sciences’ Interdisciplinary Humanities Research Center, which supports cross-disciplinary work.

Hoffman first met Jones in 2009, when he joined the University. They discovered they shared academic interests, and the grant helped bring their ideas together. The collaboration between the three researchers also resulted in a second publication examining the impact of candidate emotion on political participation. That study was published online in the journal New Media and Society in December 2012.

“It was the summer of 2010 when we did the online survey asking about how people participate in politics online,” said Hoffman. “There were a lot of high emotions, the tea party was forming, and we wondered how that might impact certain types of political behaviors.”

The study worked toward filling a void in the literature, where few have looked at the effect a candidate’s emotions — like anger, anxiety and hopefulness — have on how people engage in politics. It also challenged the notion that emotional candidates sway voters, particularly those least involved or least knowledgeable about politics.

The researchers found that the online emotional appeal of a candidate did not influence a person’s likelihood of participating on that candidate’s behalf, unless that person was already highly engaged and knowledgeable. The particular emotion expressed was unimportant.

Hoffman is pleased the collaboration with Young and Jones proved so fruitful.

“You hope for the best: to have good data and results that are interesting and compelling,” said Hoffman. “I am really proud of our collaborations.”

Journal Reference:

  1. Lindsay H. Hoffman, Philip Edward Jones, Dannagal Goldthwaite Young. Does my comment count? Perceptions of political participation in an online environment. Computers in Human Behavior, 2013; 29 (6): 2248; DOI: 10.1016/j.chb.2013.05.010

Transgendered Males Seen as an Asset to Some Ancestral Societies (Science Daily)

Oct. 2, 2013 — Transgendered androphilic males were accepted in traditional hunter-gatherer cultures because they were an extra set of hands to support their families. In turn, by investing in and supporting their kin, these males ensured that their familial line — and therefore also their own genetic make-up — passed on to future generations despite their not having children of their own. This is according to an ethnographic study led by Doug VanderLaan of the Centre for Addiction and Mental Health in Canada, published in Springer’s journal Human Nature. The study reports that this “kin selection” is still at play in pro-transgender societies today.

‘Androphilia’ refers to a predominant sexual attraction towards adult males, and takes on one of two possible gender roles depending on the cultural context: sex-gender congruent male androphilia (the typical male gender role) or transgendered androphilia (a gender role markedly similar to that of females in a given culture). Typically one of these variations is dominant within a society. For example, sex-gender congruency is more common in Western cultures, whereas the transgendered form is more typical of non-Western cultures, such as that of the Polynesian island nation of Samoa. The researchers also wanted to test predictions that enhanced kin-directed altruism is prominent in societies in which transgendered male androphilia is predominant.

To answer this question, VanderLaan and his colleagues compared the sociocultural environment of contemporary transgendered societies with that of ancestral small-group hunter-gatherers. Ancestral group size, sociopolitical systems, religious beliefs and patterns of residency were analyzed in 146 non-transgendered societies and 46 transgendered societies.

The analysis utilized ethnographic information about well-described nonindustrial societies from the Standard Cross-Cultural Sample. VanderLaan and his colleagues found that transgendered male androphilia is an ancestral phenomenon typically found in communities with certain ancestral sociocultural conditions, such as “bilateral descent.” This term refers to societies in which the families of both one’s father and mother are equally important for emotional, social, spiritual and political support, as well as the transfer of property or wealth.

Also, the acceptance and tolerance of same-sex behavior evolved within a suitable, accepting environment in which discrimination against transgendered males was rare. Importantly, kin selection might have played a vital part in maintaining genes for male androphilia in these societies. For example, it continues to be a driving force in contemporary Samoan fa’afafine transgender communities. Unless transgendered androphilic males are accepted by their families, their opportunities to invest in kin are likely limited. What was true of our ancestors still holds true: a society’s specific social organization and its general acceptance of transgenderism and homosexuality remain important today. When supported by society, transgendered males invest their time and energy in their kin in return.

Journal Reference:

  1. Doug P. VanderLaan, Zhiyuan Ren, Paul L. Vasey. Male Androphilia in the Ancestral EnvironmentHuman Nature, 2013; DOI: 10.1007/s12110-013-9182-z